Commit 5f859ebb by Matthias Putz

Update: google git-repo v1.12.37

parent 451d58f3
[flake8]
max-line-length=80
ignore=E111,E114,E402
# Prevent /bin/sh scripts from being clobbered by autocrlf=true
git_ssh text eol=lf
main.py text eol=lf
repo text eol=lf
hooks/* text eol=lf
Anthony Newnam <anthony.newnam@garmin.com> Anthony <anthony@bnovc.com>
Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu xiuyun <xiuyun.hu@hisilicon.com>
Hu Xiuyun <xiuyun.hu@hisilicon.com> Hu Xiuyun <clouds08@qq.com>
Jelly Chen <chenguodong@huawei.com> chenguodong <chenguodong@huawei.com>
Jia Bi <bijia@xiaomi.com> bijia <bijia@xiaomi.com>
JoonCheol Park <jooncheol@gmail.com> Jooncheol Park <jooncheol@gmail.com>
Sergii Pylypenko <x.pelya.x@gmail.com> pelya <x.pelya.x@gmail.com>
Shawn Pearce <sop@google.com> Shawn O. Pearce <sop@google.com>
Ulrik Sjölin <ulrik.sjolin@sonyericsson.com> Ulrik Sjolin <ulrik.sjolin@gmail.com>
Ulrik Sjölin <ulrik.sjolin@sonyericsson.com> Ulrik Sjolin <ulrik.sjolin@sonyericsson.com>
Ulrik Sjölin <ulrik.sjolin@sonyericsson.com> Ulrik Sjölin <ulrik.sjolin@sonyericsson.com>
# Short Version
- Make small logical changes.
- Provide a meaningful commit message.
- Check for coding errors and style nits with pyflakes and flake8.
- Make sure all code is under the Apache License, 2.0.
- Publish your changes for review.
- Make corrections if requested.
- Verify your changes on gerrit so they can be submitted.
`git push https://gerrit-review.googlesource.com/git-repo HEAD:refs/for/master`
# Long Version
I wanted a file describing how to submit patches for repo,
so I started with the one found in the core Git distribution
(Documentation/SubmittingPatches), which itself was based on the
patch submission guidelines for the Linux kernel.
However, there are some differences, so please review and familiarize
yourself with the following relevant bits.
## Make separate commits for logically separate changes.
Unless your patch is really trivial, you should not send out a patch
that was generated as a diff between your working tree and your
commit head. Instead, always make a commit with a complete commit
message and generate a series of patches from your repository.
It is a good discipline.
Describe the technical detail of the change(s).
If your description starts to get too long, that's a sign that you
probably need to split up your commit into finer-grained pieces.
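One common way to turn a mixed working tree into separate commits is to
stage related hunks interactively. This is plain `git` usage, not a
repo-specific requirement; a minimal example:

    git add -p     # interactively stage only the hunks for one logical change
    git commit     # give that change its own commit message
    git add -p     # repeat for the next logical change
    git commit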
## Check for coding errors and style nits with pyflakes and flake8
### Coding errors
Run `pyflakes` on changed modules:
pyflakes file.py
Ideally there should be no new errors or warnings introduced.
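If you only want to check the files you touched, one possible invocation
(assuming your change is the most recent commit on your branch) is:

    pyflakes $(git diff --name-only HEAD~1 -- '*.py')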
### Style violations
Run `flake8` on changed modules:
flake8 file.py
Note that repo generally follows
[Google's Python style guide](https://google.github.io/styleguide/pyguide.html)
rather than [PEP 8](https://www.python.org/dev/peps/pep-0008/), so it's
possible that the output of `flake8` will be quite noisy. It's not
mandatory to avoid all warnings, but at least the maximum line length
should be followed.
If there are many occurrences of the same warning that cannot be
avoided without going against the Google style guide, these may be
suppressed in the included `.flake8` file.
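For reference, the `.flake8` file included in this commit already takes that
approach: it caps lines at 80 characters and ignores E111/E114 (indentation is
not a multiple of four) and E402 (module-level import not at top of file):

    [flake8]
    max-line-length=80
    ignore=E111,E114,E402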
## Check the license
repo is licensed under the Apache License, 2.0.
Because of this licensing model *every* file within the project
*must* list the license that covers it in the header of the file.
Any new contributions to an existing file *must* be submitted under
the current license of that file. Any new files *must* clearly
indicate which license they are provided under in the file header.
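For example, Python files in repo carry the standard Apache 2.0 boilerplate
near the top of the file; a sketch of such a header (the copyright year and
holder are illustrative) looks like:

    # Copyright (C) 2016 The Android Open Source Project
    #
    # Licensed under the Apache License, Version 2.0 (the "License");
    # you may not use this file except in compliance with the License.
    # You may obtain a copy of the License at
    #
    #      http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing, software
    # distributed under the License is distributed on an "AS IS" BASIS,
    # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    # See the License for the specific language governing permissions and
    # limitations under the License.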
Please verify that you are legally allowed and willing to submit your
changes under the license covering each file *prior* to submitting
your patch. It is virtually impossible to remove a patch once it
has been applied and pushed out.
## Sending your patches.
Do not email your patches to anyone.
Instead, log in to the Gerrit Code Review tool at:
https://gerrit-review.googlesource.com/
Ensure you have completed one of the necessary contributor
agreements, providing documentation to the project maintainers that
they have the right to redistribute your work under the Apache License:
https://gerrit-review.googlesource.com/#/settings/agreements
Ensure you have obtained an HTTP password to authenticate:
https://gerrit-review.googlesource.com/new-password
Ensure that you have the local commit hook installed to automatically
add a Change-Id to your commits:
curl -Lo `git rev-parse --git-dir`/hooks/commit-msg https://gerrit-review.googlesource.com/tools/hooks/commit-msg
chmod +x `git rev-parse --git-dir`/hooks/commit-msg
If you have already committed your changes you will need to amend the commit
to get the Change-Id added.
git commit --amend
Push your patches over HTTPS to the review server, possibly through
a remembered remote to make this easier in the future:
git config remote.review.url https://gerrit-review.googlesource.com/git-repo
git config remote.review.push HEAD:refs/for/master
git push review
You will be automatically emailed a copy of your commits, and any
comments made by the project maintainers.
## Make changes if requested
The project maintainer who reviews your changes might request changes to your
commit. If you make the requested changes you will need to amend your commit
and push it to the review server again.
## Verify your changes on gerrit
After you receive a Code-Review+2 from the maintainer, select the Verified
button on the gerrit page for the change. This verifies that you have tested
your changes and notifies the maintainer that they are ready to be submitted.
The maintainer will then submit your changes to the repository.
......@@ -31,7 +31,7 @@ class Command(object):
manifest = None
_optparse = None
def WantPager(self, opt):
def WantPager(self, _opt):
return False
def ReadEnvironmentOptions(self, opts):
......@@ -63,7 +63,7 @@ class Command(object):
usage = self.helpUsage.strip().replace('%prog', me)
except AttributeError:
usage = 'repo %s' % self.NAME
self._optparse = optparse.OptionParser(usage = usage)
self._optparse = optparse.OptionParser(usage=usage)
self._Options(self._optparse)
return self._optparse
......@@ -110,15 +110,20 @@ class Command(object):
project = None
if os.path.exists(path):
oldpath = None
while path \
and path != oldpath \
and path != manifest.topdir:
while path and \
path != oldpath and \
path != manifest.topdir:
try:
project = self._by_path[path]
break
except KeyError:
oldpath = path
path = os.path.dirname(path)
if not project and path == manifest.topdir:
try:
project = self._by_path[path]
except KeyError:
pass
else:
try:
project = self._by_path[path]
......@@ -138,7 +143,7 @@ class Command(object):
mp = manifest.manifestProject
if not groups:
groups = mp.config.GetString('manifest.groups')
groups = mp.config.GetString('manifest.groups')
if not groups:
groups = 'default,platform-' + platform.system().lower()
groups = [x for x in re.split(r'[,\s]+', groups) if x]
......@@ -151,8 +156,7 @@ class Command(object):
for p in project.GetDerivedSubprojects())
all_projects_list.extend(derived_projects.values())
for project in all_projects_list:
if ((missing_ok or project.Exists) and
project.MatchesGroups(groups)):
if (missing_ok or project.Exists) and project.MatchesGroups(groups):
result.append(project)
else:
self._ResetPathToProjectMap(all_projects_list)
......@@ -166,8 +170,8 @@ class Command(object):
# If it's not a derived project, update path->project mapping and
# search again, as arg might actually point to a derived subproject.
if (project and not project.Derived and
(submodules_ok or project.sync_s)):
if (project and not project.Derived and (submodules_ok or
project.sync_s)):
search_again = False
for subproject in project.GetDerivedSubprojects():
self._UpdatePathToProjectMap(subproject)
......@@ -194,17 +198,24 @@ class Command(object):
result.sort(key=_getpath)
return result
def FindProjects(self, args):
def FindProjects(self, args, inverse=False):
result = []
patterns = [re.compile(r'%s' % a, re.IGNORECASE) for a in args]
for project in self.GetProjects(''):
for pattern in patterns:
if pattern.search(project.name) or pattern.search(project.relpath):
match = pattern.search(project.name) or pattern.search(project.relpath)
if not inverse and match:
result.append(project)
break
if inverse and match:
break
else:
if inverse:
result.append(project)
result.sort(key=lambda project: project.relpath)
return result
# pylint: disable=W0223
# Pylint warns that the `InteractiveCommand` and `PagedCommand` classes do not
# override method `Execute` which is abstract in `Command`. Since that method
......@@ -214,28 +225,32 @@ class InteractiveCommand(Command):
"""Command which requires user interaction on the tty and
must not run within a pager, even if the user asks to.
"""
def WantPager(self, opt):
def WantPager(self, _opt):
return False
class PagedCommand(Command):
"""Command which defaults to output in a pager, as its
display tends to be larger than one screen full.
"""
def WantPager(self, opt):
def WantPager(self, _opt):
return True
# pylint: enable=W0223
class MirrorSafeCommand(object):
"""Command permits itself to run within a mirror,
and does not require a working directory.
"""
class GitcAvailableCommand(object):
"""Command that requires GITC to be available, but does
not require the local client to be a GITC client.
"""
class GitcClientCommand(object):
"""Command that requires the local client to be a GITC
client.
......
......@@ -35,6 +35,7 @@ following DTD:
<!ATTLIST remote name ID #REQUIRED>
<!ATTLIST remote alias CDATA #IMPLIED>
<!ATTLIST remote fetch CDATA #REQUIRED>
<!ATTLIST remote pushurl CDATA #IMPLIED>
<!ATTLIST remote review CDATA #IMPLIED>
<!ATTLIST remote revision CDATA #IMPLIED>
......@@ -125,6 +126,12 @@ Attribute `fetch`: The Git URL prefix for all projects which use
this remote. Each project's name is appended to this prefix to
form the actual URL used to clone the project.
Attribute `pushurl`: The Git "push" URL prefix for all projects
which use this remote. Each project's name is appended to this
prefix to form the actual URL used to "git push" the project.
This attribute is optional; if not specified then "git push"
will use the same URL as the `fetch` attribute.
Attribute `review`: Hostname of the Gerrit server where reviews
are uploaded to by `repo upload`. This attribute is optional;
if not specified then `repo upload` will not function.
......@@ -175,7 +182,8 @@ The manifest server should implement the following RPC methods:
GetApprovedManifest(branch, target)
Return a manifest in which each project is pegged to a known good revision
for the current branch and target.
for the current branch and target. This is used by repo sync when the
--smart-sync option is given.
The target to use is defined by environment variables TARGET_PRODUCT
and TARGET_BUILD_VARIANT. These variables are used to create a string
......@@ -187,7 +195,8 @@ should choose a reasonable default target.
GetManifest(tag)
Return a manifest in which each project is pegged to the revision at
the specified tag.
the specified tag. This is used by repo sync when the --smart-tag option
is given.
Element project
......
......@@ -464,9 +464,13 @@ def _open_ssh(host, port=None):
% (host,port, str(e)), file=sys.stderr)
return False
time.sleep(1)
ssh_died = (p.poll() is not None)
if ssh_died:
return False
_master_processes.append(p)
_master_keys.add(key)
time.sleep(1)
return True
finally:
_master_keys_lock.release()
......@@ -568,6 +572,7 @@ class Remote(object):
self._config = config
self.name = name
self.url = self._Get('url')
self.pushUrl = self._Get('pushurl')
self.review = self._Get('review')
self.projectname = self._Get('projectname')
self.fetch = list(map(RefSpec.FromString,
......@@ -700,6 +705,10 @@ class Remote(object):
"""Save this remote to the configuration.
"""
self._Set('url', self.url)
if self.pushUrl is not None:
self._Set('pushurl', self.pushUrl + '/' + self.projectname)
else:
self._Set('pushurl', self.pushUrl)
self._Set('review', self.review)
self._Set('projectname', self.projectname)
self._Set('fetch', list(map(str, self.fetch)))
......
......@@ -24,7 +24,9 @@ import git_command
import git_config
import wrapper
NUM_BATCH_RETRIEVE_REVISIONID = 300
from error import ManifestParseError
NUM_BATCH_RETRIEVE_REVISIONID = 32
def get_gitc_manifest_dir():
return wrapper.Wrapper().get_gitc_manifest_dir()
......@@ -54,7 +56,11 @@ def _set_project_revisions(projects):
if gitcmd.Wait():
print('FATAL: Failed to retrieve revisionExpr for %s' % proj)
sys.exit(1)
proj.revisionExpr = gitcmd.stdout.split('\t')[0]
revisionExpr = gitcmd.stdout.split('\t')[0]
if not revisionExpr:
raise(ManifestParseError('Invalid SHA-1 revision project %s (%s)' %
(proj.remote.url, proj.revisionExpr)))
proj.revisionExpr = revisionExpr
def _manifest_groups(manifest):
"""Returns the manifest group string that should be synced
......@@ -127,7 +133,7 @@ def generate_gitc_manifest(gitc_manifest, manifest, paths=None):
repo_proj.revisionExpr = None
# Convert URLs from relative to absolute.
for name, remote in manifest.remotes.iteritems():
for _name, remote in manifest.remotes.iteritems():
remote.fetchUrl = remote.resolvedFetchUrl
# Save the manifest.
......
#!/bin/sh
# From Gerrit Code Review 2.12.1
#
# Part of Gerrit Code Review (http://code.google.com/p/gerrit/)
# Part of Gerrit Code Review (https://www.gerritcodereview.com/)
#
# Copyright (C) 2009 The Android Open Source Project
#
......@@ -19,7 +20,7 @@
unset GREP_OPTIONS
CHANGE_ID_AFTER="Bug|Issue"
CHANGE_ID_AFTER="Bug|Issue|Test"
MSG="$1"
# Check for, and add if missing, a unique Change-Id
......@@ -38,6 +39,12 @@ add_ChangeId() {
return
fi
# Do not add Change-Id to temp commits
if echo "$clean_message" | head -1 | grep -q '^\(fixup\|squash\)!'
then
return
fi
if test "false" = "`git config --bool --get gerrit.createChangeId`"
then
return
......@@ -57,6 +64,10 @@ add_ChangeId() {
AWK=/usr/xpg4/bin/awk
fi
# Get core.commentChar from git config or use default symbol
commentChar=`git config --get core.commentChar`
commentChar=${commentChar:-#}
# How this works:
# - parse the commit message as (textLine+ blankLine*)*
# - assume textLine+ to be a footer until proven otherwise
......@@ -75,8 +86,8 @@ add_ChangeId() {
blankLines = 0
}
# Skip lines starting with "#" without any spaces before it.
/^#/ { next }
# Skip lines starting with commentChar without any spaces before it.
/^'"$commentChar"'/ { next }
# Skip the line starting with the diff command and everything after it,
# up to the end of the file, assuming it is only patch data.
......
......@@ -384,7 +384,7 @@ class _KerberosAuthHandler(urllib.request.BaseHandler):
self.context = None
self.handler_order = urllib.request.BaseHandler.handler_order - 50
def http_error_401(self, req, fp, code, msg, headers):
def http_error_401(self, req, fp, code, msg, headers): # pylint:disable=unused-argument
host = req.get_host()
retry = self.http_error_auth_reqed('www-authenticate', host, req, headers)
return retry
......
......@@ -65,11 +65,13 @@ class _XmlRemote(object):
name,
alias=None,
fetch=None,
pushUrl=None,
manifestUrl=None,
review=None,
revision=None):
self.name = name
self.fetchUrl = fetch
self.pushUrl = pushUrl
self.manifestUrl = manifestUrl
self.remoteAlias = alias
self.reviewUrl = review
......@@ -103,7 +105,11 @@ class _XmlRemote(object):
remoteName = self.name
if self.remoteAlias:
remoteName = self.remoteAlias
return RemoteSpec(remoteName, url, self.reviewUrl)
return RemoteSpec(remoteName,
url=url,
pushUrl=self.pushUrl,
review=self.reviewUrl,
orig_name=self.name)
class XmlManifest(object):
"""manages the repo configuration file"""
......@@ -159,6 +165,8 @@ class XmlManifest(object):
root.appendChild(e)
e.setAttribute('name', r.name)
e.setAttribute('fetch', r.fetchUrl)
if r.pushUrl is not None:
e.setAttribute('pushurl', r.pushUrl)
if r.remoteAlias is not None:
e.setAttribute('alias', r.remoteAlias)
if r.reviewUrl is not None:
......@@ -251,9 +259,9 @@ class XmlManifest(object):
e.setAttribute('path', relpath)
remoteName = None
if d.remote:
remoteName = d.remote.remoteAlias or d.remote.name
if not d.remote or p.remote.name != remoteName:
remoteName = p.remote.name
remoteName = d.remote.name
if not d.remote or p.remote.orig_name != remoteName:
remoteName = p.remote.orig_name
e.setAttribute('remote', remoteName)
if peg_rev:
if self.IsMirror:
......@@ -269,7 +277,7 @@ class XmlManifest(object):
# isn't our value
e.setAttribute('upstream', p.revisionExpr)
else:
revision = self.remotes[remoteName].revision or d.revisionExpr
revision = self.remotes[p.remote.orig_name].revision or d.revisionExpr
if not revision or revision != p.revisionExpr:
e.setAttribute('revision', p.revisionExpr)
if p.upstream and p.upstream != p.revisionExpr:
......@@ -638,6 +646,9 @@ class XmlManifest(object):
if alias == '':
alias = None
fetch = self._reqatt(node, 'fetch')
pushUrl = node.getAttribute('pushurl')
if pushUrl == '':
pushUrl = None
review = node.getAttribute('review')
if review == '':
review = None
......@@ -645,7 +656,7 @@ class XmlManifest(object):
if revision == '':
revision = None
manifestUrl = self.manifestProject.config.GetString('remote.origin.url')
return _XmlRemote(name, alias, fetch, manifestUrl, review, revision)
return _XmlRemote(name, alias, fetch, pushUrl, manifestUrl, review, revision)
def _ParseDefault(self, node):
"""
......@@ -971,5 +982,5 @@ class GitcManifest(XmlManifest):
def _output_manifest_project_extras(self, p, e):
"""Output GITC Specific Project attributes"""
if p.old_revision:
e.setAttribute('old-revision', str(p.old_revision))
e.setAttribute('old-revision', str(p.old_revision))
......@@ -28,6 +28,9 @@ VERSION = (1, 25)
# increment this if the MAINTAINER_KEYS block is modified
KEYRING_VERSION = (1, 3)
# Each individual key entry is created by using:
# gpg --armor --export keyid
MAINTAINER_KEYS = """
Repo Maintainer (E.S.R.Labs) <repo@esrlabs.com>
......@@ -209,6 +212,9 @@ group.add_option('-p', '--platform',
help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM')
group.add_option('--no-clone-bundle',
dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS')
# Tool
......@@ -352,7 +358,7 @@ def _Init(args, gitc_init=False):
can_verify = True
dst = os.path.abspath(os.path.join(repodir, S_repo))
_Clone(url, dst, opt.quiet)
_Clone(url, dst, opt.quiet, not opt.no_clone_bundle)
if can_verify and not opt.no_repo_verify:
rev = _Verify(dst, branch, opt.quiet)
......@@ -445,7 +451,10 @@ def SetupGnuPG(quiet):
sys.exit(1)
env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode()
try:
env['GNUPGHOME'] = gpg_dir
except UnicodeEncodeError:
env['GNUPGHOME'] = gpg_dir.encode()
cmd = ['gpg', '--import']
try:
......@@ -556,7 +565,7 @@ def _DownloadBundle(url, local, quiet):
try:
r = urllib.request.urlopen(url)
except urllib.error.HTTPError as e:
if e.code in [401, 403, 404, 406]:
if e.code in [401, 403, 404, 406, 501]:
return False
_print('fatal: Cannot get %s' % url, file=sys.stderr)
_print('fatal: HTTP error %s' % e.code, file=sys.stderr)
......@@ -588,7 +597,7 @@ def _ImportBundle(local):
os.remove(path)
def _Clone(url, local, quiet):
def _Clone(url, local, quiet, clone_bundle):
"""Clones a git repository to a new subdirectory of repodir
"""
try:
......@@ -618,7 +627,7 @@ def _Clone(url, local, quiet):
_SetConfig(local,
'remote.origin.fetch',
'+refs/heads/*:refs/remotes/origin/*')
if _DownloadBundle(url, local, quiet):
if clone_bundle and _DownloadBundle(url, local, quiet):
_ImportBundle(local)
_Fetch(url, local, 'origin', quiet)
......@@ -652,7 +661,10 @@ def _Verify(cwd, branch, quiet):
_print(file=sys.stderr)
env = os.environ.copy()
env['GNUPGHOME'] = gpg_dir.encode()
try:
env['GNUPGHOME'] = gpg_dir
except UnicodeEncodeError:
env['GNUPGHOME'] = gpg_dir.encode()
cmd = [GIT, 'tag', '-v', cur]
proc = subprocess.Popen(cmd,
......
......@@ -71,6 +71,10 @@ synced and their revisions won't be found.
p.add_option('--no-color',
dest='color', action='store_false', default=True,
help='does not display the diff in color.')
p.add_option('--pretty-format',
dest='pretty_format', action='store',
metavar='<FORMAT>',
help='print the log using a custom git pretty format string')
def _printRawDiff(self, diff):
for project in diff['added']:
......@@ -92,7 +96,7 @@ synced and their revisions won't be found.
otherProject.revisionExpr))
self.out.nl()
def _printDiff(self, diff, color=True):
def _printDiff(self, diff, color=True, pretty_format=None):
if diff['added']:
self.out.nl()
self.printText('added projects : \n')
......@@ -124,7 +128,8 @@ synced and their revisions won't be found.
self.printText(' to ')
self.printRevision(otherProject.revisionExpr)
self.out.nl()
self._printLogs(project, otherProject, raw=False, color=color)
self._printLogs(project, otherProject, raw=False, color=color,
pretty_format=pretty_format)
self.out.nl()
if diff['unreachable']:
......@@ -139,9 +144,13 @@ synced and their revisions won't be found.
self.printText(' not found')
self.out.nl()
def _printLogs(self, project, otherProject, raw=False, color=True):
logs = project.getAddedAndRemovedLogs(otherProject, oneline=True,
color=color)
def _printLogs(self, project, otherProject, raw=False, color=True,
pretty_format=None):
logs = project.getAddedAndRemovedLogs(otherProject,
oneline=(pretty_format is None),
color=color,
pretty_format=pretty_format)
if logs['removed']:
removedLogs = logs['removed'].split('\n')
for log in removedLogs:
......@@ -192,4 +201,4 @@ synced and their revisions won't be found.
if opt.raw:
self._printRawDiff(diff)
else:
self._printDiff(diff, color=opt.color)
self._printDiff(diff, color=opt.color, pretty_format=opt.pretty_format)
......@@ -121,6 +121,9 @@ without iterating through the remaining projects.
p.add_option('-r', '--regex',
dest='regex', action='store_true',
help="Execute the command only on projects matching regex or wildcard expression")
p.add_option('-i', '--inverse-regex',
dest='inverse_regex', action='store_true',
help="Execute the command only on projects not matching regex or wildcard expression")
p.add_option('-g', '--groups',
dest='groups',
help="Execute the command only on projects matching the specified groups")
......@@ -216,10 +219,12 @@ without iterating through the remaining projects.
if os.path.isfile(smart_sync_manifest_path):
self.manifest.Override(smart_sync_manifest_path)
if not opt.regex:
projects = self.GetProjects(args, groups=opt.groups)
else:
if opt.regex:
projects = self.FindProjects(args)
elif opt.inverse_regex:
projects = self.FindProjects(args, inverse=True)
else:
projects = self.GetProjects(args, groups=opt.groups)
os.environ['REPO_COUNT'] = str(len(projects))
......
......@@ -61,6 +61,11 @@ directory use as much data as possible from the local reference
directory when fetching from the server. This will make the sync
go a lot faster by reducing data traffic on the network.
The --no-clone-bundle option disables any attempt to use
$URL/clone.bundle to bootstrap a new Git repository from a
resumeable bundle file on a content delivery network. This
may be necessary if there are problems with the local Python
HTTP client or proxy configuration, but the Git binary works.
Switching Manifest Branches
---------------------------
......@@ -113,6 +118,9 @@ to update the working directory files.
help='restrict manifest projects to ones with a specified '
'platform group [auto|all|none|linux|darwin|...]',
metavar='PLATFORM')
g.add_option('--no-clone-bundle',
dest='no_clone_bundle', action='store_true',
help='disable use of /clone.bundle on HTTP/HTTPS')
# Tool
g = p.add_option_group('repo Version options')
......@@ -222,7 +230,8 @@ to update the working directory files.
'in another location.', file=sys.stderr)
sys.exit(1)
if not m.Sync_NetworkHalf(is_new=is_new):
if not m.Sync_NetworkHalf(is_new=is_new, quiet=opt.quiet,
clone_bundle=not opt.no_clone_bundle):
r = m.GetRemote(m.remote.name)
print('fatal: cannot obtain manifest %s' % r.url, file=sys.stderr)
......
......@@ -54,8 +54,7 @@ revision specified in the manifest.
if not opt.all:
projects = args[1:]
if len(projects) < 1:
print("error: at least one project must be specified", file=sys.stderr)
sys.exit(1)
projects = ['.',] # start it in the local project by default
all_projects = self.GetProjects(projects,
missing_ok=bool(self.gitc_manifest))
......
......@@ -244,7 +244,7 @@ later is required to fix a server side protocol bug.
if show_smart:
p.add_option('-s', '--smart-sync',
dest='smart_sync', action='store_true',
help='smart sync using manifest from a known good build')
help='smart sync using manifest from the latest known good build')
p.add_option('-t', '--smart-tag',
dest='smart_tag', action='store',
help='smart sync using manifest from a known tag')
......@@ -399,9 +399,12 @@ later is required to fix a server side protocol bug.
return fetched
def _GCProjects(self, projects):
gitdirs = {}
gc_gitdirs = {}
for project in projects:
gitdirs[project.gitdir] = project.bare_git
if len(project.manifest.GetProjectsWithName(project.name)) > 1:
print('Shared project %s found, disabling pruning.' % project.name)
project.bare_git.config('--replace-all', 'gc.pruneExpire', 'never')
gc_gitdirs[project.gitdir] = project.bare_git
has_dash_c = git_require((1, 7, 2))
if multiprocessing and has_dash_c:
......@@ -411,7 +414,7 @@ later is required to fix a server side protocol bug.
jobs = min(self.jobs, cpu_count)
if jobs < 2:
for bare_git in gitdirs.values():
for bare_git in gc_gitdirs.values():
bare_git.gc('--auto')
return
......@@ -433,7 +436,7 @@ later is required to fix a server side protocol bug.
finally:
sem.release()
for bare_git in gitdirs.values():
for bare_git in gc_gitdirs.values():
if err_event.isSet():
break
sem.acquire()
......@@ -456,6 +459,65 @@ later is required to fix a server side protocol bug.
else:
self.manifest._Unload()
def _DeleteProject(self, path):
print('Deleting obsolete path %s' % path, file=sys.stderr)
# Delete the .git directory first, so we're less likely to have a partially
# working git repository around. There shouldn't be any git projects here,
# so rmtree works.
try:
portable.rmtree(os.path.join(path, '.git'))
except OSError:
print('Failed to remove %s' % os.path.join(path, '.git'), file=sys.stderr)
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return -1
# Delete everything under the worktree, except for directories that contain
# another git project
dirs_to_remove = []
failed = False
for root, dirs, files in os.walk(path):
for f in files:
try:
os.remove(os.path.join(root, f))
except OSError:
print('Failed to remove %s' % os.path.join(root, f), file=sys.stderr)
failed = True
dirs[:] = [d for d in dirs
if not os.path.lexists(os.path.join(root, d, '.git'))]
dirs_to_remove += [os.path.join(root, d) for d in dirs
if os.path.join(root, d) not in dirs_to_remove]
for d in reversed(dirs_to_remove):
if portable.os_path_islink(d):
try:
os.remove(d)
except OSError:
print('Failed to remove %s' % os.path.join(root, d), file=sys.stderr)
failed = True
elif len(os.listdir(d)) == 0:
try:
os.rmdir(d)
except OSError:
print('Failed to remove %s' % os.path.join(root, d), file=sys.stderr)
failed = True
continue
if failed:
print('error: Failed to delete obsolete path %s' % path, file=sys.stderr)
print(' remove manually, then run sync again', file=sys.stderr)
return -1
# Try deleting parent dirs if they are empty
project_dir = path
while project_dir != self.manifest.topdir:
if len(os.listdir(project_dir)) == 0:
os.rmdir(project_dir)
else:
break
project_dir = os.path.dirname(project_dir)
return 0
def UpdateProjectList(self):
new_project_paths = []
for project in self.GetProjects(None, missing_ok=True):
......@@ -476,8 +538,8 @@ later is required to fix a server side protocol bug.
continue
if path not in new_project_paths:
# If the path has already been deleted, we don't need to do it
if os.path.exists(self.manifest.topdir + '/' + path):
gitdir = os.path.join(self.manifest.topdir, path, '.git')
gitdir = os.path.join(self.manifest.topdir, path, '.git')
if os.path.exists(gitdir):
project = Project(
manifest = self.manifest,
name = path,
......@@ -496,18 +558,8 @@ later is required to fix a server side protocol bug.
print(' commit changes, then run sync again',
file=sys.stderr)
return -1
else:
print('Deleting obsolete path %s' % project.worktree,
file=sys.stderr)
portable.rmtree(project.worktree)
# Try deleting parent subdirs if they are empty
project_dir = os.path.dirname(project.worktree)
while project_dir != self.manifest.topdir:
try:
os.rmdir(project_dir)
except OSError:
break
project_dir = os.path.dirname(project_dir)
elif self._DeleteProject(project.worktree):
return -1
new_project_paths.sort()
fd = open(file_path, 'w')
......
......@@ -454,9 +454,15 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
if avail:
pending.append((project, avail))
if pending and (not opt.bypass_hooks):
if not pending:
print("no branches ready for upload", file=sys.stderr)
return
if not opt.bypass_hooks:
hook = RepoHook('pre-upload', self.manifest.repo_hooks_project,
self.manifest.topdir, abort_if_user_denies=True)
self.manifest.topdir,
self.manifest.manifestProject.GetRemote('origin').url,
abort_if_user_denies=True)
pending_proj_names = [project.name for (project, avail) in pending]
pending_worktrees = [project.worktree for (project, avail) in pending]
try:
......@@ -472,9 +478,7 @@ Gerrit Code Review: http://code.google.com/p/gerrit/
cc = _SplitEmails(opt.cc)
people = (reviewers, cc)
if not pending:
print("no branches ready for upload", file=sys.stderr)
elif len(pending) == 1 and len(pending[0][1]) == 1:
if len(pending) == 1 and len(pending[0][1]) == 1:
self._SingleBranch(opt, pending[0][1][0], people)
else:
self._MultipleBranches(opt, pending, people)