From: Hans-Christoph Steiner
Date: Wed, 26 Aug 2015 12:44:36 +0000 (+0000)
Subject: Merge branch 'support-vagrant-cachier' into 'master'
X-Git-Tag: 0.5.0~180
X-Git-Url: http://www.chiark.greenend.org.uk/ucgi/~ianmdlvl/git?a=commitdiff_plain;h=99d0c55fe95a354bb1ba468339cec468a9c226b5;hp=57c6deff01156a3674d541d6c6a1f7efd18ee9b7;p=fdroidserver.git
Merge branch 'support-vagrant-cachier' into 'master'
Add optional support for vagrant-cachier plugin
Building the basebox is excruciating for people on slow connections. I'm particularly sensitive to this after living in Central America for a while :)
This won't affect anyone who hasn't installed the plugin. For those who have, it creates a persistent shared folder for each box (e.g. testing23.box) and detects directories to cache between VM builds (apt, gems, pip, the chef cache, etc.)
(The only downside is that, for those following the server setup docs who are not aware of what vagrant-cachier does, it might be unexpected that artifacts persist between vagrant destroys.)
See merge request !25
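For anyone wiring this up in their own Vagrantfile, the plugin's usage is typically along these lines (a sketch of the vagrant-cachier configuration pattern, not the exact change in this merge; the box name is illustrative):

```ruby
# Vagrantfile sketch: only configure caching when the vagrant-cachier
# plugin is installed, so users without the plugin are unaffected.
Vagrant.configure("2") do |config|
  config.vm.box = "testing32"

  if Vagrant.has_plugin?("vagrant-cachier")
    # One persistent cache per base box, shared across
    # `vagrant destroy` / `vagrant up` cycles.
    config.cache.scope = :box

    # Auto-detect common cache directories (apt, gems, pip, chef, ...).
    config.cache.auto_detect = true
  end
end
```

The `has_plugin?` guard is what keeps the change opt-in: without the plugin, the whole block is skipped.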
---
diff --git a/.gitignore b/.gitignore
index 277ca280..9bb942bc 100644
--- a/.gitignore
+++ b/.gitignore
@@ -4,6 +4,7 @@
*.pyc
*.class
*.box
+
# files generated by build
build/
dist/
@@ -11,3 +12,7 @@ env/
fdroidserver.egg-info/
pylint.parseable
/.testfiles/
+docs/html/
+
+# files generated by tests
+tests/getsig/tmp/
diff --git a/COPYING b/LICENSE
similarity index 100%
rename from COPYING
rename to LICENSE
diff --git a/MANIFEST.in b/MANIFEST.in
index 29dd42e4..e0936015 100644
--- a/MANIFEST.in
+++ b/MANIFEST.in
@@ -1,10 +1,9 @@
-include README
+include README.md
include COPYING
include fd-commit
include fdroid
include jenkins-build
include makebuildserver
-include updateplugin
include buildserver/config.buildserver.py
include buildserver/fixpaths.sh
include buildserver/cookbooks/android-ndk/recipes/default.rb
@@ -24,11 +23,13 @@ include examples/config.py
include examples/fdroid-icon.png
include examples/makebs.config.py
include examples/opensc-fdroid.cfg
-include fdroidserver/getsig/run.sh
-include fdroidserver/getsig/make.sh
-include fdroidserver/getsig/getsig.java
+include tests/getsig/run.sh
+include tests/getsig/make.sh
+include tests/getsig/getsig.java
include tests/run-tests
+include tests/update.TestCase
include tests/urzip.apk
+include tests/urzip-badsig.apk
include wp-fdroid/AndroidManifest.xml
include wp-fdroid/android-permissions.php
include wp-fdroid/readme.txt
diff --git a/README b/README
deleted file mode 100644
index 6efc5cd8..00000000
--- a/README
+++ /dev/null
@@ -1,30 +0,0 @@
-F-Droid is an installable catalogue of FOSS (Free and Open Source Software)
-applications for the Android platform. The client makes it easy to browse,
-install, and keep track of updates on your device.
-
-The F-Droid server tools provide various scripts and tools that are used to
-maintain the main F-Droid application repository. You can use these same tools
-to create your own additional or alternative repository for publishing, or to
-assist in creating, testing and submitting metadata to the main repository.
-
-For documentation, please see the docs directory.
-
-Alternatively, visit https://f-droid.org/manual/
-
-
-Installing
-----------
-
-The easiest way to install the fdroidserver tools is to use virtualenv and pip
-(if you are Debian/Ubuntu/Mint/etc, you can first try installing using
-`apt-get install fdroidserver`). First, make sure you have virtualenv
-installed, it should be included in your OS's Python distribution or via other
-mechanisms like dnf/yum/pacman/emerge/Fink/MacPorts/Brew. Then here's how to
-install:
-
- git clone https://gitlab.com/fdroid/fdroidserver.git
- cd fdroidserver
- virtualenv env/
- . env/bin/activate
- pip install -e .
- python2 setup.py install
diff --git a/README.md b/README.md
new file mode 100644
index 00000000..4e7933c8
--- /dev/null
+++ b/README.md
@@ -0,0 +1,92 @@
+F-Droid Server
+==============
+
+Server for [F-Droid](https://f-droid.org), the Free Software repository system
+for Android.
+
+The F-Droid server tools provide various scripts and tools that are used to
+maintain the main [F-Droid application repository](https://f-droid.org/repository/browse).
+You can use these same tools to create your own additional or alternative
+repository for publishing, or to assist in creating, testing and submitting
+metadata to the main repository.
+
+For documentation, please see the docs directory.
+
+Alternatively, visit [https://f-droid.org/manual/](https://f-droid.org/manual/).
+
+What is F-Droid?
+----------------
+
+F-Droid is an installable catalogue of FOSS (Free and Open Source Software)
+applications for the Android platform. The client makes it easy to browse,
+install, and keep track of updates on your device.
+
+Installing
+----------
+
+The easiest way to install the `fdroidserver` tools is via a package
+manager. On Ubuntu, Mint or other Ubuntu-based distributions, you can
+install using:
+
+```
+sudo apt-get install fdroidserver
+```
+
+For older Ubuntu releases or to get the latest version, you can get
+`fdroidserver` from the Guardian Project PPA (the signing key
+fingerprint is `6B80 A842 07B3 0AC9 DEE2 35FE F50E ADDD 2234 F563`):
+
+```
+sudo add-apt-repository ppa:guardianproject/ppa
+sudo apt-get update
+sudo apt-get install fdroidserver
+```
+
+On OSX, `fdroidserver` is available from third party package managers,
+like Homebrew, MacPorts, and Fink:
+
+```
+brew install fdroidserver
+```
+
+For Arch Linux, a package is available in the AUR. If you have installed
+`yaourt` or something similar, you can do:
+
+```
+yaourt -S fdroidserver
+```
+
+For any platform where Python's `easy_install` is an option (e.g. OSX
+or Cygwin), you can use it:
+
+```
+sudo easy_install fdroidserver
+```
+
+Python's `pip` also works:
+
+```
+sudo pip install fdroidserver
+```
+
+The combination of `virtualenv` and `pip` is great for testing out the
+latest versions of `fdroidserver`. Using `pip`, `fdroidserver` can
+even be installed straight from git. First, make sure you have
+installed the python header files, virtualenv and pip. They should be
+included in your OS's default package manager or you can install them
+via other mechanisms like Brew/dnf/pacman/emerge/Fink/MacPorts.
+
+For Debian based distributions:
+
+```
+apt-get install python-dev python-pip python-virtualenv
+```
+
+Then here's how to install:
+
+```
+git clone https://gitlab.com/fdroid/fdroidserver.git
+cd fdroidserver
+virtualenv env/
+source env/bin/activate
+pip install -e .
+python2 setup.py install
+```
diff --git a/buildserver/config.buildserver.py b/buildserver/config.buildserver.py
index ffc9cca0..fd6277d4 100644
--- a/buildserver/config.buildserver.py
+++ b/buildserver/config.buildserver.py
@@ -1,6 +1,5 @@
sdk_path = "/home/vagrant/android-sdk"
-ndk_path = "/home/vagrant/android-ndk"
-build_tools = "20.0.0"
-ant = "ant"
-mvn3 = "mvn"
-gradle = "gradle"
+ndk_paths = {
+ 'r9b': "/home/vagrant/android-ndk/r9b",
+ 'r10e': "/home/vagrant/android-ndk/r10e",
+}
diff --git a/buildserver/cookbooks/android-ndk/recipes/default.rb b/buildserver/cookbooks/android-ndk/recipes/default.rb
index 460b4fc4..6fe9e11f 100644
--- a/buildserver/cookbooks/android-ndk/recipes/default.rb
+++ b/buildserver/cookbooks/android-ndk/recipes/default.rb
@@ -2,13 +2,20 @@
ndk_loc = node[:settings][:ndk_loc]
user = node[:settings][:user]
-execute "add-android-ndk-path" do
- user user
- command "echo \"export PATH=\\$PATH:#{ndk_loc} #PATH-NDK\" >> /home/#{user}/.bsenv"
- not_if "grep PATH-NDK /home/#{user}/.bsenv"
+script "setup-android-ndk" do
+ timeout 14400
+ interpreter "bash"
+ user node[:settings][:user]
+ cwd "/tmp"
+ code "
+ mkdir #{ndk_loc}
+ "
+ not_if do
+ File.exists?("#{ndk_loc}")
+ end
end
-script "setup-android-ndk" do
+script "setup-android-ndk-r9b" do
timeout 14400
interpreter "bash"
user node[:settings][:user]
@@ -21,10 +28,30 @@ script "setup-android-ndk" do
fi
tar jxvf /vagrant/cache/android-ndk-r9b-linux-x86$SUFFIX.tar.bz2
tar jxvf /vagrant/cache/android-ndk-r9b-linux-x86$SUFFIX-legacy-toolchains.tar.bz2
- mv android-ndk-r9b #{ndk_loc}
+ mv android-ndk-r9b #{ndk_loc}/r9b
"
not_if do
- File.exists?("#{ndk_loc}")
+ File.exists?("#{ndk_loc}/r9b")
+ end
+end
+
+script "setup-android-ndk-r10e" do
+ timeout 14400
+ interpreter "bash"
+ user node[:settings][:user]
+ cwd "/tmp"
+ code "
+ if [ `uname -m` == 'x86_64' ] ; then
+ SUFFIX='_64'
+ else
+ SUFFIX=''
+ fi
+ chmod u+x /vagrant/cache/android-ndk-r10e-linux-x86$SUFFIX.bin
+ /vagrant/cache/android-ndk-r10e-linux-x86$SUFFIX.bin x
+ mv android-ndk-r10e #{ndk_loc}/r10e
+ "
+ not_if do
+ File.exists?("#{ndk_loc}/r10e")
end
end
diff --git a/buildserver/cookbooks/android-sdk/recipes/default.rb b/buildserver/cookbooks/android-sdk/recipes/default.rb
index 7074382c..3331b849 100644
--- a/buildserver/cookbooks/android-sdk/recipes/default.rb
+++ b/buildserver/cookbooks/android-sdk/recipes/default.rb
@@ -8,7 +8,7 @@ script "setup-android-sdk" do
user user
cwd "/tmp"
code "
- tar zxvf /vagrant/cache/android-sdk_r23.0.2-linux.tgz
+ tar zxvf /vagrant/cache/android-sdk_r24.3.4-linux.tgz
mv android-sdk-linux #{sdk_loc}
#{sdk_loc}/tools/android update sdk --no-ui -t platform-tool
#{sdk_loc}/tools/android update sdk --no-ui -t tool
@@ -26,7 +26,7 @@ end
script "add_build_tools" do
interpreter "bash"
user user
- ver = "20.0.0"
+ ver = "23.0.0"
cwd "/tmp"
code "
if [ -f /vagrant/cache/build-tools/#{ver}.tar.gz ] ; then
@@ -66,7 +66,8 @@ end
%w{android-3 android-4 android-5 android-6 android-7 android-8 android-9
android-10 android-11 android-12 android-13 android-14 android-15
- android-16 android-17 android-18 android-19 android-20
+ android-16 android-17 android-18 android-19 android-20 android-21
+ android-22 android-23
extra-android-support extra-android-m2repository}.each do |sdk|
script "add_sdk_#{sdk}" do
diff --git a/buildserver/cookbooks/fdroidbuild-general/recipes/default.rb b/buildserver/cookbooks/fdroidbuild-general/recipes/default.rb
index 15c031e1..0a42dc36 100644
--- a/buildserver/cookbooks/fdroidbuild-general/recipes/default.rb
+++ b/buildserver/cookbooks/fdroidbuild-general/recipes/default.rb
@@ -5,7 +5,7 @@ execute "apt-get-update" do
command "apt-get update"
end
-%w{ant ant-contrib autoconf autopoint bison cmake expect libtool libsaxonb-java libssl1.0.0 libssl-dev maven openjdk-7-jdk javacc python python-magic git-core mercurial subversion bzr git-svn make perlmagick pkg-config zip yasm imagemagick gettext realpath transfig texinfo curl librsvg2-bin xsltproc vorbis-tools swig quilt faketime optipng python-gnupg}.each do |pkg|
+%w{ant ant-contrib autoconf autoconf2.13 automake1.11 autopoint bison bzr cmake curl expect faketime flex gettext git-core git-svn gperf graphviz imagemagick inkscape javacc libarchive-zip-perl librsvg2-bin libsaxonb-java libssl-dev libssl1.0.0 libtool make maven mercurial nasm openjdk-7-jdk optipng pandoc perlmagick pkg-config python python-gnupg python-magic python-setuptools python3-gnupg quilt realpath scons subversion swig texinfo transfig unzip vorbis-tools xsltproc yasm zip}.each do |pkg|
package pkg do
action :install
end
@@ -19,6 +19,11 @@ if node['kernel']['machine'] == "x86_64"
end
end
+easy_install_package "compare-locales" do
+ options "-U"
+ action :install
+end
+
execute "add-bsenv" do
user user
command "echo \". ./.bsenv \" >> /home/#{user}/.bashrc"
diff --git a/buildserver/cookbooks/gradle/recipes/default.rb b/buildserver/cookbooks/gradle/recipes/default.rb
index 06055e24..397b378a 100644
--- a/buildserver/cookbooks/gradle/recipes/default.rb
+++ b/buildserver/cookbooks/gradle/recipes/default.rb
@@ -18,7 +18,7 @@ script "add-gradle-verdir" do
not_if "test -d /opt/gradle/versions"
end
-%w{1.4 1.6 1.7 1.8 1.9 1.10 1.11 1.12}.each do |ver|
+%w{1.4 1.6 1.7 1.8 1.9 1.10 1.11 1.12 2.1 2.2.1 2.3 2.4 2.5 2.6}.each do |ver|
script "install-gradle-#{ver}" do
cwd "/tmp"
interpreter "bash"
diff --git a/buildserver/cookbooks/gradle/recipes/gradle b/buildserver/cookbooks/gradle/recipes/gradle
index 89169b24..3f836312 100755
--- a/buildserver/cookbooks/gradle/recipes/gradle
+++ b/buildserver/cookbooks/gradle/recipes/gradle
@@ -5,11 +5,11 @@ basedir="$(dirname $bindir)"
verdir="${basedir}/versions"
args=("$@")
-v_all=($(cd ${verdir} && ls | sort -rV))
+v_all=($(cd "${verdir}" && ls | sort -rV))
echo "Available gradle versions: ${v_all[@]}"
run_gradle() {
- ${verdir}/${v_found}/bin/gradle "${args[@]}"
+ "${verdir}/${v_found}/bin/gradle" "${args[@]}"
exit $?
}
@@ -23,21 +23,28 @@ contains() {
# key-value pairs of what gradle version each gradle plugin version
# should accept
-d_plugin_k=(0.12 0.11 0.10 0.9 0.8 0.7 0.6 0.5 0.4 0.3 0.2)
-d_plugin_v=(1.12 1.12 1.12 1.11 1.10 1.9 1.8 1.6 1.6 1.4 1.4)
+d_plugin_k=(1.3 1.2 1.1 1.0 0.14 0.13 0.12 0.11 0.10 0.9 0.8 0.7 0.6 0.5 0.4 0.3 0.2)
+d_plugin_v=(2.4 2.3 2.2.1 2.2.1 2.1 2.1 1.12 1.12 1.12 1.11 1.10 1.9 1.8 1.6 1.6 1.4 1.4)
-for v in ${d_plugin_v}; do
- contains $v "${v_all[*]}" && v_def=$v && break
+# All gradle versions we know about
+plugin_v=(2.6 2.5 2.4 2.3 2.2.1 2.1 1.12 1.11 1.10 1.9 1.8 1.7 1.6 1.4)
+
+# Find the highest version available
+for v in ${plugin_v}; do
+ if contains $v "${v_all[*]}"; then
+ v_def=$v
+ break
+ fi
done
-# Latest takes priority
-for f in ../build.gradle build.gradle; do
+# Earliest takes priority
+for f in build.gradle ../build.gradle; do
[[ -f $f ]] || continue
while read l; do
if [[ -z "$plugin_pver" && $l == *'com.android.tools.build:gradle:'* ]]; then
plugin_pver=$(echo -n "$l" | sed "s/.*com.android.tools.build:gradle:\\([0-9\\.\\+]\\+\\).*/\\1/")
elif [[ -z "$wrapper_ver" && $l == *'gradleVersion'* ]]; then
- wrapper_ver=$(echo -n "$l" | sed "s/.*gradleVersion[ ]*=[ ]*[\"']\\([0-9\\.]\\+\\)[\"'].*/\\1/")
+ wrapper_ver=$(echo -n "$l" | sed "s/.*gradleVersion *=* *[\"']\\([0-9\\.]\\+\\)[\"'].*/\\1/")
fi
done < $f
done
diff --git a/completion/bash-completion b/completion/bash-completion
index 719511cd..f4dc01d8 100644
--- a/completion/bash-completion
+++ b/completion/bash-completion
@@ -84,19 +84,19 @@ __vercode() {
__complete_options() {
case "${cur}" in
--*)
- COMPREPLY=( $( compgen -W "${lopts}" -- $cur ) )
+ COMPREPLY=( $( compgen -W "--help ${lopts}" -- $cur ) )
return 0;;
*)
- COMPREPLY=( $( compgen -W "${opts} ${lopts}" -- $cur ) )
+ COMPREPLY=( $( compgen -W "-h ${opts} --help ${lopts}" -- $cur ) )
return 0;;
esac
}
__complete_build() {
- opts="-h -v -q -l -s -t -f -a -w"
+ opts="-v -q -l -s -t -f -a -w"
- lopts="--help --verbose --quiet --latest --stop --test --server --resetserver
- --on-server --skip-scan --no-tarball --force --all --wiki"
+ lopts="--verbose --quiet --latest --stop --test --server --resetserver
+ --on-server --skip-scan --no-tarball --force --all --wiki --no-refresh"
case "${cur}" in
-*)
__complete_options
@@ -111,8 +111,8 @@ __complete_build() {
}
__complete_install() {
- opts="-h -v -q"
- lopts="--help --verbose --quiet --all"
+ opts="-v -q"
+ lopts="--verbose --quiet --all"
case "${cur}" in
-*)
__complete_options
@@ -127,9 +127,10 @@ __complete_install() {
}
__complete_update() {
- opts="-h -c -v -q -b -i -I -e -w"
- lopts="--help --create-metadata --verbose --quiet --buildreport
- --interactive --icons --editor --wiki --pretty --clean --delete-unknown"
+ opts="-c -v -q -b -i -I -e -w"
+ lopts="--create-metadata --verbose --quiet --buildreport
+ --interactive --icons --editor --wiki --pretty --clean --delete-unknown
+ --nosign"
case "${prev}" in
-e|--editor)
_filedir
@@ -139,8 +140,8 @@ __complete_update() {
}
__complete_publish() {
- opts="-h -v -q"
- lopts="--help --verbose --quiet"
+ opts="-v -q"
+ lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
@@ -155,8 +156,8 @@ __complete_publish() {
}
__complete_checkupdates() {
- opts="-h -v -q"
- lopts="--help --verbose --quiet --auto --autoonly --commit --gplay"
+ opts="-v -q"
+ lopts="--verbose --quiet --auto --autoonly --commit --gplay"
case "${cur}" in
-*)
__complete_options
@@ -168,23 +169,23 @@ __complete_checkupdates() {
}
__complete_import() {
- opts="-h -u -s -r -q"
- lopts="--help --url --subdir --repo --rev --quiet"
+ opts="-u -s -q"
+ lopts="--url --subdir --rev --quiet"
case "${prev}" in
- -u|--url|-r|--repo|-s|--subdir|--rev) return 0;;
+ -u|--url|-s|--subdir|--rev) return 0;;
esac
__complete_options
}
__complete_readmeta() {
- opts="-h -v -q"
- lopts="--help --verbose --quiet"
+ opts="-v -q"
+ lopts="--verbose --quiet"
__complete_options
}
__complete_rewritemeta() {
- opts="-h -v -q"
- lopts="--help --verbose --quiet"
+ opts="-v -q"
+ lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
@@ -196,8 +197,8 @@ __complete_rewritemeta() {
}
__complete_lint() {
- opts="-h -v -q -p"
- lopts="--help --verbose --quiet --pedantic"
+ opts="-v -q"
+ lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
@@ -209,8 +210,8 @@ __complete_lint() {
}
__complete_scanner() {
- opts="-h -v -q"
- lopts="--help --verbose --quiet --nosvn"
+ opts="-v -q"
+ lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
@@ -225,8 +226,8 @@ __complete_scanner() {
}
__complete_verify() {
- opts="-h -v -q -p"
- lopts="--help --verbose --quiet"
+ opts="-v -q -p"
+ lopts="--verbose --quiet"
case "${cur}" in
-*)
__complete_options
@@ -241,20 +242,27 @@ __complete_verify() {
}
__complete_stats() {
- opts="-h -v -q -d"
- lopts="--help --verbose --quiet --download"
+ opts="-v -q -d"
+ lopts="--verbose --quiet --download"
__complete_options
}
__complete_server() {
- opts="-h -i -v -q"
- lopts="--help --identity-file --verbose --quiet update"
+ opts="-i -v -q"
+ lopts="--identity-file --local-copy-dir --sync-from-local-copy-dir
+ --verbose --quiet --no-checksum update"
+ __complete_options
+}
+
+__complete_signindex() {
+ opts="-v -q"
+ lopts="--verbose"
__complete_options
}
__complete_init() {
- opts="-h -v -q -d"
- lopts="--help --verbose --quiet --distinguished-name --keystore
+ opts="-v -q -d"
+ lopts="--verbose --quiet --distinguished-name --keystore
--repo-keyalias --android-home --no-prompt"
__complete_options
}
@@ -263,7 +271,7 @@ _fdroid() {
local cmd cmds
cmd=${COMP_WORDS[1]}
cmds=" build init install update publish checkupdates import \
-readmeta rewritemeta lint scanner verify stats server "
+readmeta rewritemeta lint scanner verify stats server signindex "
for c in $cmds; do eval "_fdroid_${c} () {
local cur prev opts lopts
diff --git a/docs/fdroid.texi b/docs/fdroid.texi
index dacf6cc9..3ac3927c 100644
--- a/docs/fdroid.texi
+++ b/docs/fdroid.texi
@@ -8,7 +8,7 @@
@copying
This manual is for the F-Droid repository server tools.
-Copyright @copyright{} 2010, 2011, 2012, 2013 Ciaran Gultnieks
+Copyright @copyright{} 2010, 2011, 2012, 2013, 2014, 2015 Ciaran Gultnieks
Copyright @copyright{} 2011 Henrik Tunedal, Michael Haas, John Sullivan
@@ -82,6 +82,8 @@ intended usage. At the very least, you'll need:
GNU/Linux
@item
Python 2.x
+To be sure of being able to process all apk files without error, you need
+2.7.7 or later. See @code{http://bugs.python.org/issue14315}.
@item
The Android SDK Tools and Build-tools.
Note that F-Droid does not assume that you have the Android SDK in your
@@ -113,8 +115,7 @@ VirtualBox (debian package virtualbox)
@item
Ruby (debian packages ruby and rubygems)
@item
-Vagrant (unpackaged) Be sure to use 1.3.x because 1.4.x is completely broken
-(at the time of writing, the forthcoming 1.4.3 might work)
+Vagrant (unpackaged, tested on v1.4.3)
@item
vagrant-cachier plugin (unpackaged): `vagrant plugin install vagrant-cachier`
@item
@@ -457,7 +458,7 @@ following them). In fact, you can standardise all the metadata in a single
command, without changing the functional content, by running:
@example
-fdroid rewritemetadata
+fdroid rewritemeta
@end example
The following sections describe the fields recognised within the file.
@@ -471,6 +472,7 @@ The following sections describe the fields recognised within the file.
* Web Site::
* Source Code::
* Issue Tracker::
+* Changelog::
* Donate::
* FlattrID::
* Bitcoin::
@@ -635,6 +637,16 @@ applications have one.
This is converted to (@code{}) in the public index file.
+@node Changelog
+@section Changelog
+
+@cindex Changelog
+
+The URL for the application's changelog. Optional, since not all
+applications have one.
+
+This is converted to (@code{}) in the public index file.
+
@node Donate
@section Donate
@@ -777,11 +789,6 @@ root dir.
Here's an example of a complex git-svn Repo URL:
http://svn.code.sf.net/p/project/code/svn;trunk=trunk;tags=tags;branches=branches
-For a Subversion repo that requires authentication, you can precede the repo
-URL with username:password@ and those parameters will be passed as @option{--username}
-and @option{--password} to the SVN checkout command. (This now works for both
-svn and git-svn)
-
If the Repo Type is @code{srclib}, then you must specify the name of the
according srclib .txt file. For example if @code{scrlibs/FooBar.txt} exist
and you want to use this srclib, then you have to set Repo to
@@ -835,7 +842,9 @@ As for 'prebuild', but runs on the source code BEFORE any other processing
takes place.
You can use $$SDK$$, $$NDK$$ and $$MVN3$$ to substitute the paths to the
-android SDK and NDK directories, and maven 3 executable respectively.
+android SDK and NDK directories, and maven 3 executable respectively. The
+following per-build variables are available likewise: $$VERSION$$,
+$$VERCODE$$ and $$COMMIT$$.
@item oldsdkloc=yes
The sdk location in the repo is in an old format, or the build.xml is
@@ -888,7 +897,7 @@ which architecture or platform the apk is designed to run on.
If specified, the package version code in the AndroidManifest.xml is
replaced with the version code for the build. See also forceversion.
-@item rm=relpath1,relpath2,...
+@item rm=[,,...]
Specifies the relative paths of files or directories to delete before
the build is done. The paths are relative to the base of the build
directory - i.e. the root of the directory structure checked out from
@@ -898,7 +907,7 @@ AndroidManifest.xml.
Multiple files/directories can be specified by separating them with ','.
Directories will be recursively deleted.
-@item extlibs=a,b,...
+@item extlibs=[,,...]
Comma-separated list of external libraries (jar files) from the
@code{build/extlib} library, which will be placed in the @code{libs} directory
of the project.
@@ -949,9 +958,11 @@ the @code{srclib} directory for details of this.
You can use $$SDK$$, $$NDK$$ and $$MVN3$$ to substitute the paths to the
android SDK and NDK directories, and Maven 3 executable respectively e.g.
-for when you need to run @code{android update project} explicitly.
+for when you need to run @code{android update project} explicitly. The
+following per-build variables are available likewise: $$VERSION$$, $$VERCODE$$
+and $$COMMIT$$.
-@item scanignore=path1,path2,...
+@item scanignore=[,,...]
Enables one or more files/paths to be excluded from the scan process.
This should only be used where there is a very good reason, and
probably accompanied by a comment explaining why it is necessary.
@@ -959,7 +970,7 @@ probably accompanied by a comment explaining why it is necessary.
When scanning the source tree for problems, matching files whose relative
paths start with any of the paths given here are ignored.
-@item scandelete=path1,path2,...
+@item scandelete=[,,...]
Similar to scanignore=, but instead of ignoring files under the given paths,
it tells f-droid to delete the matching files directly.
@@ -973,7 +984,9 @@ mvn or gradle will be executed to clean the build environment right before
build= (or the final build) is run.
You can use $$SDK$$, $$NDK$$ and $$MVN3$$ to substitute the paths to the
-android SDK and NDK directories, and Maven 3 executable respectively.
+android SDK and NDK directories, and maven 3 executable respectively. The
+following per-build variables are available likewise: $$VERSION$$,
+$$VERCODE$$ and $$COMMIT$$.
@item buildjni=[yes|no|]
Enables building of native code via the ndk-build script before doing
@@ -991,23 +1004,41 @@ actually not required or used, remove the directory instead (using
isn't used nor built will result in an error saying that native
libraries were expected in the resulting package.
-@item gradle=
-Build with Gradle instead of Ant, specifying what flavour to assemble.
-If is 'yes' or 'main', no flavour will be used. Note
-that this will not work on projects with flavours, since it will build
-all flavours and there will be no 'main' build.
+@item ndk=
+Version of the NDK to use in this build. Defaults to the latest NDK release
+that included legacy toolchains, so as to not break builds that require
+toolchains no longer included in current versions of the NDK.
+
+The buildserver supports r9b with its legacy toolchains and the latest release
+as of writing this document, r10e. You may add support for more versions by
+adding them to 'ndk_paths' in your config file.
+
+@item gradle=[,,...]
+Build with Gradle instead of Ant, specifying what flavours to use. Flavours
+are case sensitive since the path to the output apk is as well.
+
+If only one flavour is given and it is 'yes' or 'main', no flavour will be
+used. Note that for projects with flavours, you must specify at least one
+valid flavour since 'yes' or 'main' will build all of them separately.
@item maven=yes[@@]
Build with Maven instead of Ant. An extra @@ tells f-droid to run Maven
inside that relative subdirectory. Sometimes it is needed to use @@.. so that
builds happen correctly.
-@item preassemble=
-Space-separated list of Gradle tasks to be run before the assemble task
-in a Gradle project build.
+@item preassemble=[,,...]
+List of Gradle tasks to be run before the assemble task in a Gradle project
+build.
-@item antcommand=xxx
-Specify an alternate Ant command (target) instead of the default
+@item gradleprops=[,,...]
+List of Gradle properties to pass via the command line to Gradle. A property
+can be of the form @code{foo} or of the form @code{key=value}.
+
+For example: @code{gradleprops=enableFoo,someSetting=bar} will result in
+@code{gradle -PenableFoo -PsomeSetting=bar}.
+
+@item antcommands=[,,...]
+Specify an alternate set of Ant commands (target) instead of the default
'release'. It can't be given any flags, such as the path to a build.xml.
@item output=path/to/output.apk
@@ -1034,8 +1065,7 @@ Another example, using extra parameters:
This is optional - if present, it contains a comma-separated list of any of
the following values, describing an anti-feature the application has.
-Even though such apps won't be displayed unless a settings box is ticked,
-it is a good idea to mention the reasons for the anti-feature(s) in the
+It is a good idea to mention the reasons for the anti-feature(s) in the
description:
@itemize @bullet
@@ -1055,15 +1085,21 @@ are impossible to replace or that the replacement cannot be connected to
without major changes to the app.
@item
-@samp{NonFreeAdd} - the application promotes non-Free add-ons, such that the
+@samp{NonFreeAdd} - the application promotes non-free add-ons, such that the
app is effectively an advert for other non-free software and such software is
not clearly labelled as such.
@item
-@samp{NonFreeDep} - the application depends on a non-Free application (e.g.
+@samp{NonFreeDep} - the application depends on a non-free application (e.g.
Google Maps) - i.e. it requires it to be installed on the device, but does not
include it.
+@item
+@samp{UpstreamNonFree} - the application is or depends on non-free software.
+This does not mean that non-free software is included with the app: Most
+likely, it has been patched in some way to remove the non-free code. However,
+functionality may be missing.
+
@end itemize
@node Disabled
@@ -1225,6 +1261,10 @@ specify the package name to search for. Useful when apps have a static package
name but change it programmatically in some app flavors, by e.g. appending
".open" or ".free" at the end of the package name.
+You can also use @code{Ignore} to ignore package name searching. This should
+only be used in some specific cases, for example if the app's build.gradle
+file does not contain the package name.
+
@node Update Check Data
@section Update Check Data
@@ -1292,6 +1332,9 @@ which version should be recommended.
This field is normally automatically updated - see Update Check Mode.
+If not set or set to @code{0}, clients will recommend the highest version they
+can, as if the @code{Current Version Code} was infinite.
+
This is converted to (@code{}) in the public index file.
@node No Source Since
@@ -1395,7 +1438,7 @@ applications.
@section Setting up a build server
In addition to the basic setup previously described, you will also need
-a Vagrant-compatible Debian Testing base box called 'testing32' (or testing64
+a Vagrant-compatible Debian Testing base box called 'jessie32' (or jessie64
for a 64-bit VM, if you want it to be much slower, and require more disk
space).
@@ -1405,10 +1448,16 @@ working copies of source trees are moved from the host to the guest, so
for example, having subversion v1.6 on the host and v1.7 on the guest
would fail.
-Unless you're very trusting. you should create one of these for yourself
-from verified standard Debian installation media. However, you could skip
-over the next few paragraphs (and sacrifice some security) by downloading
-@url{https://f-droid.org/testing32.box}.
+@subsection Creating the Debian base box
+
+The output of this step is a minimal Debian VM that has support for remote
+login and provisioning.
+
+Unless you're very trusting, you should create one of these for yourself
+from verified standard Debian installation media. However, by popular
+demand, the @code{makebuildserver} script will automatically download a
+prebuilt image unless instructed otherwise. If you choose to use the
+prebuilt image, you may safely skip the rest of this section.
Documentation for creating a base box can be found at
@url{http://docs.vagrantup.com/v1/docs/base_boxes.html}.
@@ -1432,8 +1481,9 @@ boot, you need to set @code{GRUB_RECORDFAIL_TIMEOUT} to a value other than
-1 in @code{/etc/grub/default} and then run @code{update-grub}.
@end enumerate
+@subsection Creating the F-Droid base box
-With this base box available, you should then create @code{makebs.config.py},
+The next step in the process is to create @code{makebs.config.py},
using @code{./examples/makebs.config.py} as a reference - look at the settings and
documentation there to decide if any need changing to suit your environment.
There is a path for retrieving the base box if it doesn't exist, and an apt
@@ -1461,7 +1511,23 @@ provisioning scripts detect these, they will be used in preference to
running the android tools. For example, if you have
@code{buildserver/addons/cache/platforms/android-19.tar.gz} that will be
used when installing the android-19 platform, instead of re-downloading it
-using @code{android update sdk --no-ui -t android-19}.
+using @code{android update sdk --no-ui -t android-19}. It is possible to
+create the cache files for these additions from a local installation of
+the SDK that includes them:
+
+@example
+cd /path/to/android-sdk/platforms
+tar czf android-19.tar.gz android-19
+mv android-19.tar.gz /path/to/buildserver/addons/cache/platforms/
+@end example
+
+If you have already built a buildserver, it is also possible to get these
+files directly from the buildserver:
+
+@example
+vagrant ssh -- -C 'tar -C ~/android-sdk/platforms czf android-19.tar.gz android-19'
+vagrant ssh -- -C 'cat ~/android-sdk/platforms/android-19.tar.gz' > /path/to/fdroidserver/buildserver/cache/platforms/android19.tar.gz
+@end example
Once it's complete you'll have a new base box called 'buildserver' which is
what's used for the actual builds. You can then build packages as normal,
diff --git a/docs/gendocs.sh b/docs/gendocs.sh
index e4bfc9fd..0adaf3c4 100755
--- a/docs/gendocs.sh
+++ b/docs/gendocs.sh
@@ -1,8 +1,10 @@
#!/bin/sh -e
# gendocs.sh -- generate a GNU manual in many formats. This script is
# mentioned in maintain.texi. See the help message below for usage details.
-scriptversion=2013-02-03.15
+scriptversion=2014-10-09.23
# Copyright 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011, 2012, 2013
# Free Software Foundation, Inc.
@@ -273,7 +275,7 @@ mkdir -p "$outdir/"
cmd="$SETLANG $MAKEINFO -o $PACKAGE.info $commonarg $infoarg \"$srcfile\""
echo "Generating info... ($cmd)"
eval "$cmd"
-tar czf "$outdir/$PACKAGE.info.tar.gz" $PACKAGE.info*
+tar --create $PACKAGE.info* | gzip --no-name -f -9 --to-stdout > "$outdir/$PACKAGE.info.tar.gz"
ls -l "$outdir/$PACKAGE.info.tar.gz"
info_tgz_size=`calcsize "$outdir/$PACKAGE.info.tar.gz"`
# do not mv the info files, there's no point in having them available
@@ -283,7 +285,7 @@ cmd="$SETLANG $TEXI2DVI $dirargs \"$srcfile\""
printf "\nGenerating dvi... ($cmd)\n"
eval "$cmd"
# compress/finish dvi:
-gzip -f -9 $PACKAGE.dvi
+gzip --no-name -f -9 $PACKAGE.dvi
dvi_gz_size=`calcsize $PACKAGE.dvi.gz`
mv $PACKAGE.dvi.gz "$outdir/"
ls -l "$outdir/$PACKAGE.dvi.gz"
@@ -301,7 +303,7 @@ if $generate_ascii; then
printf "\nGenerating ascii... ($cmd)\n"
eval "$cmd"
ascii_size=`calcsize $PACKAGE.txt`
- gzip -f -9 -c $PACKAGE.txt >"$outdir/$PACKAGE.txt.gz"
+ gzip --no-name -f -9 -c $PACKAGE.txt >"$outdir/$PACKAGE.txt.gz"
ascii_gz_size=`calcsize "$outdir/$PACKAGE.txt.gz"`
mv $PACKAGE.txt "$outdir/"
ls -l "$outdir/$PACKAGE.txt" "$outdir/$PACKAGE.txt.gz"
@@ -317,7 +319,7 @@ html_split()
(
cd ${split_html_dir} || exit 1
ln -sf ${PACKAGE}.html index.html
- tar -czf "$abs_outdir/${PACKAGE}.html_$1.tar.gz" -- *.html
+ tar --create -- *.html | gzip --no-name -f -9 --to-stdout > "$abs_outdir/${PACKAGE}.html_$1.tar.gz"
)
eval html_$1_tgz_size=`calcsize "$outdir/${PACKAGE}.html_$1.tar.gz"`
rm -f "$outdir"/html_$1/*.html
@@ -333,7 +335,7 @@ if test -z "$use_texi2html"; then
rm -rf $PACKAGE.html # in case a directory is left over
eval "$cmd"
html_mono_size=`calcsize $PACKAGE.html`
- gzip -f -9 -c $PACKAGE.html >"$outdir/$PACKAGE.html.gz"
+ gzip --no-name -f -9 -c $PACKAGE.html >"$outdir/$PACKAGE.html.gz"
html_mono_gz_size=`calcsize "$outdir/$PACKAGE.html.gz"`
copy_images "$outdir/" $PACKAGE.html
mv $PACKAGE.html "$outdir/"
@@ -347,7 +349,7 @@ if test -z "$use_texi2html"; then
copy_images $split_html_dir/ $split_html_dir/*.html
(
cd $split_html_dir || exit 1
- tar -czf "$abs_outdir/$PACKAGE.html_$split.tar.gz" -- *
+ tar --create -- * | gzip --no-name -f -9 --to-stdout > "$abs_outdir/$PACKAGE.html_$split.tar.gz"
)
eval \
html_${split}_tgz_size=`calcsize "$outdir/$PACKAGE.html_$split.tar.gz"`
@@ -363,7 +365,7 @@ else # use texi2html:
rm -rf $PACKAGE.html # in case a directory is left over
eval "$cmd"
html_mono_size=`calcsize $PACKAGE.html`
- gzip -f -9 -c $PACKAGE.html >"$outdir/$PACKAGE.html.gz"
+ gzip --no-name -f -9 -c $PACKAGE.html >"$outdir/$PACKAGE.html.gz"
html_mono_gz_size=`calcsize "$outdir/$PACKAGE.html.gz"`
mv $PACKAGE.html "$outdir/"
@@ -377,7 +379,7 @@ d=`dirname $srcfile`
(
cd "$d"
srcfiles=`ls -d *.texinfo *.texi *.txi *.eps $source_extra 2>/dev/null` || true
- tar czfh "$abs_outdir/$PACKAGE.texi.tar.gz" $srcfiles
+ tar --create --dereference $srcfiles | gzip --no-name -f -9 --to-stdout > "$abs_outdir/$PACKAGE.texi.tar.gz"
ls -l "$abs_outdir/$PACKAGE.texi.tar.gz"
)
texi_tgz_size=`calcsize "$outdir/$PACKAGE.texi.tar.gz"`
@@ -388,7 +390,7 @@ if test -n "$docbook"; then
printf "\nGenerating docbook XML... ($cmd)\n"
eval "$cmd"
docbook_xml_size=`calcsize $PACKAGE-db.xml`
- gzip -f -9 -c $PACKAGE-db.xml >"$outdir/$PACKAGE-db.xml.gz"
+ gzip --no-name -f -9 -c $PACKAGE-db.xml >"$outdir/$PACKAGE-db.xml.gz"
docbook_xml_gz_size=`calcsize "$outdir/$PACKAGE-db.xml.gz"`
mv $PACKAGE-db.xml "$outdir/"
@@ -399,7 +401,7 @@ if test -n "$docbook"; then
eval "$cmd"
(
cd ${split_html_db_dir} || exit 1
- tar -czf "$abs_outdir/${PACKAGE}.html_node_db.tar.gz" -- *.html
+ tar --create -- *.html | gzip --no-name -f -9 --to-stdout > "$abs_outdir/${PACKAGE}.html_node_db.tar.gz"
)
html_node_db_tgz_size=`calcsize "$outdir/${PACKAGE}.html_node_db.tar.gz"`
rm -f "$outdir"/html_node_db/*.html
diff --git a/examples/config.py b/examples/config.py
index a2cc50fc..eed07d3c 100644
--- a/examples/config.py
+++ b/examples/config.py
@@ -3,24 +3,28 @@
# Copy this file to config.py, then amend the settings below according to
# your system configuration.
-# Override the path to the Android SDK, $ANDROID_HOME by default
-# sdk_path = "/path/to/android-sdk"
+# Custom path to the Android SDK, defaults to $ANDROID_HOME
+# sdk_path = "/opt/android-sdk"
+
+# Custom paths to various versions of the Android NDK. The default version
+# is 'r10e', located at $ANDROID_NDK, which is what most users will have
+# installed. If a version is missing or set to None, it is assumed not to
+# be installed.
+# ndk_paths = {
+# 'r9b': "/opt/android-ndk-r9b",
+# 'r10e': "/opt/android-ndk",
+# }
-# Override the path to the Android NDK, $ANDROID_NDK by default
-# ndk_path = "/path/to/android-ndk"
# Build tools version to be used
-build_tools = "20.0.0"
+build_tools = "22.0.1"
-# Command for running Ant
-# ant = "/path/to/ant"
+# Command or path to binary for running Ant
ant = "ant"
-# Command for running maven 3
-# mvn3 = "/path/to/mvn"
+# Command or path to binary for running maven 3
mvn3 = "mvn"
-# Command for running Gradle
-# gradle = "/path/to/gradle"
+# Command or path to binary for running Gradle
gradle = "gradle"
# Set the maximum age (in days) of an index that a client should accept from
@@ -31,10 +35,10 @@ gradle = "gradle"
repo_maxage = 0
repo_url = "https://MyFirstFDroidRepo.org/fdroid/repo"
-repo_name = "My First FDroid Repo Demo"
+repo_name = "My First F-Droid Repo Demo"
repo_icon = "fdroid-icon.png"
repo_description = """
-This is a repository of apps to be used with FDroid. Applications in this
+This is a repository of apps to be used with F-Droid. Applications in this
repository are either official binaries built by the original application
developers, or are binaries built from source by the admin of f-droid.org
using the tools on https://gitlab.com/u/fdroid.
@@ -46,12 +50,24 @@ using the tools on https://gitlab.com/u/fdroid.
# repository, and no need to define the other archive_ values.
archive_older = 3
archive_url = "https://f-droid.org/archive"
-archive_name = "My First FDroid Archive Demo"
+archive_name = "My First F-Droid Archive Demo"
archive_icon = "fdroid-icon.png"
archive_description = """
The repository of older versions of applications from the main demo repository.
"""
+# `fdroid update` will create a link to the current version of a given app.
+# This provides a static path to the current APK. To disable the creation of
+# this link, uncomment this:
+# make_current_version_link = False
+
+# By default, the "current version" link will be based on the "Name" of the
+# app from the metadata. You can change it to use a different field from the
+# metadata here:
+# current_version_name_source = 'id'
+
+# Optionally, override home directory for gpg
+# gpghome = '/home/fdroid/somewhere/else/.gnupg'
# The ID of a GPG key for making detached signatures for apks. Optional.
# gpgkey = '1DBA2E89'
@@ -61,10 +77,17 @@ The repository of older versions of applications from the main demo repository.
# jarsigner using -alias. (Not needed in an unsigned repository).
# repo_keyalias = "fdroidrepo"
+# Optionally, the public key for the key defined by repo_keyalias above can
+# be specified here. There is no need to do this, as the public key can and
+# will be retrieved from the keystore when needed. However, specifying it
+# manually can allow some processing to take place without access to the
+# keystore.
+# repo_pubkey = "..."
+
# The keystore to use for release keys when building. This needs to be
# somewhere safe and secure, and backed up! The best way to manage these
# sensitive keys is to use a "smartcard" (aka Hardware Security Module). To
-# configure FDroid to use a smartcard, set the keystore file using the keyword
+# configure F-Droid to use a smartcard, set the keystore file using the keyword
# "NONE" (i.e. keystore = "NONE"). That makes Java find the keystore on the
# smartcard based on 'smartcardoptions' below.
# keystore = "~/.local/share/fdroidserver/keystore.jks"
@@ -188,5 +211,5 @@ build_server_always = False
# Only the fields listed here are supported, defaults shown
char_limits = {
'Summary': 50,
- 'Description': 1500
+ 'Description': 1500,
}
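The `char_limits` dict above caps the length of metadata fields. A minimal sketch of how such limits could be enforced follows; `check_char_limits` is a hypothetical helper written for illustration, not part of fdroidserver's actual API.

```python
# Per-field character limits, matching the defaults in examples/config.py.
char_limits = {
    'Summary': 50,
    'Description': 1500,
}


def check_char_limits(app):
    """Return a list of warnings for metadata fields exceeding their limit."""
    warnings = []
    for field, limit in char_limits.items():
        value = app.get(field, '')
        if len(value) > limit:
            warnings.append("%s is %d chars, max is %d"
                            % (field, len(value), limit))
    return warnings
```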
diff --git a/examples/makebs.config.py b/examples/makebs.config.py
index f01e94a2..9220fb12 100644
--- a/examples/makebs.config.py
+++ b/examples/makebs.config.py
@@ -3,16 +3,20 @@
# You may want to alter these before running ./makebuildserver
# Name of the base box to use
-basebox = "testing32"
+basebox = "jessie32"
-# Location where raring32.box can be found, if you don't already have
+# Location where jessie32.box can be found, if you don't already have
# it. For security reasons, it's recommended that you make your own
# in a secure environment using trusted media (see the manual) but
# you can use this default if you like...
-baseboxurl = "https://f-droid.org/testing32.box"
+baseboxurl = "https://f-droid.org/jessie32.box"
+# The amount of RAM the build server will have
memory = 3584
+# The number of CPUs the build server will have
+cpus = 1
+
# Debian package proxy server - if you have one, e.g. "http://192.168.0.19:8000"
aptproxy = None
diff --git a/fd-commit b/fd-commit
index 54555345..82ca143d 100755
--- a/fd-commit
+++ b/fd-commit
@@ -1,6 +1,6 @@
#!/bin/bash
#
-# fd-commit - part of the FDroid server tools
+# fd-commit - part of the F-Droid server tools
# Commits updates to apps, allowing you to edit the commit messages
#
# Copyright (C) 2013-2014 Daniel Martí
@@ -78,7 +78,6 @@ while read line; do
disable=false
while read line; do
case "$line" in
- *'Maintainer Notes:'*) break ;;
'-Build:'*) onlybuild=false ;;
'+Build:'*)
$newbuild && onlybuild=false
diff --git a/fdroid b/fdroid
index ac32d7c1..f97d7473 100755
--- a/fdroid
+++ b/fdroid
@@ -2,7 +2,7 @@
# -*- coding: utf-8 -*-
#
# fdroid.py - part of the FDroid server tools
-# Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com
+# Copyright (C) 2010-2015, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel Martí
#
# This program is free software: you can redistribute it and/or modify
@@ -40,7 +40,8 @@ commands = {
"scanner": "Scan the source code of a package",
"stats": "Update the stats of the repo",
"server": "Interact with the repo HTTP server",
- }
+ "signindex": "Sign indexes created using update --nosign",
+}
def print_help():
diff --git a/fdroidserver/build.py b/fdroidserver/build.py
index b38250e4..0c2d67dd 100644
--- a/fdroidserver/build.py
+++ b/fdroidserver/build.py
@@ -35,7 +35,7 @@ import logging
import common
import metadata
-from common import FDroidException, BuildException, VCSException, FDroidPopen, SilentPopen
+from common import FDroidException, BuildException, VCSException, FDroidPopen, SdkToolsPopen
try:
import paramiko
@@ -48,7 +48,7 @@ def get_builder_vm_id():
if os.path.isdir(vd):
# Vagrant 1.2 (and maybe 1.1?) it's a directory tree...
with open(os.path.join(vd, 'machines', 'default',
- 'virtualbox', 'id')) as vf:
+ 'virtualbox', 'id')) as vf:
id = vf.read()
return id
else:
@@ -71,7 +71,7 @@ def got_valid_builder_vm():
return True
# Vagrant 1.2 - the directory can exist, but the id can be missing...
if not os.path.exists(os.path.join(vd, 'machines', 'default',
- 'virtualbox', 'id')):
+ 'virtualbox', 'id')):
return False
return True
@@ -175,7 +175,7 @@ def get_clean_vm(reset=False):
shutil.rmtree('builder')
os.mkdir('builder')
- p = subprocess.Popen('vagrant --version', shell=True,
+ p = subprocess.Popen(['vagrant', '--version'],
stdout=subprocess.PIPE)
vver = p.communicate()[0]
if vver.startswith('Vagrant version 1.2'):
@@ -302,7 +302,7 @@ def build_server(app, thisbuild, vcs, build_dir, output_dir, force):
ftp.put(os.path.join(serverpath, 'common.py'), 'common.py')
ftp.put(os.path.join(serverpath, 'metadata.py'), 'metadata.py')
ftp.put(os.path.join(serverpath, '..', 'buildserver',
- 'config.buildserver.py'), 'config.py')
+ 'config.buildserver.py'), 'config.py')
ftp.chmod('config.py', 0o600)
# Copy over the ID (head commit hash) of the fdroidserver in use...
@@ -348,8 +348,7 @@ def build_server(app, thisbuild, vcs, build_dir, output_dir, force):
if thisbuild['srclibs']:
for lib in thisbuild['srclibs']:
srclibpaths.append(
- common.getsrclib(lib, 'build/srclib', srclibpaths,
- basepath=True, prepare=False))
+ common.getsrclib(lib, 'build/srclib', basepath=True, prepare=False))
# If one was used for the main source, add that too.
basesrclib = vcs.getsrclib()
@@ -428,35 +427,63 @@ def build_server(app, thisbuild, vcs, build_dir, output_dir, force):
def adapt_gradle(build_dir):
+ filename = 'build.gradle'
for root, dirs, files in os.walk(build_dir):
- if 'build.gradle' in files:
- path = os.path.join(root, 'build.gradle')
- logging.debug("Adapting build.gradle at %s" % path)
-
- FDroidPopen(['sed', '-i',
- r's@buildToolsVersion\([ =]*\)["\'][0-9\.]*["\']@buildToolsVersion\1"'
- + config['build_tools'] + '"@g', path])
-
-
-def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver):
+ for filename in files:
+ if not filename.endswith('.gradle'):
+ continue
+ path = os.path.join(root, filename)
+ if not os.path.isfile(path):
+ continue
+ logging.debug("Adapting %s at %s" % (filename, path))
+ common.regsub_file(r"""(\s*)buildToolsVersion([\s=]+)['"].*""",
+ r"""\1buildToolsVersion\2'%s'""" % config['build_tools'],
+ path)
+
+
+def capitalize_intact(string):
+ """Like str.capitalize(), but leave the rest of the string intact without
+ switching it to lowercase."""
+ if len(string) == 0:
+ return string
+ if len(string) == 1:
+ return string.upper()
+ return string[0].upper() + string[1:]
+
+
+def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver, refresh):
"""Do a build locally."""
if thisbuild['buildjni'] and thisbuild['buildjni'] != ['no']:
- if not config['ndk_path']:
- logging.critical("$ANDROID_NDK is not set!")
+ if not thisbuild['ndk_path']:
+ logging.critical("Android NDK version '%s' could not be found!" % thisbuild['ndk'])
+ logging.critical("Configured versions:")
+ for k, v in config['ndk_paths'].iteritems():
+ if k.endswith("_orig"):
+ continue
+ logging.critical(" %s: %s" % (k, v))
sys.exit(3)
- elif not os.path.isdir(config['sdk_path']):
- logging.critical("$ANDROID_NDK points to a non-existing directory!")
+ elif not os.path.isdir(thisbuild['ndk_path']):
+ logging.critical("Android NDK '%s' is not a directory!" % thisbuild['ndk_path'])
sys.exit(3)
+ # Set up environment vars that depend on each build
+ for n in ['ANDROID_NDK', 'NDK', 'ANDROID_NDK_HOME']:
+ common.env[n] = thisbuild['ndk_path']
+
+ common.reset_env_path()
+ # Set up the current NDK to the PATH
+ common.add_to_env_path(thisbuild['ndk_path'])
+
# Prepare the source code...
root_dir, srclibpaths = common.prepare_source(vcs, app, thisbuild,
build_dir, srclib_dir,
- extlib_dir, onserver)
+ extlib_dir, onserver, refresh)
# We need to clean via the build tool in case the binary dirs are
# different from the default ones
p = None
+ gradletasks = []
if thisbuild['type'] == 'maven':
logging.info("Cleaning Maven project...")
cmd = [config['mvn3'], 'clean', '-Dandroid.sdk.path=' + config['sdk_path']]
@@ -472,12 +499,33 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
elif thisbuild['type'] == 'gradle':
logging.info("Cleaning Gradle project...")
- cmd = [config['gradle'], 'clean']
+
+ if thisbuild['preassemble']:
+ gradletasks += thisbuild['preassemble']
+
+ flavours = thisbuild['gradle']
+ if flavours == ['yes']:
+ flavours = []
+
+ flavours_cmd = ''.join([capitalize_intact(f) for f in flavours])
+
+ gradletasks += ['assemble' + flavours_cmd + 'Release']
adapt_gradle(build_dir)
for name, number, libpath in srclibpaths:
adapt_gradle(libpath)
+ cmd = [config['gradle']]
+ if thisbuild['gradleprops']:
+ cmd += ['-P'+kv for kv in thisbuild['gradleprops']]
+
+ for task in gradletasks:
+ parts = task.split(':')
+ parts[-1] = 'clean' + capitalize_intact(parts[-1])
+ cmd += [':'.join(parts)]
+
+ cmd += ['clean']
+
p = FDroidPopen(cmd, cwd=root_dir)
elif thisbuild['type'] == 'kivy':
@@ -501,13 +549,16 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
if 'gradle' in dirs:
shutil.rmtree(os.path.join(root, 'gradle'))
- if not options.skipscan:
+ if options.skipscan:
+ if thisbuild['scandelete']:
+ raise BuildException("Refusing to skip source scan since scandelete is present")
+ else:
# Scan before building...
logging.info("Scanning source for common problems...")
count = common.scan_source(build_dir, root_dir, thisbuild)
if count > 0:
if force:
- logging.warn('Scanner found %d problems:' % count)
+ logging.warn('Scanner found %d problems' % count)
else:
raise BuildException("Can't build due to %d errors while scanning" % count)
@@ -522,29 +573,10 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
tarball.add(build_dir, tarname, exclude=tarexc)
tarball.close()
- if onserver:
- manifest = os.path.join(root_dir, 'AndroidManifest.xml')
- if os.path.exists(manifest):
- homedir = os.path.expanduser('~')
- with open(os.path.join(homedir, 'buildserverid'), 'r') as f:
- buildserverid = f.read()
- with open(os.path.join(homedir, 'fdroidserverid'), 'r') as f:
- fdroidserverid = f.read()
- with open(manifest, 'r') as f:
- manifestcontent = f.read()
- manifestcontent = manifestcontent.replace('</manifest>',
- '<!-- buildserverid: ' + buildserverid + ' -->\n'
- '<!-- fdroidserverid: ' + fdroidserverid + ' -->\n</manifest>')
- with open(manifest, 'w') as f:
- f.write(manifestcontent)
-
# Run a build command if one is required...
if thisbuild['build']:
logging.info("Running 'build' commands in %s" % root_dir)
- cmd = common.replace_config_vars(thisbuild['build'])
+ cmd = common.replace_config_vars(thisbuild['build'], thisbuild)
# Substitute source library paths into commands...
for name, number, libpath in srclibpaths:
@@ -564,7 +596,7 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
if jni_components == ['yes']:
jni_components = ['']
- cmd = [os.path.join(config['ndk_path'], "ndk-build"), "-j1"]
+ cmd = [os.path.join(thisbuild['ndk_path'], "ndk-build"), "-j1"]
for d in jni_components:
if d:
logging.info("Building native code in '%s'" % d)
@@ -601,17 +633,13 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
'package']
if thisbuild['target']:
target = thisbuild["target"].split('-')[1]
- FDroidPopen(['sed', '-i',
- 's@[0-9]*@'
- + target + '@g',
- 'pom.xml'],
- cwd=root_dir)
+ common.regsub_file(r'[0-9]*',
+ r'%s' % target,
+ os.path.join(root_dir, 'pom.xml'))
if '@' in thisbuild['maven']:
- FDroidPopen(['sed', '-i',
- 's@[0-9]*@'
- + target + '@g',
- 'pom.xml'],
- cwd=maven_dir)
+ common.regsub_file(r'[0-9]*',
+ r'%s' % target,
+ os.path.join(maven_dir, 'pom.xml'))
p = FDroidPopen(mvncmd, cwd=maven_dir)
@@ -637,14 +665,14 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
modules = bconfig.get('app', 'requirements').split(',')
cmd = 'ANDROIDSDK=' + config['sdk_path']
- cmd += ' ANDROIDNDK=' + config['ndk_path']
- cmd += ' ANDROIDNDKVER=r9'
+ cmd += ' ANDROIDNDK=' + thisbuild['ndk_path']
+ cmd += ' ANDROIDNDKVER=' + thisbuild['ndk']
cmd += ' ANDROIDAPI=' + str(bconfig.get('app', 'android.api'))
cmd += ' VIRTUALENV=virtualenv'
cmd += ' ./distribute.sh'
cmd += ' -m ' + "'" + ' '.join(modules) + "'"
cmd += ' -d fdroid'
- p = FDroidPopen(cmd, cwd='python-for-android', shell=True)
+ p = subprocess.Popen(cmd, cwd='python-for-android', shell=True)
if p.returncode != 0:
raise BuildException("Distribute build failed")
@@ -680,33 +708,25 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
elif thisbuild['type'] == 'gradle':
logging.info("Building Gradle project...")
- flavours = thisbuild['gradle'].split(',')
-
- if len(flavours) == 1 and flavours[0] in ['main', 'yes', '']:
- flavours[0] = ''
-
- commands = [config['gradle']]
- if thisbuild['preassemble']:
- commands += thisbuild['preassemble'].split()
-
- flavours_cmd = ''.join(flavours)
- if flavours_cmd:
- flavours_cmd = flavours_cmd[0].upper() + flavours_cmd[1:]
-
- commands += ['assemble' + flavours_cmd + 'Release']
# Avoid having to use lintOptions.abortOnError false
if thisbuild['gradlepluginver'] >= LooseVersion('0.7'):
with open(os.path.join(root_dir, 'build.gradle'), "a") as f:
f.write("\nandroid { lintOptions { checkReleaseBuilds false } }\n")
- p = FDroidPopen(commands, cwd=root_dir)
+ cmd = [config['gradle']]
+ if thisbuild['gradleprops']:
+ cmd += ['-P'+kv for kv in thisbuild['gradleprops']]
+
+ cmd += gradletasks
+
+ p = FDroidPopen(cmd, cwd=root_dir)
elif thisbuild['type'] == 'ant':
logging.info("Building Ant project...")
cmd = ['ant']
- if thisbuild['antcommand']:
- cmd += [thisbuild['antcommand']]
+ if thisbuild['antcommands']:
+ cmd += thisbuild['antcommands']
else:
cmd += ['release']
p = FDroidPopen(cmd, cwd=root_dir)
@@ -774,7 +794,7 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
if not os.path.exists(src):
raise BuildException("Unsigned apk is not at expected location of " + src)
- p = SilentPopen([config['aapt'], 'dump', 'badging', src])
+ p = SdkToolsPopen(['aapt', 'dump', 'badging', src], output=False)
vercode = None
version = None
@@ -833,6 +853,19 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
str(thisbuild['vercode']))
)
+ # Add information for 'fdroid verify' to be able to reproduce the build
+ # environment.
+ if onserver:
+ metadir = os.path.join(tmp_dir, 'META-INF')
+ if not os.path.exists(metadir):
+ os.mkdir(metadir)
+ homedir = os.path.expanduser('~')
+ for fn in ['buildserverid', 'fdroidserverid']:
+ shutil.copyfile(os.path.join(homedir, fn),
+ os.path.join(metadir, fn))
+ subprocess.call(['jar', 'uf', os.path.abspath(src),
+ 'META-INF/' + fn], cwd=tmp_dir)
+
# Copy the unsigned apk to our destination directory for further
# processing (by publish.py)...
dest = os.path.join(output_dir, common.getapkname(app, thisbuild))
@@ -845,7 +878,7 @@ def build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_d
def trybuild(app, thisbuild, build_dir, output_dir, also_check_dir, srclib_dir, extlib_dir,
- tmp_dir, repo_dir, vcs, test, server, force, onserver):
+ tmp_dir, repo_dir, vcs, test, server, force, onserver, refresh):
"""
Build a particular version of an application, if it needs building.
@@ -890,7 +923,7 @@ def trybuild(app, thisbuild, build_dir, output_dir, also_check_dir, srclib_dir,
build_server(app, thisbuild, vcs, build_dir, output_dir, force)
else:
- build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver)
+ build_local(app, thisbuild, vcs, build_dir, output_dir, srclib_dir, extlib_dir, tmp_dir, force, onserver, refresh)
return True
@@ -918,6 +951,8 @@ def parse_commandline():
help="Skip scanning the source code for binaries and other problems")
parser.add_option("--no-tarball", dest="notarball", action="store_true", default=False,
help="Don't create a source tarball, useful when testing a build")
+ parser.add_option("--no-refresh", dest="refresh", action="store_false", default=True,
+ help="Don't refresh the repository, useful when testing a build with no internet connection")
parser.add_option("-f", "--force", action="store_true", default=False,
help="Force build of disabled apps, and carries on regardless of scan problems. Only allowed in test mode.")
parser.add_option("-a", "--all", action="store_true", default=False,
@@ -1043,7 +1078,22 @@ def main():
also_check_dir, srclib_dir, extlib_dir,
tmp_dir, repo_dir, vcs, options.test,
options.server, options.force,
- options.onserver):
+ options.onserver, options.refresh):
+
+ if app.get('Binaries', None):
+ # This is an app where we build from source, and
+ # verify the apk contents against a developer's
+ # binary. We get that binary now, and save it
+ # alongside our built one in the 'unsigned'
+ # directory.
+ url = app['Binaries']
+ url = url.replace('%v', thisbuild['version'])
+ url = url.replace('%c', str(thisbuild['vercode']))
+ logging.info("...retrieving " + url)
+ of = "{0}_{1}.apk.binary".format(app['id'], thisbuild['vercode'])
+ of = os.path.join(output_dir, of)
+ common.download_file(url, local_filename=of)
+
build_succeeded.append(app)
wikilog = "Build succeeded"
except BuildException as be:
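The `capitalize_intact` helper added in this diff exists because `str.capitalize()` lowercases the rest of the string, which would mangle camelCase gradle flavour names when building `assemble...Release` task names. Its behaviour, and the task string it helps build, can be sketched standalone (the flavour names are illustrative):

```python
def capitalize_intact(string):
    """Like str.capitalize(), but leave the rest of the string intact
    without switching it to lowercase."""
    if len(string) == 0:
        return string
    if len(string) == 1:
        return string.upper()
    return string[0].upper() + string[1:]


# Flavour list as it would come from a build's 'gradle' field.
flavours = ['free', 'devDebug']
task = 'assemble' + ''.join(capitalize_intact(f) for f in flavours) + 'Release'
# str.capitalize() would have produced 'Devdebug' instead of 'DevDebug'
```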
diff --git a/fdroidserver/checkupdates.py b/fdroidserver/checkupdates.py
index 4c1b5728..494a1834 100644
--- a/fdroidserver/checkupdates.py
+++ b/fdroidserver/checkupdates.py
@@ -2,7 +2,7 @@
# -*- coding: utf-8 -*-
#
# checkupdates.py - part of the FDroid server tools
-# Copyright (C) 2010-13, Ciaran Gultnieks, ciaran@ciarang.com
+# Copyright (C) 2010-2015, Ciaran Gultnieks, ciaran@ciarang.com
# Copyright (C) 2013-2014 Daniel MartÃ
#
# This program is free software: you can redistribute it and/or modify
@@ -80,17 +80,25 @@ def check_http(app):
return (None, msg)
+def app_matches_packagename(app, package):
+ if not package:
+ return False
+ appid = app['Update Check Name'] or app['id']
+ if appid == "Ignore":
+ return True
+ return appid == package
+
+
# Check for a new version by looking at the tags in the source repo.
# Whether this can be used reliably or not depends on
# the development procedures used by the project's developers. Use it with
# caution, because it's inappropriate for many projects.
-# Returns (None, "a message") if this didn't work, or (version, vercode) for
+# Returns (None, "a message") if this didn't work, or (version, vercode, tag) for
# the details of the current version.
def check_tags(app, pattern):
try:
- appid = app['Update Check Name'] or app['id']
if app['Repo Type'] == 'srclib':
build_dir = os.path.join('build', 'srclib', app['Repo'])
repotype = common.getsrclibvcs(app['Repo'])
@@ -109,14 +117,12 @@ def check_tags(app, pattern):
vcs.gotorevision(None)
- flavour = None
+ flavours = []
if len(app['builds']) > 0:
if app['builds'][-1]['subdir']:
build_dir = os.path.join(build_dir, app['builds'][-1]['subdir'])
if app['builds'][-1]['gradle']:
- flavour = app['builds'][-1]['gradle']
- if flavour == 'yes':
- flavour = None
+ flavours = app['builds'][-1]['gradle']
hpak = None
htag = None
@@ -124,22 +130,25 @@ def check_tags(app, pattern):
hcode = "0"
tags = vcs.gettags()
+ logging.debug("All tags: " + ','.join(tags))
if pattern:
pat = re.compile(pattern)
tags = [tag for tag in tags if pat.match(tag)]
+ logging.debug("Matching tags: " + ','.join(tags))
if repotype in ('git',):
tags = vcs.latesttags(tags, 5)
+ logging.debug("Latest tags: " + ','.join(tags))
for tag in tags:
logging.debug("Check tag: '{0}'".format(tag))
vcs.gotorevision(tag)
# Only process tags where the manifest exists...
- paths = common.manifest_paths(build_dir, flavour)
+ paths = common.manifest_paths(build_dir, flavours)
version, vercode, package = \
common.parse_androidmanifests(paths, app['Update Check Ignore'])
- if not package or package != appid or not version or not vercode:
+ if not app_matches_packagename(app, package) or not version or not vercode:
continue
logging.debug("Manifest exists. Found version {0} ({1})"
@@ -174,7 +183,6 @@ def check_repomanifest(app, branch=None):
try:
- appid = app['Update Check Name'] or app['id']
if app['Repo Type'] == 'srclib':
build_dir = os.path.join('build', 'srclib', app['Repo'])
repotype = common.getsrclibvcs(app['Repo'])
@@ -196,27 +204,24 @@ def check_repomanifest(app, branch=None):
elif repotype == 'bzr':
vcs.gotorevision(None)
- flavour = None
-
+ flavours = []
if len(app['builds']) > 0:
if app['builds'][-1]['subdir']:
build_dir = os.path.join(build_dir, app['builds'][-1]['subdir'])
if app['builds'][-1]['gradle']:
- flavour = app['builds'][-1]['gradle']
- if flavour == 'yes':
- flavour = None
+ flavours = app['builds'][-1]['gradle']
if not os.path.isdir(build_dir):
return (None, "Subdir '" + app['builds'][-1]['subdir'] + "'is not a valid directory")
- paths = common.manifest_paths(build_dir, flavour)
+ paths = common.manifest_paths(build_dir, flavours)
version, vercode, package = \
common.parse_androidmanifests(paths, app['Update Check Ignore'])
if not package:
return (None, "Couldn't find package ID")
- if package != appid:
- return (None, "Package ID mismatch")
+ if not app_matches_packagename(app, package):
+ return (None, "Package ID mismatch - got {0}".format(package))
if not version:
return (None, "Couldn't find latest version name")
if not vercode:
@@ -310,7 +315,6 @@ def dirs_with_manifest(startdir):
# subdir relative to the build dir if found, None otherwise.
def check_changed_subdir(app):
- appid = app['Update Check Name'] or app['id']
if app['Repo Type'] == 'srclib':
build_dir = os.path.join('build', 'srclib', app['Repo'])
else:
@@ -319,17 +323,15 @@ def check_changed_subdir(app):
if not os.path.isdir(build_dir):
return None
- flavour = None
+ flavours = []
if len(app['builds']) > 0 and app['builds'][-1]['gradle']:
- flavour = app['builds'][-1]['gradle']
- if flavour == 'yes':
- flavour = None
+ flavours = app['builds'][-1]['gradle']
for d in dirs_with_manifest(build_dir):
logging.debug("Trying possible dir %s." % d)
- m_paths = common.manifest_paths(d, flavour)
+ m_paths = common.manifest_paths(d, flavours)
package = common.parse_androidmanifests(m_paths, app['Update Check Ignore'])[2]
- if package and package == appid:
+ if app_matches_packagename(app, package):
logging.debug("Manifest exists in possible dir %s." % d)
return os.path.relpath(d, build_dir)
@@ -352,18 +354,15 @@ def fetch_autoname(app, tag):
except VCSException:
return None
- flavour = None
+ flavours = []
if len(app['builds']) > 0:
if app['builds'][-1]['subdir']:
app_dir = os.path.join(app_dir, app['builds'][-1]['subdir'])
if app['builds'][-1]['gradle']:
- flavour = app['builds'][-1]['gradle']
- if flavour == 'yes':
- flavour = None
+ flavours = app['builds'][-1]['gradle']
- logging.debug("...fetch auto name from " + app_dir +
- ((" (flavour: %s)" % flavour) if flavour else ""))
- new_name = common.fetch_real_name(app_dir, flavour)
+ logging.debug("...fetch auto name from " + app_dir)
+ new_name = common.fetch_real_name(app_dir, flavours)
commitmsg = None
if new_name:
logging.debug("...got autoname '" + new_name + "'")
@@ -374,13 +373,6 @@ def fetch_autoname(app, tag):
else:
logging.debug("...couldn't get autoname")
- if app['Current Version'].startswith('@string/'):
- cv = common.version_name(app['Current Version'], app_dir, flavour)
- if app['Current Version'] != cv:
- app['Current Version'] = cv
- if not commitmsg:
- commitmsg = "Fix CV of {0}".format(common.getappname(app))
-
return commitmsg
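The new `app_matches_packagename` helper above replaces several copies of the same comparison in checkupdates.py. Reproduced in isolation, with usage examples matching the semantics shown in the diff ('Update Check Name' overrides the app id, and the special value "Ignore" matches anything):

```python
def app_matches_packagename(app, package):
    """True if the package ID parsed from a manifest belongs to this app."""
    if not package:
        return False
    appid = app['Update Check Name'] or app['id']
    if appid == "Ignore":
        return True
    return appid == package
```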
diff --git a/fdroidserver/common.py b/fdroidserver/common.py
index 1d262a25..3e085624 100644
--- a/fdroidserver/common.py
+++ b/fdroidserver/common.py
@@ -22,57 +22,110 @@ import sys
import re
import shutil
import glob
+import requests
import stat
import subprocess
import time
import operator
import Queue
import threading
-import magic
import logging
+import hashlib
+import socket
+import xml.etree.ElementTree as XMLElementTree
+
from distutils.version import LooseVersion
+from zipfile import ZipFile
import metadata
+XMLElementTree.register_namespace('android', 'http://schemas.android.com/apk/res/android')
+
config = None
options = None
env = None
+orig_path = None
+
+
+default_config = {
+ 'sdk_path': "$ANDROID_HOME",
+ 'ndk_paths': {
+ 'r9b': None,
+ 'r10e': "$ANDROID_NDK"
+ },
+ 'build_tools': "23.0.0",
+ 'ant': "ant",
+ 'mvn3': "mvn",
+ 'gradle': 'gradle',
+ 'sync_from_local_copy_dir': False,
+ 'make_current_version_link': True,
+ 'current_version_name_source': 'Name',
+ 'update_stats': False,
+ 'stats_ignore': [],
+ 'stats_server': None,
+ 'stats_user': None,
+ 'stats_to_carbon': False,
+ 'repo_maxage': 0,
+ 'build_server_always': False,
+ 'keystore': 'keystore.jks',
+ 'smartcardoptions': [],
+ 'char_limits': {
+ 'Summary': 80,
+ 'Description': 4000
+ },
+ 'keyaliases': {},
+ 'repo_url': "https://MyFirstFDroidRepo.org/fdroid/repo",
+ 'repo_name': "My First FDroid Repo Demo",
+ 'repo_icon': "fdroid-icon.png",
+ 'repo_description': '''
+ This is a repository of apps to be used with FDroid. Applications in this
+ repository are either official binaries built by the original application
+ developers, or are binaries built from source by the admin of f-droid.org
+ using the tools on https://gitlab.com/u/fdroid.
+ ''',
+ 'archive_older': 0,
+}
+
+
+def fill_config_defaults(thisconfig):
+ for k, v in default_config.items():
+ if k not in thisconfig:
+ thisconfig[k] = v
+
+ # Expand paths (~users and $vars)
+ def expand_path(path):
+ if path is None:
+ return None
+ orig = path
+ path = os.path.expanduser(path)
+ path = os.path.expandvars(path)
+ if orig == path:
+ return None
+ return path
+
+ for k in ['sdk_path', 'ant', 'mvn3', 'gradle', 'keystore', 'repo_icon']:
+ v = thisconfig[k]
+ exp = expand_path(v)
+ if exp is not None:
+ thisconfig[k] = exp
+ thisconfig[k + '_orig'] = v
+ for k in ['ndk_paths']:
+ d = thisconfig[k]
+ for k2 in d.copy():
+ v = d[k2]
+ exp = expand_path(v)
+ if exp is not None:
+ thisconfig[k][k2] = exp
+ thisconfig[k][k2 + '_orig'] = v
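The expansion logic added above can be exercised on its own. A minimal standalone sketch of the inner helper (same behavior as in the diff: returns the expanded path, or None when nothing changed, so callers can stash the original value under `<key>_orig`):

```python
import os

def expand_path(path):
    """Expand ~user and $VAR references in a config path.

    Returns the expanded path, or None if the input needed no expansion,
    mirroring the helper fill_config_defaults() defines in the diff.
    """
    if path is None:
        return None
    orig = path
    path = os.path.expanduser(path)
    path = os.path.expandvars(path)
    if orig == path:
        return None
    return path
```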
-def get_default_config():
- return {
- 'sdk_path': os.getenv("ANDROID_HOME") or "",
- 'ndk_path': os.getenv("ANDROID_NDK") or "",
- 'build_tools': "20.0.0",
- 'ant': "ant",
- 'mvn3': "mvn",
- 'gradle': 'gradle',
- 'sync_from_local_copy_dir': False,
- 'update_stats': False,
- 'stats_ignore': [],
- 'stats_server': None,
- 'stats_user': None,
- 'stats_to_carbon': False,
- 'repo_maxage': 0,
- 'build_server_always': False,
- 'keystore': os.path.join(os.getenv("HOME"), '.local', 'share', 'fdroidserver', 'keystore.jks'),
- 'smartcardoptions': [],
- 'char_limits': {
- 'Summary': 50,
- 'Description': 1500
- },
- 'keyaliases': {},
- 'repo_url': "https://MyFirstFDroidRepo.org/fdroid/repo",
- 'repo_name': "My First FDroid Repo Demo",
- 'repo_icon': "fdroid-icon.png",
- 'repo_description': '''
- This is a repository of apps to be used with FDroid. Applications in this
- repository are either official binaries built by the original application
- developers, or are binaries built from source by the admin of f-droid.org
- using the tools on https://gitlab.com/u/fdroid.
- ''',
- 'archive_older': 0,
- }
+
+def regsub_file(pattern, repl, path):
+ with open(path, 'r') as f:
+ text = f.read()
+ text = re.sub(pattern, repl, text)
+ with open(path, 'w') as f:
+ f.write(text)
def read_config(opts, config_file='config.py'):
@@ -81,7 +134,7 @@ def read_config(opts, config_file='config.py'):
The config is read from config_file, which is in the current directory when
any of the repo management commands are used.
"""
- global config, options, env
+ global config, options, env, orig_path
if config is not None:
return config
@@ -111,57 +164,14 @@ def read_config(opts, config_file='config.py'):
if st.st_mode & stat.S_IRWXG or st.st_mode & stat.S_IRWXO:
logging.warn("unsafe permissions on {0} (should be 0600)!".format(config_file))
- defconfig = get_default_config()
- for k, v in defconfig.items():
- if k not in config:
- config[k] = v
-
- # Expand environment variables
- for k, v in config.items():
- if type(v) != str:
- continue
- v = os.path.expanduser(v)
- config[k] = os.path.expandvars(v)
-
- if not test_sdk_exists(config):
- sys.exit(3)
-
- if not test_build_tools_exists(config):
- sys.exit(3)
-
- bin_paths = {
- 'aapt': [
- os.path.join(config['sdk_path'], 'build-tools', config['build_tools'], 'aapt'),
- ],
- 'zipalign': [
- os.path.join(config['sdk_path'], 'tools', 'zipalign'),
- os.path.join(config['sdk_path'], 'build-tools', config['build_tools'], 'zipalign'),
- ],
- 'android': [
- os.path.join(config['sdk_path'], 'tools', 'android'),
- ],
- 'adb': [
- os.path.join(config['sdk_path'], 'platform-tools', 'adb'),
- ],
- }
-
- for b, paths in bin_paths.items():
- config[b] = None
- for path in paths:
- if os.path.isfile(path):
- config[b] = path
- break
- if config[b] is None:
- logging.warn("Could not find %s in any of the following paths:\n%s" % (
- b, '\n'.join(paths)))
+ fill_config_defaults(config)
# There is no standard, so just set up the most common environment
# variables
env = os.environ
+ orig_path = env['PATH']
for n in ['ANDROID_HOME', 'ANDROID_SDK']:
env[n] = config['sdk_path']
- for n in ['ANDROID_NDK', 'NDK']:
- env[n] = config['ndk_path']
for k in ["keystorepass", "keypass"]:
if k in config:
@@ -190,37 +200,82 @@ def read_config(opts, config_file='config.py'):
return config
-def test_sdk_exists(c):
- if c['sdk_path'] is None:
- # c['sdk_path'] is set to the value of ANDROID_HOME by default
- logging.error('No Android SDK found! ANDROID_HOME is not set and sdk_path is not in config.py!')
+def get_ndk_path(version):
+ if version is None:
+ version = 'r10e' # falls back to latest
+ paths = config['ndk_paths']
+ if version not in paths:
+ return ''
+ return paths[version] or ''
+
+
+def find_sdk_tools_cmd(cmd):
+ '''find a working path to a tool from the Android SDK'''
+
+ tooldirs = []
+ if config is not None and 'sdk_path' in config and os.path.exists(config['sdk_path']):
+ # try to find a working path to this command, in all the recent possible paths
+ if 'build_tools' in config:
+ build_tools = os.path.join(config['sdk_path'], 'build-tools')
+ # if 'build_tools' was manually set and exists, check only that one
+ configed_build_tools = os.path.join(build_tools, config['build_tools'])
+ if os.path.exists(configed_build_tools):
+ tooldirs.append(configed_build_tools)
+ else:
+ # no configed version, so hunt known paths for it
+ for f in sorted(os.listdir(build_tools), reverse=True):
+ if os.path.isdir(os.path.join(build_tools, f)):
+ tooldirs.append(os.path.join(build_tools, f))
+ tooldirs.append(build_tools)
+ sdk_tools = os.path.join(config['sdk_path'], 'tools')
+ if os.path.exists(sdk_tools):
+ tooldirs.append(sdk_tools)
+ sdk_platform_tools = os.path.join(config['sdk_path'], 'platform-tools')
+ if os.path.exists(sdk_platform_tools):
+ tooldirs.append(sdk_platform_tools)
+ tooldirs.append('/usr/bin')
+ for d in tooldirs:
+ if os.path.isfile(os.path.join(d, cmd)):
+ return os.path.join(d, cmd)
+ # did not find the command, exit with error message
+ ensure_build_tools_exists(config)
+
+
+def test_sdk_exists(thisconfig):
+ if 'sdk_path' not in thisconfig:
+ if 'aapt' in thisconfig and os.path.isfile(thisconfig['aapt']):
+ return True
+ else:
+ logging.error("'sdk_path' not set in config.py!")
+ return False
+ if thisconfig['sdk_path'] == default_config['sdk_path']:
+ logging.error('No Android SDK found!')
logging.error('You can use ANDROID_HOME to set the path to your SDK, i.e.:')
logging.error('\texport ANDROID_HOME=/opt/android-sdk')
return False
- if not os.path.exists(c['sdk_path']):
- logging.critical('Android SDK path "' + c['sdk_path'] + '" does not exist!')
+ if not os.path.exists(thisconfig['sdk_path']):
+ logging.critical('Android SDK path "' + thisconfig['sdk_path'] + '" does not exist!')
return False
- if not os.path.isdir(c['sdk_path']):
- logging.critical('Android SDK path "' + c['sdk_path'] + '" is not a directory!')
+ if not os.path.isdir(thisconfig['sdk_path']):
+ logging.critical('Android SDK path "' + thisconfig['sdk_path'] + '" is not a directory!')
return False
for d in ['build-tools', 'platform-tools', 'tools']:
- if not os.path.isdir(os.path.join(c['sdk_path'], d)):
+ if not os.path.isdir(os.path.join(thisconfig['sdk_path'], d)):
logging.critical('Android SDK path "%s" does not contain "%s/"!' % (
- c['sdk_path'], d))
+ thisconfig['sdk_path'], d))
return False
return True
-def test_build_tools_exists(c):
- if not test_sdk_exists(c):
- return False
- build_tools = os.path.join(c['sdk_path'], 'build-tools')
- versioned_build_tools = os.path.join(build_tools, c['build_tools'])
+def ensure_build_tools_exists(thisconfig):
+ if not test_sdk_exists(thisconfig):
+ sys.exit(3)
+ build_tools = os.path.join(thisconfig['sdk_path'], 'build-tools')
+ versioned_build_tools = os.path.join(build_tools, thisconfig['build_tools'])
if not os.path.isdir(versioned_build_tools):
logging.critical('Android Build Tools path "'
+ versioned_build_tools + '" does not exist!')
- return False
- return True
+ sys.exit(3)
def write_password_file(pwtype, password=None):
@@ -380,12 +435,15 @@ def getsrclibvcs(name):
class vcs:
+
def __init__(self, remote, local):
# svn, git-svn and bzr may require auth
self.username = None
if self.repotype() in ('git-svn', 'bzr'):
if '@' in remote:
+ if self.repotype == 'git-svn':
+ raise VCSException("Authentication is not supported for git-svn")
self.username, remote = remote.split('@')
if ':' not in self.username:
raise VCSException("Password required with username")
@@ -407,7 +465,7 @@ class vcs:
# lifetime of the vcs object.
# None is acceptable for 'rev' if you know you are cloning a clean copy of
# the repo - otherwise it must specify a valid revision.
- def gotorevision(self, rev):
+ def gotorevision(self, rev, refresh=True):
if self.clone_failed:
raise VCSException("Downloading the repository already failed once, not trying again.")
@@ -428,9 +486,8 @@ class vcs:
writeback = False
else:
deleterepo = True
- logging.info(
- "Repository details for %s changed - deleting" % (
- self.local))
+ logging.info("Repository details for %s changed - deleting" % (
+ self.local))
else:
deleterepo = True
logging.info("Repository details for %s missing - deleting" % (
@@ -439,6 +496,8 @@ class vcs:
shutil.rmtree(self.local)
exc = None
+ if not refresh:
+ self.refreshed = True
try:
self.gotorevisionx(rev)
@@ -464,10 +523,22 @@ class vcs:
# Get a list of all known tags
def gettags(self):
- raise VCSException('gettags not supported for this vcs type')
-
- # Get a list of latest number tags
- def latesttags(self, number):
+ if not self._gettags:
+ raise VCSException('gettags not supported for this vcs type')
+ rtags = []
+ for tag in self._gettags():
+ if re.match('[-A-Za-z0-9_. ]+$', tag):
+ rtags.append(tag)
+ return rtags
+
+ def latesttags(self, tags, number):
+ """Get the most recent tags in a given list.
+
+ :param tags: a list of tags
+ :param number: the number to return
+ :returns: A list containing the most recent tags in the provided
+ list, up to the maximum number given.
+ """
raise VCSException('latesttags not supported for this vcs type')
# Get current commit reference (hash, revision, etc)
@@ -490,7 +561,7 @@ class vcs_git(vcs):
# fdroidserver) and then we'll proceed to destroy it! This is called as
# a safety check.
def checkrepo(self):
- p = SilentPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local)
+ p = FDroidPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local, output=False)
result = p.output.rstrip()
if not result.endswith(self.local):
raise VCSException('Repository mismatch')
@@ -506,12 +577,14 @@ class vcs_git(vcs):
else:
self.checkrepo()
# Discard any working tree changes
- p = SilentPopen(['git', 'reset', '--hard'], cwd=self.local)
+ p = FDroidPopen(['git', 'submodule', 'foreach', '--recursive',
+ 'git', 'reset', '--hard'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git reset failed", p.output)
# Remove untracked files now, in case they're tracked in the target
# revision (it happens!)
- p = SilentPopen(['git', 'clean', '-dffx'], cwd=self.local)
+ p = FDroidPopen(['git', 'submodule', 'foreach', '--recursive',
+ 'git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
if not self.refreshed:
@@ -519,28 +592,28 @@ class vcs_git(vcs):
p = FDroidPopen(['git', 'fetch', 'origin'], cwd=self.local)
if p.returncode != 0:
raise VCSException("Git fetch failed", p.output)
- p = SilentPopen(['git', 'fetch', '--prune', '--tags', 'origin'], cwd=self.local)
+ p = FDroidPopen(['git', 'fetch', '--prune', '--tags', 'origin'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git fetch failed", p.output)
# Recreate origin/HEAD as git clone would do it, in case it disappeared
- p = SilentPopen(['git', 'remote', 'set-head', 'origin', '--auto'], cwd=self.local)
+ p = FDroidPopen(['git', 'remote', 'set-head', 'origin', '--auto'], cwd=self.local, output=False)
if p.returncode != 0:
lines = p.output.splitlines()
if 'Multiple remote HEAD branches' not in lines[0]:
raise VCSException("Git remote set-head failed", p.output)
branch = lines[1].split(' ')[-1]
- p2 = SilentPopen(['git', 'remote', 'set-head', 'origin', branch], cwd=self.local)
+ p2 = FDroidPopen(['git', 'remote', 'set-head', 'origin', branch], cwd=self.local, output=False)
if p2.returncode != 0:
raise VCSException("Git remote set-head failed", p.output + '\n' + p2.output)
self.refreshed = True
# origin/HEAD is the HEAD of the remote, e.g. the "default branch" on
# a github repo. Most of the time this is the same as origin/master.
rev = rev or 'origin/HEAD'
- p = SilentPopen(['git', 'checkout', '-f', rev], cwd=self.local)
+ p = FDroidPopen(['git', 'checkout', '-f', rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git checkout of '%s' failed" % rev, p.output)
# Get rid of any uncontrolled files left behind
- p = SilentPopen(['git', 'clean', '-dffx'], cwd=self.local)
+ p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
@@ -559,32 +632,33 @@ class vcs_git(vcs):
line = line.replace('git@github.com:', 'https://github.com/')
f.write(line)
- for cmd in [
- ['git', 'reset', '--hard'],
- ['git', 'clean', '-dffx'],
- ]:
- p = SilentPopen(['git', 'submodule', 'foreach', '--recursive'] + cmd, cwd=self.local)
- if p.returncode != 0:
- raise VCSException("Git submodule reset failed", p.output)
- p = SilentPopen(['git', 'submodule', 'sync'], cwd=self.local)
+ p = FDroidPopen(['git', 'submodule', 'sync'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git submodule sync failed", p.output)
p = FDroidPopen(['git', 'submodule', 'update', '--init', '--force', '--recursive'], cwd=self.local)
if p.returncode != 0:
raise VCSException("Git submodule update failed", p.output)
- def gettags(self):
+ def _gettags(self):
self.checkrepo()
- p = SilentPopen(['git', 'tag'], cwd=self.local)
+ p = FDroidPopen(['git', 'tag'], cwd=self.local, output=False)
return p.output.splitlines()
- def latesttags(self, alltags, number):
+ def latesttags(self, tags, number):
self.checkrepo()
- p = SilentPopen(['echo "' + '\n'.join(alltags) + '" | '
- + 'xargs -I@ git log --format=format:"%at @%n" -1 @ | '
- + 'sort -n | awk \'{print $2}\''],
- cwd=self.local, shell=True)
- return p.output.splitlines()[-number:]
+ tl = []
+ for tag in tags:
+ p = FDroidPopen(
+ ['git', 'show', '--format=format:%ct', '-s', tag],
+ cwd=self.local, output=False)
+ # Timestamp is on the last line. For a normal tag, it's the only
+ # line, but for annotated tags, the rest of the info precedes it.
+ ts = int(p.output.splitlines()[-1])
+ tl.append((ts, tag))
+ latest = []
+ for _, t in sorted(tl)[-number:]:
+ latest.append(t)
+ return latest
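The rewritten latesttags() replaces the old shell pipeline (`xargs | sort | awk`) with an in-Python sort of (timestamp, tag) pairs. A standalone sketch of that sort, with the `git show --format=format:%ct` timestamp lookup abstracted into a caller-supplied function (that abstraction is for testability here, not part of the diff):

```python
def latest_tags(tags, number, timestamp_of):
    """Return the `number` most recent tags, oldest first.

    timestamp_of(tag) -> unix time stands in for running
    `git show --format=format:%ct -s <tag>` per tag, as the diff does.
    """
    tl = [(timestamp_of(tag), tag) for tag in tags]
    return [t for _, t in sorted(tl)[-number:]]
```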
class vcs_gitsvn(vcs):
@@ -592,19 +666,12 @@ class vcs_gitsvn(vcs):
def repotype(self):
return 'git-svn'
- # Damn git-svn tries to use a graphical password prompt, so we have to
- # trick it into taking the password from stdin
- def userargs(self):
- if self.username is None:
- return ('', '')
- return ('echo "%s" | DISPLAY="" ' % self.password, ' --username "%s"' % self.username)
-
# If the local directory exists, but is somehow not a git repository, git
# will traverse up the directory tree until it finds one that is (i.e.
 # fdroidserver) and then we'll proceed to destroy it! This is called as
# a safety check.
def checkrepo(self):
- p = SilentPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local)
+ p = FDroidPopen(['git', 'rev-parse', '--show-toplevel'], cwd=self.local, output=False)
result = p.output.rstrip()
if not result.endswith(self.local):
raise VCSException('Repository mismatch')
@@ -612,22 +679,24 @@ class vcs_gitsvn(vcs):
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
# Brand new checkout
- gitsvn_cmd = '%sgit svn clone%s' % self.userargs()
+ gitsvn_args = ['git', 'svn', 'clone']
if ';' in self.remote:
remote_split = self.remote.split(';')
for i in remote_split[1:]:
if i.startswith('trunk='):
- gitsvn_cmd += ' -T %s' % i[6:]
+ gitsvn_args.extend(['-T', i[6:]])
elif i.startswith('tags='):
- gitsvn_cmd += ' -t %s' % i[5:]
+ gitsvn_args.extend(['-t', i[5:]])
elif i.startswith('branches='):
- gitsvn_cmd += ' -b %s' % i[9:]
- p = SilentPopen([gitsvn_cmd + " %s %s" % (remote_split[0], self.local)], shell=True)
+ gitsvn_args.extend(['-b', i[9:]])
+ gitsvn_args.extend([remote_split[0], self.local])
+ p = FDroidPopen(gitsvn_args, output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Git svn clone failed", p.output)
else:
- p = SilentPopen([gitsvn_cmd + " %s %s" % (self.remote, self.local)], shell=True)
+ gitsvn_args.extend([self.remote, self.local])
+ p = FDroidPopen(gitsvn_args, output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Git svn clone failed", p.output)
@@ -635,20 +704,20 @@ class vcs_gitsvn(vcs):
else:
self.checkrepo()
# Discard any working tree changes
- p = SilentPopen(['git', 'reset', '--hard'], cwd=self.local)
+ p = FDroidPopen(['git', 'reset', '--hard'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git reset failed", p.output)
# Remove untracked files now, in case they're tracked in the target
# revision (it happens!)
- p = SilentPopen(['git', 'clean', '-dffx'], cwd=self.local)
+ p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
if not self.refreshed:
# Get new commits, branches and tags from repo
- p = SilentPopen(['%sgit svn fetch %s' % self.userargs()], cwd=self.local, shell=True)
+ p = FDroidPopen(['git', 'svn', 'fetch'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git svn fetch failed")
- p = SilentPopen(['%sgit svn rebase %s' % self.userargs()], cwd=self.local, shell=True)
+ p = FDroidPopen(['git', 'svn', 'rebase'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git svn rebase failed", p.output)
self.refreshed = True
@@ -658,8 +727,7 @@ class vcs_gitsvn(vcs):
nospaces_rev = rev.replace(' ', '%20')
# Try finding a svn tag
for treeish in ['origin/', '']:
- p = SilentPopen(['git', 'checkout', treeish + 'tags/' + nospaces_rev],
- cwd=self.local)
+ p = FDroidPopen(['git', 'checkout', treeish + 'tags/' + nospaces_rev], cwd=self.local, output=False)
if p.returncode == 0:
break
if p.returncode != 0:
@@ -680,8 +748,7 @@ class vcs_gitsvn(vcs):
svn_rev = svn_rev if svn_rev[0] == 'r' else 'r' + svn_rev
- p = SilentPopen(['git', 'svn', 'find-rev', '--before', svn_rev, treeish],
- cwd=self.local)
+ p = FDroidPopen(['git', 'svn', 'find-rev', '--before', svn_rev, treeish], cwd=self.local, output=False)
git_rev = p.output.rstrip()
if p.returncode == 0 and git_rev:
@@ -689,21 +756,21 @@ class vcs_gitsvn(vcs):
if p.returncode != 0 or not git_rev:
# Try a plain git checkout as a last resort
- p = SilentPopen(['git', 'checkout', rev], cwd=self.local)
+ p = FDroidPopen(['git', 'checkout', rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("No git treeish found and direct git checkout of '%s' failed" % rev, p.output)
else:
# Check out the git rev equivalent to the svn rev
- p = SilentPopen(['git', 'checkout', git_rev], cwd=self.local)
+ p = FDroidPopen(['git', 'checkout', git_rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git checkout of '%s' failed" % rev, p.output)
# Get rid of any uncontrolled files left behind
- p = SilentPopen(['git', 'clean', '-dffx'], cwd=self.local)
+ p = FDroidPopen(['git', 'clean', '-dffx'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Git clean failed", p.output)
- def gettags(self):
+ def _gettags(self):
self.checkrepo()
for treeish in ['origin/', '']:
d = os.path.join(self.local, '.git', 'svn', 'refs', 'remotes', treeish, 'tags')
@@ -712,7 +779,7 @@ class vcs_gitsvn(vcs):
def getref(self):
self.checkrepo()
- p = SilentPopen(['git', 'svn', 'find-rev', 'HEAD'], cwd=self.local)
+ p = FDroidPopen(['git', 'svn', 'find-rev', 'HEAD'], cwd=self.local, output=False)
if p.returncode != 0:
return None
return p.output.strip()
@@ -725,16 +792,20 @@ class vcs_hg(vcs):
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
- p = SilentPopen(['hg', 'clone', self.remote, self.local])
+ p = FDroidPopen(['hg', 'clone', self.remote, self.local], output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Hg clone failed", p.output)
else:
- p = SilentPopen(['hg status -uS | xargs rm -rf'], cwd=self.local, shell=True)
+ p = FDroidPopen(['hg', 'status', '-uS'], cwd=self.local, output=False)
if p.returncode != 0:
- raise VCSException("Hg clean failed", p.output)
+ raise VCSException("Hg status failed", p.output)
+ for line in p.output.splitlines():
+ if not line.startswith('? '):
+ raise VCSException("Unexpected output from hg status -uS: " + line)
+ FDroidPopen(['rm', '-rf', line[2:]], cwd=self.local, output=False)
if not self.refreshed:
- p = SilentPopen(['hg', 'pull'], cwd=self.local)
+ p = FDroidPopen(['hg', 'pull'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Hg pull failed", p.output)
self.refreshed = True
@@ -742,22 +813,22 @@ class vcs_hg(vcs):
rev = rev or 'default'
if not rev:
return
- p = SilentPopen(['hg', 'update', '-C', rev], cwd=self.local)
+ p = FDroidPopen(['hg', 'update', '-C', rev], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Hg checkout of '%s' failed" % rev, p.output)
- p = SilentPopen(['hg', 'purge', '--all'], cwd=self.local)
+ p = FDroidPopen(['hg', 'purge', '--all'], cwd=self.local, output=False)
# Also delete untracked files, we have to enable purge extension for that:
if "'purge' is provided by the following extension" in p.output:
with open(os.path.join(self.local, '.hg', 'hgrc'), "a") as myfile:
myfile.write("\n[extensions]\nhgext.purge=\n")
- p = SilentPopen(['hg', 'purge', '--all'], cwd=self.local)
+ p = FDroidPopen(['hg', 'purge', '--all'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("HG purge failed", p.output)
elif p.returncode != 0:
raise VCSException("HG purge failed", p.output)
- def gettags(self):
- p = SilentPopen(['hg', 'tags', '-q'], cwd=self.local)
+ def _gettags(self):
+ p = FDroidPopen(['hg', 'tags', '-q'], cwd=self.local, output=False)
return p.output.splitlines()[1:]
@@ -768,64 +839,72 @@ class vcs_bzr(vcs):
def gotorevisionx(self, rev):
if not os.path.exists(self.local):
- p = SilentPopen(['bzr', 'branch', self.remote, self.local])
+ p = FDroidPopen(['bzr', 'branch', self.remote, self.local], output=False)
if p.returncode != 0:
self.clone_failed = True
raise VCSException("Bzr branch failed", p.output)
else:
- p = SilentPopen(['bzr', 'clean-tree', '--force', '--unknown', '--ignored'], cwd=self.local)
+ p = FDroidPopen(['bzr', 'clean-tree', '--force', '--unknown', '--ignored'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Bzr revert failed", p.output)
if not self.refreshed:
- p = SilentPopen(['bzr', 'pull'], cwd=self.local)
+ p = FDroidPopen(['bzr', 'pull'], cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Bzr update failed", p.output)
self.refreshed = True
revargs = list(['-r', rev] if rev else [])
- p = SilentPopen(['bzr', 'revert'] + revargs, cwd=self.local)
+ p = FDroidPopen(['bzr', 'revert'] + revargs, cwd=self.local, output=False)
if p.returncode != 0:
raise VCSException("Bzr revert of '%s' failed" % rev, p.output)
- def gettags(self):
- p = SilentPopen(['bzr', 'tags'], cwd=self.local)
+ def _gettags(self):
+ p = FDroidPopen(['bzr', 'tags'], cwd=self.local, output=False)
return [tag.split(' ')[0].strip() for tag in
p.output.splitlines()]
-def retrieve_string(app_dir, string, xmlfiles=None):
+def unescape_string(string):
+ if string[0] == '"' and string[-1] == '"':
+ return string[1:-1]
+
+ return string.replace("\\'", "'")
+
- res_dirs = [
- os.path.join(app_dir, 'res'),
- os.path.join(app_dir, 'src', 'main'),
- ]
+def retrieve_string(app_dir, string, xmlfiles=None):
if xmlfiles is None:
xmlfiles = []
- for res_dir in res_dirs:
+ for res_dir in [
+ os.path.join(app_dir, 'res'),
+ os.path.join(app_dir, 'src', 'main', 'res'),
+ ]:
for r, d, f in os.walk(res_dir):
if os.path.basename(r) == 'values':
xmlfiles += [os.path.join(r, x) for x in f if x.endswith('.xml')]
- string_search = None
- if string.startswith('@string/'):
- string_search = re.compile(r'.*name="' + string[8:] + '".*?>"?([^<]+?)"?<.*').search
- elif string.startswith('&') and string.endswith(';'):
- string_search = re.compile(r'.*').search
-
- if string_search is not None:
- for xmlfile in xmlfiles:
- for line in file(xmlfile):
- matches = string_search(line)
- if matches:
- return retrieve_string(app_dir, matches.group(1), xmlfiles)
- return None
+ if not string.startswith('@string/'):
+ return unescape_string(string)
- return string.replace("\\'", "'")
+ name = string[len('@string/'):]
+
+ for path in xmlfiles:
+ if not os.path.isfile(path):
+ continue
+ xml = parse_xml(path)
+ element = xml.find('string[@name="' + name + '"]')
+ if element is not None and element.text is not None:
+ return retrieve_string(app_dir, element.text.encode('utf-8'), xmlfiles)
+
+ return ''
+
+
+def retrieve_string_singleline(app_dir, string, xmlfiles=None):
+ return retrieve_string(app_dir, string, xmlfiles).replace('\n', ' ').strip()
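The rewritten retrieve_string() resolves `@string/` references with ElementTree's path syntax instead of line-by-line regexes. A minimal sketch of that lookup against an in-memory strings.xml (sample XML is illustrative):

```python
import xml.etree.ElementTree as ET

STRINGS_XML = """<resources>
    <string name="app_name">F-Droid</string>
    <string name="summary">App store</string>
</resources>"""

def lookup_string(xml_text, name):
    # Same ElementTree path query the new retrieve_string() uses:
    # find('string[@name="..."]') on a parsed values/*.xml file.
    root = ET.fromstring(xml_text)
    element = root.find('string[@name="%s"]' % name)
    if element is not None and element.text is not None:
        return element.text
    return ''
```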
# Return list of existing files that will be used to find the highest vercode
-def manifest_paths(app_dir, flavour):
+def manifest_paths(app_dir, flavours):
possible_manifests = \
[os.path.join(app_dir, 'AndroidManifest.xml'),
@@ -833,7 +912,9 @@ def manifest_paths(app_dir, flavour):
os.path.join(app_dir, 'src', 'AndroidManifest.xml'),
os.path.join(app_dir, 'build.gradle')]
- if flavour:
+ for flavour in flavours:
+ if flavour == 'yes':
+ continue
possible_manifests.append(
os.path.join(app_dir, 'src', flavour, 'AndroidManifest.xml'))
@@ -841,39 +922,21 @@ def manifest_paths(app_dir, flavour):
# Retrieve the package name. Returns the name, or None if not found.
-def fetch_real_name(app_dir, flavour):
- app_search = re.compile(r'.* %s" % ' '.join(commands))
result = PopenResult()
- p = subprocess.Popen(commands, cwd=cwd, shell=shell, env=env,
- stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ p = None
+ try:
+ p = subprocess.Popen(commands, cwd=cwd, shell=False, env=env,
+ stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
+ except OSError, e:
+ raise BuildException("OSError while trying to execute " +
+ ' '.join(commands) + ': ' + str(e))
stdout_queue = Queue.Queue()
stdout_reader = AsynchronousFileReader(p.stdout, stdout_queue)
@@ -1684,8 +1823,9 @@ def remove_signing_keys(build_dir):
re.compile(r'^[\t ]*signingConfig [^ ]*$'),
re.compile(r'.*android\.signingConfigs\.[^{]*$'),
re.compile(r'.*variant\.outputFile = .*'),
+ re.compile(r'.*output\.outputFile = .*'),
re.compile(r'.*\.readLine\(.*'),
- ]
+ ]
for root, dirs, files in os.walk(build_dir):
if 'build.gradle' in files:
path = os.path.join(root, 'build.gradle')
@@ -1696,8 +1836,15 @@ def remove_signing_keys(build_dir):
changed = False
opened = 0
+ i = 0
with open(path, "w") as o:
- for line in lines:
+ while i < len(lines):
+ line = lines[i]
+ i += 1
+ while line.endswith('\\\n'):
+ line = line.rstrip('\\\n') + lines[i]
+ i += 1
+
if comment.match(line):
continue
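The new while loop in remove_signing_keys() folds backslash-continued lines into one logical line before matching regexes against it. That joining step, extracted as a standalone sketch:

```python
def join_continuations(lines):
    """Fold backslash-continued lines into single logical lines, mirroring
    the loop remove_signing_keys() now runs before its regex matching."""
    out = []
    i = 0
    while i < len(lines):
        line = lines[i]
        i += 1
        while line.endswith('\\\n') and i < len(lines):
            # rstrip('\\\n') drops the trailing backslash and newline
            line = line.rstrip('\\\n') + lines[i]
            i += 1
        out.append(line)
    return out
```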
@@ -1725,8 +1872,7 @@ def remove_signing_keys(build_dir):
'project.properties',
'build.properties',
'default.properties',
- 'ant.properties',
- ]:
+ 'ant.properties', ]:
if propfile in files:
path = os.path.join(root, propfile)
@@ -1747,10 +1893,30 @@ def remove_signing_keys(build_dir):
logging.info("Cleaned %s of keysigning configs at %s" % (propfile, path))
-def replace_config_vars(cmd):
+def reset_env_path():
+ global env, orig_path
+ env['PATH'] = orig_path
+
+
+def add_to_env_path(path):
+ global env
+ paths = env['PATH'].split(os.pathsep)
+ if path in paths:
+ return
+ paths.append(path)
+ env['PATH'] = os.pathsep.join(paths)
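add_to_env_path() appends a directory to PATH only if it is not already present. A pure version of that dedup check, taking the PATH string explicitly instead of mutating the module-global env:

```python
import os

def add_to_path(path_value, new_dir):
    """Append new_dir to a PATH-style string unless it is already listed,
    the same check add_to_env_path() performs on env['PATH']."""
    paths = path_value.split(os.pathsep)
    if new_dir in paths:
        return path_value
    paths.append(new_dir)
    return os.pathsep.join(paths)
```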
+
+
+def replace_config_vars(cmd, build):
+ global env
cmd = cmd.replace('$$SDK$$', config['sdk_path'])
- cmd = cmd.replace('$$NDK$$', config['ndk_path'])
+ # env['ANDROID_NDK'] is set in build_local right before prepare_source
+ cmd = cmd.replace('$$NDK$$', env['ANDROID_NDK'])
cmd = cmd.replace('$$MVN3$$', config['mvn3'])
+ if build is not None:
+ cmd = cmd.replace('$$COMMIT$$', build['commit'])
+ cmd = cmd.replace('$$VERSION$$', build['version'])
+ cmd = cmd.replace('$$VERCODE$$', build['vercode'])
return cmd
@@ -1775,3 +1941,197 @@ def place_srclib(root_dir, number, libpath):
o.write(line)
if not placed:
o.write('android.library.reference.%d=%s\n' % (number, relpath))
+
+
+def verify_apks(signed_apk, unsigned_apk, tmp_dir):
+ """Verify that two apks are the same
+
+ One of the inputs is signed, the other is unsigned. The signature metadata
+ is transferred from the signed to the unsigned apk, and then jarsigner is
+ used to verify that the signature from the signed apk is also valid for
+ the unsigned one.
+ :param signed_apk: Path to a signed apk file
+ :param unsigned_apk: Path to an unsigned apk file expected to match it
+ :param tmp_dir: Path to directory for temporary files
+ :returns: None if the verification is successful, otherwise a string
+ describing what went wrong.
+ """
+ sigfile = re.compile(r'META-INF/[0-9A-Za-z]+\.(SF|RSA)')
+ with ZipFile(signed_apk) as signed_apk_as_zip:
+ meta_inf_files = ['META-INF/MANIFEST.MF']
+ for f in signed_apk_as_zip.namelist():
+ if sigfile.match(f):
+ meta_inf_files.append(f)
+ if len(meta_inf_files) < 3:
+ return "Signature files missing from {0}".format(signed_apk)
+ signed_apk_as_zip.extractall(tmp_dir, meta_inf_files)
+ with ZipFile(unsigned_apk, mode='a') as unsigned_apk_as_zip:
+ for meta_inf_file in meta_inf_files:
+ unsigned_apk_as_zip.write(os.path.join(tmp_dir, meta_inf_file), arcname=meta_inf_file)
+
+ if subprocess.call(['jarsigner', '-verify', unsigned_apk]) != 0:
+ logging.info("...NOT verified - {0}".format(signed_apk))
+ return compare_apks(signed_apk, unsigned_apk, tmp_dir)
+ logging.info("...successfully verified")
+ return None
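verify_apks() selects the signing metadata to transfer by filtering the zip namelist: the manifest plus every `.SF`/`.RSA` entry under META-INF/. That selection step, isolated into a sketch that takes a plain namelist:

```python
import re

SIGFILE = re.compile(r'META-INF/[0-9A-Za-z]+\.(SF|RSA)')

def signature_members(namelist):
    """Pick out the signing metadata verify_apks() copies between apks:
    the manifest plus any .SF/.RSA entries under META-INF/."""
    members = ['META-INF/MANIFEST.MF']
    members += [f for f in namelist if SIGFILE.match(f)]
    return members
```

With a real apk, `namelist` would come from `ZipFile(signed_apk).namelist()` as in the diff.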
+
+
+def compare_apks(apk1, apk2, tmp_dir):
+ """Compare two apks
+
+ Returns None if the apk content is the same (apart from the signing key),
+ otherwise a string describing what's different, or what went wrong when
+ trying to do the comparison.
+ """
+
+ badchars = re.compile('''[/ :;'"]''')
+ apk1dir = os.path.join(tmp_dir, badchars.sub('_', apk1[0:-4])) # trim .apk
+ apk2dir = os.path.join(tmp_dir, badchars.sub('_', apk2[0:-4])) # trim .apk
+ for d in [apk1dir, apk2dir]:
+ if os.path.exists(d):
+ shutil.rmtree(d)
+ os.mkdir(d)
+ os.mkdir(os.path.join(d, 'jar-xf'))
+
+ if subprocess.call(['jar', 'xf',
+ os.path.abspath(apk1)],
+ cwd=os.path.join(apk1dir, 'jar-xf')) != 0:
+ return("Failed to unpack " + apk1)
+ if subprocess.call(['jar', 'xf',
+ os.path.abspath(apk2)],
+ cwd=os.path.join(apk2dir, 'jar-xf')) != 0:
+ return("Failed to unpack " + apk2)
+
+ # try to find apktool in the path, if it hasn't been manually configed
+ if 'apktool' not in config:
+ tmp = find_command('apktool')
+ if tmp is not None:
+ config['apktool'] = tmp
+ if 'apktool' in config:
+ if subprocess.call([config['apktool'], 'd', os.path.abspath(apk1), '--output', 'apktool'],
+ cwd=apk1dir) != 0:
+ return("Failed to unpack " + apk1)
+ if subprocess.call([config['apktool'], 'd', os.path.abspath(apk2), '--output', 'apktool'],
+ cwd=apk2dir) != 0:
+ return("Failed to unpack " + apk2)
+
+ p = FDroidPopen(['diff', '-r', apk1dir, apk2dir], output=False)
+ lines = p.output.splitlines()
+ if len(lines) != 1 or 'META-INF' not in lines[0]:
+ meld = find_command('meld')
+ if meld is not None:
+ p = FDroidPopen([meld, apk1dir, apk2dir], output=False)
+ return("Unexpected diff output - " + p.output)
+
+ # since everything verifies, delete the comparison to keep cruft down
+ shutil.rmtree(apk1dir)
+ shutil.rmtree(apk2dir)
+
+ # If we get here, it seems like they're the same!
+ return None
+
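The temporary directory names built at the top of `compare_apks` can be illustrated in isolation. This is a hypothetical standalone helper, mirroring the two lines that construct `apk1dir` and `apk2dir`:

```python
import re

# characters that are unsafe or awkward in a directory name
badchars = re.compile('''[/ :;'"]''')

def comparison_dirname(apk_path):
    # trim the ".apk" suffix, then replace each bad character with "_",
    # exactly as compare_apks does when naming its temp directories
    return badchars.sub('_', apk_path[:-4])
```

Each apk thus gets a stable, filesystem-safe directory under `tmp_dir`, so repeated comparisons of the same file reuse (after cleanup) the same location.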
+
+def find_command(command):
+ '''find the full path of a command, or None if it can't be found in the PATH'''
+
+ def is_exe(fpath):
+ return os.path.isfile(fpath) and os.access(fpath, os.X_OK)
+
+ fpath, fname = os.path.split(command)
+ if fpath:
+ if is_exe(command):
+ return command
+ else:
+ for path in os.environ["PATH"].split(os.pathsep):
+ path = path.strip('"')
+ exe_file = os.path.join(path, command)
+ if is_exe(exe_file):
+ return exe_file
+
+ return None
+
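The PATH walk above behaves much like `shutil.which`; a quick usage sketch (assuming a standard Unix layout with `ls` somewhere on the PATH):

```python
import os

def find_command(command):
    '''find the full path of a command, or None if not found in PATH'''
    def is_exe(fpath):
        return os.path.isfile(fpath) and os.access(fpath, os.X_OK)

    fpath, fname = os.path.split(command)
    if fpath:
        # an explicit path was given: just check it directly
        if is_exe(command):
            return command
    else:
        # bare command name: check each PATH entry in order
        for path in os.environ["PATH"].split(os.pathsep):
            exe_file = os.path.join(path.strip('"'), command)
            if is_exe(exe_file):
                return exe_file
    return None
```

Returning `None` rather than raising lets callers such as the apktool lookup treat a missing tool as an optional feature.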
+
+def genpassword():
+ '''generate a random password for when generating keys'''
+ h = hashlib.sha256()
+ h.update(os.urandom(16)) # salt
+ h.update(bytes(socket.getfqdn()))
+ return h.digest().encode('base64').strip()
+
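`h.digest().encode('base64')` is a Python 2 idiom; under Python 3 the same helper would go through the `base64` module. A sketch of the equivalent, not part of the patch:

```python
import base64
import hashlib
import os
import socket

def genpassword():
    # hash 16 random bytes (the salt) plus the host's FQDN, then
    # base64-encode the 32-byte SHA-256 digest
    h = hashlib.sha256()
    h.update(os.urandom(16))
    h.update(socket.getfqdn().encode('utf-8'))
    return base64.b64encode(h.digest()).decode('ascii')
```

The fresh `os.urandom` salt guarantees a different password on every call, even on the same host.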
+
+def genkeystore(localconfig):
+ '''Generate a new key with random passwords and add it to new keystore'''
+ logging.info('Generating a new key in "' + localconfig['keystore'] + '"...')
+ keystoredir = os.path.dirname(localconfig['keystore'])
+ if keystoredir is None or keystoredir == '':
+ keystoredir = os.path.join(os.getcwd(), keystoredir)
+ if not os.path.exists(keystoredir):
+ os.makedirs(keystoredir, mode=0o700)
+
+ write_password_file("keystorepass", localconfig['keystorepass'])
+ write_password_file("keypass", localconfig['keypass'])
+ p = FDroidPopen(['keytool', '-genkey',
+ '-keystore', localconfig['keystore'],
+ '-alias', localconfig['repo_keyalias'],
+ '-keyalg', 'RSA', '-keysize', '4096',
+ '-sigalg', 'SHA256withRSA',
+ '-validity', '10000',
+ '-storepass:file', config['keystorepassfile'],
+ '-keypass:file', config['keypassfile'],
+ '-dname', localconfig['keydname']])
+ # TODO keypass should be sent via stdin
+ if p.returncode != 0:
+ raise BuildException("Failed to generate key", p.output)
+ os.chmod(localconfig['keystore'], 0o0600)
+ # now show the lovely key that was just generated
+ p = FDroidPopen(['keytool', '-list', '-v',
+ '-keystore', localconfig['keystore'],
+ '-alias', localconfig['repo_keyalias'],
+ '-storepass:file', config['keystorepassfile']])
+ logging.info(p.output.strip() + '\n\n')
+
+
+def write_to_config(thisconfig, key, value=None):
+ '''write a key/value to the local config.py'''
+ if value is None:
+ origkey = key + '_orig'
+ value = thisconfig[origkey] if origkey in thisconfig else thisconfig[key]
+ with open('config.py', 'r') as f:
+ data = f.read()
+ pattern = '\n[\s#]*' + key + '\s*=\s*"[^"]*"'
+ repl = '\n' + key + ' = "' + value + '"'
+ data = re.sub(pattern, repl, data)
+ # if this key is not in the file, append it
+ if not re.search('(?m)^[\s#]*' + key + '\s*=\s*"', data):
+ data += repl
+ # make sure the file ends with a newline
+ if not data.endswith('\n'):
+ data += '\n'
+ with open('config.py', 'w') as f:
+ f.writelines(data)
+
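The rewrite-or-append logic of `write_to_config` can be exercised on a plain string, without touching the filesystem. A sketch with a hypothetical helper name; note that `re.match` only anchors at the start of the string, so this version uses `re.search` with `MULTILINE` for the "is the key present?" test:

```python
import re

def set_config_key(data, key, value):
    # replace an existing (possibly commented-out) assignment,
    # or append a new one at the end of the file
    pattern = r'\n[\s#]*' + re.escape(key) + r'\s*=\s*"[^"]*"'
    repl = '\n' + key + ' = "' + value + '"'
    data = re.sub(pattern, repl, data)
    if not re.search(r'^\s*' + re.escape(key) + r'\s*=\s*"', data, re.MULTILINE):
        data += repl
    if not data.endswith('\n'):
        data += '\n'  # keep a trailing newline on the file
    return data
```

Because the pattern tolerates leading `#` characters, a previously disabled setting is re-enabled in place instead of being duplicated.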
+
+def parse_xml(path):
+ return XMLElementTree.parse(path).getroot()
+
+
+def string_is_integer(string):
+ try:
+ int(string)
+ return True
+ except ValueError:
+ return False
+
+
+def download_file(url, local_filename=None, dldir='tmp'):
+ filename = url.split('/')[-1]
+ if local_filename is None:
+ local_filename = os.path.join(dldir, filename)
+ # the stream=True parameter keeps memory usage low
+ r = requests.get(url, stream=True)
+ with open(local_filename, 'wb') as f:
+ for chunk in r.iter_content(chunk_size=1024):
+ if chunk: # filter out keep-alive new chunks
+ f.write(chunk)
+ f.flush()
+ return local_filename
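`download_file` relies on the third-party `requests` library; the same chunked-streaming pattern can be sketched with only the standard library (a Python 3 sketch, not part of the patch):

```python
import os
import urllib.request

def download_file(url, local_filename=None, dldir='tmp'):
    # stream the response in 1 KiB chunks so a large download
    # never has to fit in memory at once
    if local_filename is None:
        local_filename = os.path.join(dldir, url.split('/')[-1])
    with urllib.request.urlopen(url) as r, open(local_filename, 'wb') as f:
        while True:
            chunk = r.read(1024)
            if not chunk:
                break
            f.write(chunk)
    return local_filename
```

As in the patch, the destination name defaults to the last path component of the URL, dropped into the `tmp` directory.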
diff --git a/fdroidserver/gpgsign.py b/fdroidserver/gpgsign.py
index fa874cb8..0358e99e 100644
--- a/fdroidserver/gpgsign.py
+++ b/fdroidserver/gpgsign.py
@@ -61,10 +61,13 @@ def main():
sigpath = os.path.join(output_dir, sigfilename)
if not os.path.exists(sigpath):
- p = FDroidPopen(['gpg', '-a',
- '--output', sigpath,
- '--detach-sig',
- os.path.join(output_dir, apkfilename)])
+ gpgargs = ['gpg', '-a',
+ '--output', sigpath,
+ '--detach-sig']
+ if 'gpghome' in config:
+ gpgargs.extend(['--homedir', config['gpghome']])
+ gpgargs.append(os.path.join(output_dir, apkfilename))
+ p = FDroidPopen(gpgargs)
if p.returncode != 0:
logging.error("Signing failed.")
sys.exit(1)
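The conditional argument construction introduced in this hunk can be exercised on its own. The helper name and the `.asc` signature filename below are illustrative, not from the patch, which builds the list inline:

```python
import os

def build_gpg_args(output_dir, apkfilename, config):
    # start from the fixed arguments, add --homedir only when a
    # gpghome is configured, and append the file to sign last
    gpgargs = ['gpg', '-a',
               '--output', os.path.join(output_dir, apkfilename + '.asc'),
               '--detach-sig']
    if 'gpghome' in config:
        gpgargs.extend(['--homedir', config['gpghome']])
    gpgargs.append(os.path.join(output_dir, apkfilename))
    return gpgargs
```

Appending the input file after the optional flags keeps `gpg`'s positional argument in the expected final position whether or not `gpghome` is set.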
diff --git a/fdroidserver/import.py b/fdroidserver/import.py
index be9fe12f..27a93ca6 100644
--- a/fdroidserver/import.py
+++ b/fdroidserver/import.py
@@ -40,7 +40,7 @@ def getrepofrompage(url):
return (None, 'Unable to get ' + url + ' - return code ' + str(req.getcode()))
page = req.read()
- # Works for Google Code and BitBucket...
+ # Works for BitBucket
index = page.find('hg clone')
if index != -1:
repotype = 'hg'
@@ -52,7 +52,7 @@ def getrepofrompage(url):
repo = repo.split('"')[0]
return (repotype, repo)
- # Works for Google Code and BitBucket...
+ # Works for BitBucket
index = page.find('git clone')
if index != -1:
repotype = 'git'
@@ -64,26 +64,6 @@ def getrepofrompage(url):
repo = repo.split('"')[0]
return (repotype, repo)
- # Google Code only...
- index = page.find('svn checkout')
- if index != -1:
- repotype = 'git-svn'
- repo = page[index + 13:]
- prefix = 'http'
- if not repo.startswith(prefix):
- return (None, "Unexpected checkout instructions format")
- repo = 'http' + repo[len(prefix):]
- index = repo.find('<')
- if index == -1:
- return (None, "Error while getting repo address - no end tag? '" + repo + "'")
- repo = repo[:index]
- index = repo.find(' ')
- if index == -1:
- return (None, "Error while getting repo address - no space? '" + repo + "'")
- repo = repo[:index]
- repo = repo.split('"')[0]
- return (repotype, repo)
-
return (None, "No information found." + page)
config = None
@@ -104,8 +84,6 @@ def main():
help="Project URL to import from.")
parser.add_option("-s", "--subdir", default=None,
help="Path to main android project subdirectory, if not in root.")
- parser.add_option("-r", "--repo", default=None,
- help="Allows a different repo to be specified for a multi-repo google code project")
parser.add_option("--rev", default=None,
help="Allows a different revision (or git branch) to be specified for the initial import")
(options, args) = parser.parse_args()
@@ -142,17 +120,13 @@ def main():
repotype = 'git'
sourcecode = url
issuetracker = url + '/issues'
+ website = ""
elif url.startswith('https://gitlab.com/'):
projecttype = 'gitlab'
repo = url
repotype = 'git'
- sourcecode = url
+ sourcecode = url + '/tree/HEAD'
issuetracker = url + '/issues'
- elif url.startswith('https://gitorious.org/'):
- projecttype = 'gitorious'
- repo = 'https://git.gitorious.org/' + url[22:] + '.git'
- repotype = 'git'
- sourcecode = url
elif url.startswith('https://bitbucket.org/'):
if url.endswith('/'):
url = url[:-1]
@@ -164,68 +138,22 @@ def main():
if not repotype:
logging.error("Unable to determine vcs type. " + repo)
sys.exit(1)
- elif (url.startswith('http://code.google.com/p/') or
- url.startswith('https://code.google.com/p/')):
- if not url.endswith('/'):
- url += '/'
- projecttype = 'googlecode'
- sourcecode = url + 'source/checkout'
- if options.repo:
- sourcecode += "?repo=" + options.repo
- issuetracker = url + 'issues/list'
-
- # Figure out the repo type and adddress...
- repotype, repo = getrepofrompage(sourcecode)
- if not repotype:
- logging.error("Unable to determine vcs type. " + repo)
- sys.exit(1)
-
- # Figure out the license...
- req = urllib.urlopen(url)
- if req.getcode() != 200:
- logging.error('Unable to find project page at ' + sourcecode + ' - return code ' + str(req.getcode()))
- sys.exit(1)
- page = req.read()
- index = page.find('Code license')
- if index == -1:
- logging.error("Couldn't find license data")
- sys.exit(1)
- ltext = page[index:]
- lprefix = 'rel="nofollow">'
- index = ltext.find(lprefix)
- if index == -1:
- logging.error("Couldn't find license text")
- sys.exit(1)
- ltext = ltext[index + len(lprefix):]
- index = ltext.find('<')
- if index == -1:
- logging.error("License text not formatted as expected")
- sys.exit(1)
- ltext = ltext[:index]
- if ltext == 'GNU GPL v3':
- license = 'GPLv3'
- elif ltext == 'GNU GPL v2':
- license = 'GPLv2'
- elif ltext == 'Apache License 2.0':
- license = 'Apache2'
- elif ltext == 'MIT License':
- license = 'MIT'
- elif ltext == 'GNU Lesser GPL':
- license = 'LGPL'
- elif ltext == 'Mozilla Public License 1.1':
- license = 'MPL'
- elif ltext == 'New BSD License':
- license = 'NewBSD'
- else:
- logging.error("License " + ltext + " is not recognised")
- sys.exit(1)
-
if not projecttype:
logging.error("Unable to determine the project type.")
logging.error("The URL you supplied was not in one of the supported formats. Please consult")
logging.error("the manual for a list of supported formats, and supply one of those.")
sys.exit(1)
+ # Ensure we have a sensible-looking repo address at this point. If not, we
+ # might have got a page format we weren't expecting. (Note that we
+ # specifically don't want git@...)
+ if ((repotype != 'bzr' and (not repo.startswith('http://') and
+ not repo.startswith('https://') and
+ not repo.startswith('git://'))) or
+ ' ' in repo):
+ logging.error("Repo address '{0}' does not seem to be valid".format(repo))
+ sys.exit(1)
+
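The sanity check above can also be written as a small predicate (a hypothetical function name, mirroring the condition in the patch):

```python
def repo_address_looks_valid(repotype, repo):
    # reject addresses containing spaces, and (except for bzr) require
    # an http://, https:// or git:// scheme -- notably not git@...
    if ' ' in repo:
        return False
    if repotype == 'bzr':
        return True
    return repo.startswith(('http://', 'https://', 'git://'))
```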
# Get a copy of the source so we can extract some info...
logging.info('Getting source from ' + repotype + ' repo at ' + repo)
src_dir = os.path.join(tmp_dir, 'importer')
@@ -239,7 +167,7 @@ def main():
root_dir = src_dir
# Extract some information...
- paths = common.manifest_paths(root_dir, None)
+ paths = common.manifest_paths(root_dir, [])
if paths:
version, vercode, package = common.parse_androidmanifests(paths)
diff --git a/fdroidserver/init.py b/fdroidserver/init.py
index a3eded5b..0ed66d6b 100644
--- a/fdroidserver/init.py
+++ b/fdroidserver/init.py
@@ -20,7 +20,6 @@
# along with this program. If not, see .
import glob
-import hashlib
import os
import re
import shutil
@@ -30,23 +29,11 @@ from optparse import OptionParser
import logging
import common
-from common import FDroidPopen, BuildException
config = {}
options = None
-def write_to_config(key, value):
- '''write a key/value to the local config.py'''
- with open('config.py', 'r') as f:
- data = f.read()
- pattern = '\n[\s#]*' + key + '\s*=\s*"[^"]*"'
- repl = '\n' + key + ' = "' + value + '"'
- data = re.sub(pattern, repl, data)
- with open('config.py', 'w') as f:
- f.writelines(data)
-
-
def disable_in_config(key, value):
'''write a key/value to the local config.py, then comment it out'''
with open('config.py', 'r') as f:
@@ -58,37 +45,6 @@ def disable_in_config(key, value):
f.writelines(data)
-def genpassword():
- '''generate a random password for when generating keys'''
- h = hashlib.sha256()
- h.update(os.urandom(16)) # salt
- h.update(bytes(socket.getfqdn()))
- return h.digest().encode('base64').strip()
-
-
-def genkey(keystore, repo_keyalias, password, keydname):
- '''generate a new keystore with a new key in it for signing repos'''
- logging.info('Generating a new key in "' + keystore + '"...')
- common.write_password_file("keystorepass", password)
- common.write_password_file("keypass", password)
- p = FDroidPopen(['keytool', '-genkey',
- '-keystore', keystore, '-alias', repo_keyalias,
- '-keyalg', 'RSA', '-keysize', '4096',
- '-sigalg', 'SHA256withRSA',
- '-validity', '10000',
- '-storepass:file', config['keystorepassfile'],
- '-keypass:file', config['keypassfile'],
- '-dname', keydname])
- # TODO keypass should be sent via stdin
- if p.returncode != 0:
- raise BuildException("Failed to generate key", p.output)
- # now show the lovely key that was just generated
- p = FDroidPopen(['keytool', '-list', '-v',
- '-keystore', keystore, '-alias', repo_keyalias,
- '-storepass:file', config['keystorepassfile']])
- logging.info(p.output.strip() + '\n\n')
-
-
def main():
global options, config
@@ -114,36 +70,53 @@ def main():
# find root install prefix
tmp = os.path.dirname(sys.argv[0])
if os.path.basename(tmp) == 'bin':
- prefix = os.path.dirname(tmp)
- examplesdir = prefix + '/share/doc/fdroidserver/examples'
+ prefix = None
+ egg_link = os.path.join(tmp, '..', 'local/lib/python2.7/site-packages/fdroidserver.egg-link')
+ if os.path.exists(egg_link):
+ # installed from local git repo
+ examplesdir = os.path.join(open(egg_link).readline().rstrip(), 'examples')
+ else:
+ prefix = os.path.dirname(os.path.dirname(__file__)) # use .egg layout
+ if not prefix.endswith('.egg'): # use UNIX layout
+ prefix = os.path.dirname(tmp)
+ examplesdir = prefix + '/share/doc/fdroidserver/examples'
else:
# we're running straight out of the git repo
prefix = os.path.normpath(os.path.join(os.path.dirname(__file__), '..'))
examplesdir = prefix + '/examples'
+ aapt = None
fdroiddir = os.getcwd()
- test_config = common.get_default_config()
+ test_config = dict()
+ common.fill_config_defaults(test_config)
# track down where the Android SDK is, the default is to use the path set
# in ANDROID_HOME if that exists, otherwise None
if options.android_home is not None:
test_config['sdk_path'] = options.android_home
elif not common.test_sdk_exists(test_config):
- # if neither --android-home nor the default sdk_path exist, prompt the user
- default_sdk_path = '/opt/android-sdk'
- while not options.no_prompt:
- try:
- s = raw_input('Enter the path to the Android SDK ('
- + default_sdk_path + ') here:\n> ')
- except KeyboardInterrupt:
- print('')
- sys.exit(1)
- if re.match('^\s*$', s) is not None:
- test_config['sdk_path'] = default_sdk_path
- else:
- test_config['sdk_path'] = s
- if common.test_sdk_exists(test_config):
- break
+ if os.path.isfile('/usr/bin/aapt'):
+ # remove sdk_path and build_tools, they are not required
+ test_config.pop('sdk_path', None)
+ test_config.pop('build_tools', None)
+ # make sure at least aapt is found, since this can't do anything without it
+ test_config['aapt'] = common.find_sdk_tools_cmd('aapt')
+ else:
+ # if neither --android-home nor the default sdk_path exist, prompt the user
+ default_sdk_path = '/opt/android-sdk'
+ while not options.no_prompt:
+ try:
+ s = raw_input('Enter the path to the Android SDK ('
+ + default_sdk_path + ') here:\n> ')
+ except KeyboardInterrupt:
+ print('')
+ sys.exit(1)
+ if re.match('^\s*$', s) is not None:
+ test_config['sdk_path'] = default_sdk_path
+ else:
+ test_config['sdk_path'] = s
+ if common.test_sdk_exists(test_config):
+ break
if not common.test_sdk_exists(test_config):
sys.exit(3)
@@ -154,48 +127,45 @@ def main():
shutil.copy(os.path.join(examplesdir, 'fdroid-icon.png'), fdroiddir)
shutil.copyfile(os.path.join(examplesdir, 'config.py'), 'config.py')
os.chmod('config.py', 0o0600)
- write_to_config('sdk_path', test_config['sdk_path'])
+ # If android_home is None, test_config['sdk_path'] will be used and
+ # "$ANDROID_HOME" may be used if the env var is set up correctly.
+ # If android_home is not None, the path given from the command line
+ # will be directly written in the config.
+ if 'sdk_path' in test_config:
+ common.write_to_config(test_config, 'sdk_path', options.android_home)
else:
logging.warn('Looks like this is already an F-Droid repo, cowardly refusing to overwrite it...')
logging.info('Try running `fdroid init` in an empty directory.')
sys.exit()
- # try to find a working aapt, in all the recent possible paths
- build_tools = os.path.join(test_config['sdk_path'], 'build-tools')
- aaptdirs = []
- aaptdirs.append(os.path.join(build_tools, test_config['build_tools']))
- aaptdirs.append(build_tools)
- for f in os.listdir(build_tools):
- if os.path.isdir(os.path.join(build_tools, f)):
- aaptdirs.append(os.path.join(build_tools, f))
- for d in sorted(aaptdirs, reverse=True):
- if os.path.isfile(os.path.join(d, 'aapt')):
- aapt = os.path.join(d, 'aapt')
- break
- if os.path.isfile(aapt):
- dirname = os.path.basename(os.path.dirname(aapt))
- if dirname == 'build-tools':
- # this is the old layout, before versioned build-tools
- test_config['build_tools'] = ''
- else:
- test_config['build_tools'] = dirname
- write_to_config('build_tools', test_config['build_tools'])
- if not common.test_build_tools_exists(test_config):
- sys.exit(3)
+ if 'aapt' not in test_config or not os.path.isfile(test_config['aapt']):
+ # try to find a working aapt, in all the recent possible paths
+ build_tools = os.path.join(test_config['sdk_path'], 'build-tools')
+ aaptdirs = []
+ aaptdirs.append(os.path.join(build_tools, test_config['build_tools']))
+ aaptdirs.append(build_tools)
+ for f in os.listdir(build_tools):
+ if os.path.isdir(os.path.join(build_tools, f)):
+ aaptdirs.append(os.path.join(build_tools, f))
+ for d in sorted(aaptdirs, reverse=True):
+ if os.path.isfile(os.path.join(d, 'aapt')):
+ aapt = os.path.join(d, 'aapt')
+ break
+ if os.path.isfile(aapt):
+ dirname = os.path.basename(os.path.dirname(aapt))
+ if dirname == 'build-tools':
+ # this is the old layout, before versioned build-tools
+ test_config['build_tools'] = ''
+ else:
+ test_config['build_tools'] = dirname
+ common.write_to_config(test_config, 'build_tools')
+ common.ensure_build_tools_exists(test_config)
# now that we have a local config.py, read configuration...
config = common.read_config(options)
- # track down where the Android NDK is
- ndk_path = '/opt/android-ndk'
- if os.path.isdir(config['ndk_path']):
- ndk_path = config['ndk_path']
- elif 'ANDROID_NDK' in os.environ.keys():
- logging.info('using ANDROID_NDK')
- ndk_path = os.environ['ANDROID_NDK']
- if os.path.isdir(ndk_path):
- write_to_config('ndk_path', ndk_path)
- # the NDK is optional so we don't prompt the user for it if its not found
+ # the NDK is optional and there may be multiple versions of it, so it's
+ # left for the user to configure
# find or generate the keystore for the repo signing key. First try the
# path written in the default config.py. Then check if the user has
@@ -213,21 +183,21 @@ def main():
if not os.path.exists(keystore):
logging.info('"' + keystore
+ '" does not exist, creating a new keystore there.')
- write_to_config('keystore', keystore)
+ common.write_to_config(test_config, 'keystore', keystore)
repo_keyalias = None
if options.repo_keyalias:
repo_keyalias = options.repo_keyalias
- write_to_config('repo_keyalias', repo_keyalias)
+ common.write_to_config(test_config, 'repo_keyalias', repo_keyalias)
if options.distinguished_name:
keydname = options.distinguished_name
- write_to_config('keydname', keydname)
+ common.write_to_config(test_config, 'keydname', keydname)
if keystore == 'NONE': # we're using a smartcard
- write_to_config('repo_keyalias', '1') # seems to be the default
+ common.write_to_config(test_config, 'repo_keyalias', '1') # seems to be the default
disable_in_config('keypass', 'never used with smartcard')
- write_to_config('smartcardoptions',
- ('-storetype PKCS11 -providerName SunPKCS11-OpenSC '
- + '-providerClass sun.security.pkcs11.SunPKCS11 '
- + '-providerArg opensc-fdroid.cfg'))
+ common.write_to_config(test_config, 'smartcardoptions',
+ ('-storetype PKCS11 -providerName SunPKCS11-OpenSC '
+ + '-providerClass sun.security.pkcs11.SunPKCS11 '
+ + '-providerArg opensc-fdroid.cfg'))
# find opensc-pkcs11.so
if not os.path.exists('opensc-fdroid.cfg'):
if os.path.exists('/usr/lib/opensc-pkcs11.so'):
@@ -249,26 +219,24 @@ def main():
with open('opensc-fdroid.cfg', 'w') as f:
f.write(opensc_fdroid)
elif not os.path.exists(keystore):
- # no existing or specified keystore, generate the whole thing
- keystoredir = os.path.dirname(keystore)
- if not os.path.exists(keystoredir):
- os.makedirs(keystoredir, mode=0o700)
- password = genpassword()
- write_to_config('keystorepass', password)
- write_to_config('keypass', password)
- if options.repo_keyalias is None:
- repo_keyalias = socket.getfqdn()
- write_to_config('repo_keyalias', repo_keyalias)
- if not options.distinguished_name:
- keydname = 'CN=' + repo_keyalias + ', OU=F-Droid'
- write_to_config('keydname', keydname)
- genkey(keystore, repo_keyalias, password, keydname)
+ password = common.genpassword()
+ c = dict(test_config)
+ c['keystorepass'] = password
+ c['keypass'] = password
+ c['repo_keyalias'] = socket.getfqdn()
+ c['keydname'] = 'CN=' + c['repo_keyalias'] + ', OU=F-Droid'
+ common.write_to_config(test_config, 'keystorepass', password)
+ common.write_to_config(test_config, 'keypass', password)
+ common.write_to_config(test_config, 'repo_keyalias', c['repo_keyalias'])
+ common.write_to_config(test_config, 'keydname', c['keydname'])
+ common.genkeystore(c)
logging.info('Built repo based in "' + fdroiddir + '"')
logging.info('with this config:')
logging.info(' Android SDK:\t\t\t' + config['sdk_path'])
- logging.info(' Android SDK Build Tools:\t' + os.path.dirname(aapt))
- logging.info(' Android NDK (optional):\t' + ndk_path)
+ if aapt:
+ logging.info(' Android SDK Build Tools:\t' + os.path.dirname(aapt))
+ logging.info(' Android NDK r10e (optional):\t$ANDROID_NDK')
logging.info(' Keystore for signing key:\t' + keystore)
if repo_keyalias is not None:
logging.info(' Alias for key in store:\t' + repo_keyalias)
diff --git a/fdroidserver/install.py b/fdroidserver/install.py
index f6862e5a..a5cb98ad 100644
--- a/fdroidserver/install.py
+++ b/fdroidserver/install.py
@@ -25,14 +25,14 @@ from optparse import OptionParser, OptionError
import logging
import common
-from common import FDroidPopen, FDroidException
+from common import SdkToolsPopen, FDroidException
options = None
config = None
def devices():
- p = FDroidPopen([config['adb'], "devices"])
+ p = SdkToolsPopen(['adb', "devices"])
if p.returncode != 0:
raise FDroidException("An error occured when finding devices: %s" % p.output)
lines = p.output.splitlines()
@@ -103,7 +103,7 @@ def main():
logging.info("Installing %s..." % apk)
for dev in devs:
logging.info("Installing %s on %s..." % (apk, dev))
- p = FDroidPopen([config['adb'], "-s", dev, "install", apk])
+ p = SdkToolsPopen(['adb', "-s", dev, "install", apk])
fail = ""
for line in p.output.splitlines():
if line.startswith("Failure"):
diff --git a/fdroidserver/lint.py b/fdroidserver/lint.py
index d42efdf7..d8d6a962 100644
--- a/fdroidserver/lint.py
+++ b/fdroidserver/lint.py
@@ -22,108 +22,109 @@ import re
import logging
import common
import metadata
+import sys
from collections import Counter
+from sets import Set
config = None
options = None
+
+def enforce_https(domain):
+ return (re.compile(r'.*[^sS]://[^/]*' + re.escape(domain) + r'(/.*)?'),
+ domain + " URLs should always use https://")
+
+https_enforcings = [
+ enforce_https('github.com'),
+ enforce_https('gitlab.com'),
+ enforce_https('gitorious.org'),
+ enforce_https('apache.org'),
+ enforce_https('google.com'),
+ enforce_https('svn.code.sf.net'),
+ enforce_https('googlecode.com'),
+]
+
+
+def forbid_shortener(domain):
+ return (re.compile(r'https?://[^/]*' + re.escape(domain) + r'/.*'),
+ "URL shorteners should not be used")
+
+http_url_shorteners = [
+ forbid_shortener('goo.gl'),
+ forbid_shortener('t.co'),
+ forbid_shortener('ur1.ca'),
+]
+
+http_warnings = https_enforcings + http_url_shorteners + [
+ (re.compile(r'.*github\.com/[^/]+/[^/]+\.git'),
+ "Appending .git is not necessary"),
+ (re.compile(r'(.*/blob/master/|.*raw\.github.com/[^/]*/[^/]*/master/)'),
+ "Use /HEAD/ instead of /master/ to point at a file in the default branch"),
+ # TODO enable in August 2015, when Google Code goes read-only
+ # (re.compile(r'.*://code\.google\.com/.*'),
+ # "code.google.com will be soon switching down, perhaps the project moved to github.com?"),
+]
+
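The generated patterns behave as follows (reproducing `enforce_https` from the hunk above for a standalone check):

```python
import re

def enforce_https(domain):
    # matches any URL for the domain whose scheme does not end in "s"
    # (http://, git://, svn://, ...), i.e. anything that is not https://
    return (re.compile(r'.*[^sS]://[^/]*' + re.escape(domain) + r'(/.*)?'),
            domain + " URLs should always use https://")

pattern, message = enforce_https('github.com')
```

The `[^sS]` just before `://` is what lets a single pattern flag every non-https scheme at once.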
regex_warnings = {
- 'Web Site': [
- (re.compile(r'.*[^sS]://github\.com/.*'),
- "github URLs should always use https:// not http://"),
- (re.compile(r'.*[^sS]://code\.google\.com/.*'),
- "code.google.com URLs should always use https:// not http://"),
- ],
- 'Source Code': [
- (re.compile(r'.*[^sS]://github\.com/.*'),
- "github URLs should always use https:// (not http://, git://, or git@)"),
- (re.compile(r'.*code\.google\.com/p/[^/]+[/]*$'),
- "/source is missing"),
- (re.compile(r'.*[^sS]://code\.google\.com/.*'),
- "code.google.com URLs should always use https:// not http://"),
- (re.compile(r'.*[^sS]://dl\.google\.com/.*'),
- "dl.google.com URLs should always use https:// not http://"),
- (re.compile(r'.*[^sS]://gitorious\.org/.*'),
- "gitorious URLs should always use https:// (not http://, git://, or git@)"),
- ],
- 'Repo': [
- (re.compile(r'.*[^sS]://code\.google\.com/.*'),
- "code.google.com URLs should always use https:// not http://"),
- (re.compile(r'.*[^sS]://dl\.google\.com/.*'),
- "dl.google.com URLs should always use https:// not http://"),
- (re.compile(r'.*[^sS]://github\.com/.*'),
- "github URLs should always use https:// (not http://, git://, or git@)"),
- (re.compile(r'.*[^sS]://gitorious\.org/.*'),
- "gitorious URLs should always use https:// (not http://, git://, or git@)"),
- (re.compile(r'.*[^sS]://[^.]*\.googlecode\.com/svn/?.*'),
- "Google Code SVN URLs should always use https:// (not http:// or svn://)"),
- (re.compile(r'.*[^sS]://svn\.apache\.org/repos/?.*'),
- "Apache SVN URLs should always use https:// (not http:// or svn://)"),
- (re.compile(r'.*[^sS]://svn\.code\.sf\.net/.*'),
- "Sourceforge SVN URLs should always use https:// (not http:// or svn://)"),
- ],
- 'Issue Tracker': [
- (re.compile(r'.*code\.google\.com/p/[^/]+[/]*$'),
- "/issues is missing"),
- (re.compile(r'.*[^sS]://code\.google\.com/.*'),
- "code.google.com URLs should always use https:// not http://"),
+ 'Web Site': http_warnings + [
+ ],
+ 'Source Code': http_warnings + [
+ ],
+ 'Repo': https_enforcings + [
+ ],
+ 'Issue Tracker': http_warnings + [
(re.compile(r'.*github\.com/[^/]+/[^/]+[/]*$'),
"/issues is missing"),
- (re.compile(r'.*[^sS]://github\.com/.*'),
- "github URLs should always use https:// not http://"),
- (re.compile(r'.*[^sS]://gitorious\.org/.*'),
- "gitorious URLs should always use https:// not http://"),
- ],
+ ],
+ 'Donate': http_warnings + [
+ (re.compile(r'.*flattr\.com'),
+ "Flattr donation methods belong in the FlattrID flag"),
+ ],
+ 'Changelog': http_warnings + [
+ ],
'License': [
(re.compile(r'^(|None|Unknown)$'),
"No license specified"),
- ],
+ ],
'Summary': [
(re.compile(r'^$'),
"Summary yet to be filled"),
- ],
+ (re.compile(r'.*\b(free software|open source)\b.*', re.IGNORECASE),
+ "No need to specify that the app is Free Software"),
+ (re.compile(r'.*((your|for).*android|android.*(app|device|client|port|version))', re.IGNORECASE),
+ "No need to specify that the app is for Android"),
+ (re.compile(r'.*[a-z0-9][.!?]( |$)'),
+ "Punctuation should be avoided"),
+ ],
'Description': [
(re.compile(r'^No description available$'),
"Description yet to be filled"),
- (re.compile(r'[ ]*[*#][^ .]'),
+ (re.compile(r'\s*[*#][^ .]'),
"Invalid bulleted list"),
- (re.compile(r'^ '),
+ (re.compile(r'^\s'),
"Unnecessary leading space"),
- ],
+ (re.compile(r'.*\s$'),
+ "Unnecessary trailing space"),
+ ],
}
-regex_pedantic = {
- 'Web Site': [
- (re.compile(r'.*github\.com/[^/]+/[^/]+\.git'),
- "Appending .git is not necessary"),
- (re.compile(r'.*code\.google\.com/p/[^/]+/[^w]'),
- "Possible incorrect path appended to google code project site"),
- ],
- 'Source Code': [
- (re.compile(r'.*github\.com/[^/]+/[^/]+\.git'),
- "Appending .git is not necessary"),
- (re.compile(r'.*code\.google\.com/p/[^/]+/source/.*'),
- "/source is often enough on its own"),
- ],
- 'Repo': [
- (re.compile(r'^http://.*'),
- "use https:// if available"),
- (re.compile(r'^svn://.*'),
- "use https:// if available"),
- ],
- 'Issue Tracker': [
- (re.compile(r'.*code\.google\.com/p/[^/]+/issues/.*'),
- "/issues is often enough on its own"),
- (re.compile(r'.*github\.com/[^/]+/[^/]+/issues/.*'),
- "/issues is often enough on its own"),
- ],
- 'Summary': [
- (re.compile(r'.*\b(free software|open source)\b.*', re.IGNORECASE),
- "No need to specify that the app is Free Software"),
- (re.compile(r'.*[a-z0-9][.,!?][ $]'),
- "Punctuation should be avoided"),
- ],
- }
+categories = Set([
+ "Children",
+ "Development",
+ "Games",
+ "Internet",
+ "Multimedia",
+ "Navigation",
+ "Office",
+ "Phone & SMS",
+ "Reading",
+ "Science & Education",
+ "Security",
+ "System",
+ "Wallpaper",
+])
+
+desc_url = re.compile("[^[]\[([^ ]+)( |\]|$)")
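`desc_url` pulls the URL out of `[url link text]` style links in description text, which the lint loop then runs through `http_warnings`. A small check of the simple case (group 1 captures up to the first space; the leading `[^[]` guard skips `[[` sequences):

```python
import re

# same pattern as in the patch: the "[" must be preceded by some
# character other than another "[", and group(1) captures the URL
desc_url = re.compile(r"[^[]\[([^ ]+)( |\]|$)")

text = 'See [https://f-droid.org the F-Droid site] for details.'
urls = [m.group(1) for m in desc_url.finditer(text)]
```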
def main():
@@ -142,18 +143,12 @@ def main():
print ' %s' % message
count['warn'] += 1
- def pwarn(message):
- if options.pedantic:
- warn(message)
-
# Parse command line...
parser = OptionParser(usage="Usage: %prog [options] [APPID [APPID ...]]")
parser.add_option("-v", "--verbose", action="store_true", default=False,
help="Spew out even more information than normal")
parser.add_option("-q", "--quiet", action="store_true", default=False,
help="Restrict output to warnings and errors")
- parser.add_option("-p", "--pedantic", action="store_true", default=False,
- help="Show pedantic warnings that might give false positives")
(options, args) = parser.parse_args()
config = common.read_config(options)
@@ -162,22 +157,36 @@ def main():
allapps = metadata.read_metadata(xref=False)
apps = common.read_app_args(args, allapps, False)
- for appid, app in apps.iteritems():
- curid = appid
- lastcommit = ''
+ filling_ucms = re.compile('^(Tags.*|RepoManifest.*)')
+ for appid, app in apps.iteritems():
if app['Disabled']:
continue
+ curid = appid
+ count['app_total'] += 1
+
+ # enabled_builds = 0
+ lowest_vercode = -1
+ curbuild = None
for build in app['builds']:
- if build['commit'] and not build['disable']:
- lastcommit = build['commit']
+ if not build['disable']:
+ # enabled_builds += 1
+ vercode = int(build['vercode'])
+ if lowest_vercode == -1 or vercode < lowest_vercode:
+ lowest_vercode = vercode
+ if not curbuild or int(build['vercode']) > int(curbuild['vercode']):
+ curbuild = build
- # Potentially incorrect UCM
- if (app['Update Check Mode'] == 'RepoManifest' and
- any(s in lastcommit for s in '.,_-/')):
- pwarn("Last used commit '%s' looks like a tag, but Update Check Mode is '%s'" % (
- lastcommit, app['Update Check Mode']))
+ # Incorrect UCM
+ if (curbuild and curbuild['commit']
+ and app['Update Check Mode'] == 'RepoManifest'
+ and not curbuild['commit'].startswith('unknown')
+ and curbuild['vercode'] == app['Current Version Code']
+ and not curbuild['forcevercode']
+ and any(s in curbuild['commit'] for s in '.,_-/')):
+ warn("Last used commit '%s' looks like a tag, but Update Check Mode is '%s'" % (
+ curbuild['commit'], app['Update Check Mode']))
# Summary size limit
summ_chars = len(app['Summary'])
@@ -189,42 +198,90 @@ def main():
if app['Web Site'] and app['Source Code']:
if app['Web Site'].lower() == app['Source Code'].lower():
warn("Website '%s' is just the app's source code link" % app['Web Site'])
- app['Web Site'] = ''
+
+ if filling_ucms.match(app['Update Check Mode']):
+ if all(app[f] == metadata.app_defaults[f] for f in [
+ 'Auto Name',
+ 'Current Version',
+ 'Current Version Code',
+ ]):
+ warn("UCM is set but it looks like checkupdates hasn't been run yet")
+
+ if app['Update Check Name'] == appid:
+ warn("Update Check Name is set to the known app id - it can be removed")
+
+ cvc = int(app['Current Version Code'])
+ if cvc > 0 and cvc < lowest_vercode:
+ warn("Current Version Code is lower than any enabled build")
+
+ # Missing or incorrect categories
+ if not app['Categories']:
+ warn("Categories are not set")
+ for categ in app['Categories']:
+ if categ not in categories:
+ warn("Category '%s' is not valid" % categ)
+
+ if app['Name'] and app['Name'] == app['Auto Name']:
+ warn("Name '%s' is just the auto name" % app['Name'])
name = app['Name'] or app['Auto Name']
if app['Summary'] and name:
if app['Summary'].lower() == name.lower():
warn("Summary '%s' is just the app's name" % app['Summary'])
- if app['Summary'] and app['Description'] and len(app['Description']) == 1:
- if app['Summary'].lower() == app['Description'][0].lower():
+ desc = app['Description']
+ if app['Summary'] and desc and len(desc) == 1:
+ if app['Summary'].lower() == desc[0].lower():
warn("Description '%s' is just the app's summary" % app['Summary'])
# Description size limit
- desc_chars = sum(len(l) for l in app['Description'])
- if desc_chars > config['char_limits']['Description']:
+ desc_charcount = sum(len(l) for l in desc)
+ if desc_charcount > config['char_limits']['Description']:
warn("Description of length %s is over the %i char limit" % (
- desc_chars, config['char_limits']['Description']))
+ desc_charcount, config['char_limits']['Description']))
+
+ if (not desc[0] or not desc[-1]
+ or any(not desc[l - 1] and not desc[l] for l in range(1, len(desc)))):
+ warn("Description has an extra empty line")
+
+ # Check for lists using the wrong characters
+ validchars = ['*', '#']
+ lchar = ''
+ lcount = 0
+ for l in app['Description']:
+ if len(l) < 1:
+ continue
+
+ for um in desc_url.finditer(l):
+ url = um.group(1)
+ for m, r in http_warnings:
+ if m.match(url):
+ warn("URL '%s' in Description: %s" % (url, r))
+
+ c = l.decode('utf-8')[0]
+ if c == lchar:
+ lcount += 1
+ if lcount > 3 and lchar not in validchars:
+ warn("Description has a list (%s) but it isn't bulleted (*) or numbered (#)" % lchar)
+ break
+ else:
+ lchar = c
+ lcount = 1
# Regex checks in all kinds of fields
for f in regex_warnings:
for m, r in regex_warnings[f]:
- t = metadata.metafieldtype(f)
- if t == 'string':
- if m.match(app[f]):
- warn("%s '%s': %s" % (f, app[f], r))
- elif t == 'multiline':
- for l in app[f]:
+ v = app[f]
+ if v is None:
+ continue
+ if type(v) == str:
+ if m.match(v):
+ warn("%s '%s': %s" % (f, v, r))
+ elif type(v) == list:
+ for l in v:
if m.match(l):
warn("%s at line '%s': %s" % (f, l, r))
- # Regex pedantic checks in all kinds of fields
- if options.pedantic:
- for f in regex_pedantic:
- for m, r in regex_pedantic[f]:
- if m.match(app[f]):
- warn("%s '%s': %s" % (f, app[f], r))
-
# Build warnings
for build in app['builds']:
if build['disable']:
@@ -238,18 +295,14 @@ def main():
if ref.startswith(s):
warn("Branch '%s' used as commit in srclib '%s'" % (
s, srclib))
- for s in ['git clone', 'git svn clone', 'svn checkout', 'svn co', 'hg clone']:
- for flag in ['init', 'prebuild', 'build']:
- if not build[flag]:
- continue
- if s in build[flag]:
- # TODO: This should not be pedantic!
- pwarn("'%s' used in %s '%s'" % (s, flag, build[flag]))
if not curid:
print
- logging.info("Found a total of %i warnings in %i apps." % (count['warn'], count['app']))
+ logging.info("Found a total of %i warnings in %i apps out of %i total." % (
+ count['warn'], count['app'], count['app_total']))
+
+ sys.exit(1 if count['warn'] > 0 else 0)
if __name__ == "__main__":
main()
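As a side note for reviewers: the list-detection hunk above counts consecutive lines that start with the same character and warns once four of them share a lead character that is neither `*` nor `#`. A minimal standalone sketch of that heuristic (names here are illustrative, not fdroidserver API):

```python
# Sketch of the lint heuristic: if more than three consecutive non-empty
# description lines begin with the same character, and that character is
# not one of the supported bullet markers ('*' for bullets, '#' for
# numbering), the description likely uses an unsupported list style.

VALID_LIST_CHARS = ('*', '#')


def find_bad_list_char(lines):
    """Return the offending lead character, or None if no bad list is found."""
    lchar = ''
    lcount = 0
    for line in lines:
        if not line:
            # Empty lines are skipped without resetting the counter,
            # matching the behaviour of the hunk above.
            continue
        c = line[0]
        if c == lchar:
            lcount += 1
            if lcount > 3 and lchar not in VALID_LIST_CHARS:
                return lchar
        else:
            lchar = c
            lcount = 1
    return None
```

For example, four lines starting with `-` trigger the warning, while the same list written with `*` passes.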
diff --git a/fdroidserver/metadata.py b/fdroidserver/metadata.py
index 77ee09c8..ae20c472 100644
--- a/fdroidserver/metadata.py
+++ b/fdroidserver/metadata.py
@@ -25,10 +25,13 @@ import logging
from collections import OrderedDict
+import common
+
srclibs = None
class MetaDataException(Exception):
+
def __init__(self, value):
self.value = value
@@ -45,6 +48,7 @@ app_defaults = OrderedDict([
('Web Site', ''),
('Source Code', ''),
('Issue Tracker', ''),
+ ('Changelog', ''),
('Donate', None),
('FlattrID', None),
('Bitcoin', None),
@@ -57,6 +61,7 @@ app_defaults = OrderedDict([
('Requires Root', False),
('Repo Type', ''),
('Repo', ''),
+ ('Binaries', None),
('Maintainer Notes', []),
('Archive Policy', None),
('Auto Update Mode', 'None'),
@@ -68,7 +73,7 @@ app_defaults = OrderedDict([
('Current Version', ''),
('Current Version Code', '0'),
('No Source Since', ''),
- ])
+])
# In the order in which they are laid out on files
@@ -98,10 +103,12 @@ flag_defaults = OrderedDict([
('scandelete', []),
('build', ''),
('buildjni', []),
+ ('ndk', 'r10e'), # defaults to latest
('preassemble', []),
- ('antcommand', None),
+ ('gradleprops', []),
+ ('antcommands', None),
('novcheck', False),
- ])
+])
# Designates a metadata field type and checks that it matches
@@ -164,7 +171,7 @@ valuetypes = {
FieldValidator("HTTP link",
r'^http[s]?://', None,
- ["Web Site", "Source Code", "Issue Tracker", "Donate"], []),
+ ["Web Site", "Source Code", "Issue Tracker", "Changelog", "Donate"], []),
FieldValidator("Bitcoin address",
r'^[a-zA-Z0-9]{27,34}$', None,
@@ -197,6 +204,11 @@ valuetypes = {
["Repo Type"],
[]),
+ FieldValidator("Binaries",
+ r'^http[s]?://', None,
+ ["Binaries"],
+ []),
+
FieldValidator("Archive Policy",
r'^[0-9]+ versions$', None,
["Archive Policy"],
@@ -216,7 +228,7 @@ valuetypes = {
r"^(Tags|Tags .+|RepoManifest|RepoManifest/.+|RepoTrunk|HTTP|Static|None)$", None,
["Update Check Mode"],
[])
- }
+}
# Check an app's metadata information for integrity errors
@@ -231,7 +243,7 @@ def check_metadata(info):
# Formatter for descriptions. Create an instance, and call parseline() with
# each line of the description source from the metadata. At the end, call
-# end() and then text_plain, text_wiki and text_html will contain the result.
+# end() and then text_wiki and text_html will contain the result.
class DescriptionFormatter:
stNONE = 0
stPARA = 1
@@ -240,7 +252,6 @@ class DescriptionFormatter:
bold = False
ital = False
state = stNONE
- text_plain = ''
text_wiki = ''
text_html = ''
linkResolver = None
@@ -259,7 +270,6 @@ class DescriptionFormatter:
self.endol()
def endpara(self):
- self.text_plain += '\n'
self.text_html += '&lt;/p&gt;'
self.state = self.stNONE
@@ -339,7 +349,6 @@ class DescriptionFormatter:
def addtext(self, txt):
p, h = self.linkify(txt)
- self.text_plain += p
self.text_html += h
def parseline(self, line):
@@ -352,7 +361,6 @@ class DescriptionFormatter:
self.text_html += '