Inspired by World Backup Day, I decided to take a backup of my laptop. Thanks to using a free operating system, I don't have to back up any of my software, just configuration and data files, which fit on a single DVD.
In order to avoid worrying too much about secure storage and disposal of these backups, I have decided to encrypt them using a standard encrypted loopback filesystem.
(Feel free to leave a comment if you can suggest an easier way of doing this.)
Cryptmount setup
Install cryptmount:
apt-get install cryptmount
and set up two encrypted mount points in /etc/cryptmount/cmtab:
backup {
    dev=/backup.dat
    dir=/backup
    fstype=ext4
    mountoptions=defaults,noatime
    keyfile=/backup.key
    keyhash=sha512
    keycipher=aes-xts-plain64
    keyformat=builtin
    cipher=aes-xts-plain64
}

testbackup {
    dev=/media/cdrom/backup.dat
    dir=/backup
    fstype=ext4
    mountoptions=defaults,noatime,ro,noload
    keyfile=/media/cdrom/backup.key
    keyhash=sha512
    keycipher=aes-xts-plain64
    keyformat=builtin
    cipher=aes-xts-plain64
}
Initialize the encrypted filesystem
Make sure you have at least 4.3 GB of free disk space on / and then run:
mkdir /backup
dd if=/dev/zero of=/backup.dat bs=1M count=4096
cryptmount --generate-key 32 backup
cryptmount --prepare backup
mkfs.ext4 -m 0 /dev/mapper/backup
cryptmount --release backup
Alternatively, if you're using a double-layer DVD, then use this dd line:
dd if=/dev/zero of=/backup.dat bs=1M count=8000
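As a sanity check, both dd sizes above fit their target discs: 4096 MiB is about 4.3 GB, just under a single-layer DVD's nominal 4.7 GB, and 8000 MiB is about 8.4 GB, just under a double-layer disc's 8.5 GB. A quick sketch of the arithmetic (the capacities are the marketed decimal values, and slightly optimistic for some media):

```python
# Nominal DVD capacities in bytes (decimal, as marketed).
DVD_SINGLE_LAYER = 4700 * 1000 * 1000  # 4.7 GB
DVD_DOUBLE_LAYER = 8500 * 1000 * 1000  # 8.5 GB

def fits_on_dvd(image_bytes, double_layer=False):
    """Return True if an image of image_bytes fits on the chosen disc."""
    capacity = DVD_DOUBLE_LAYER if double_layer else DVD_SINGLE_LAYER
    return image_bytes <= capacity

# dd bs=1M count=N produces N MiB (N * 1024 * 1024 bytes):
single_image = 4096 * 1024 * 1024  # ~4.29e9 bytes, fits a single-layer disc
double_image = 8000 * 1024 * 1024  # ~8.39e9 bytes, needs a double-layer disc
```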
Burn the data to a DVD
Mount the newly created partition:
cryptmount backup
and then copy the files you want to /backup/
before unmounting that partition:
cryptmount -u backup
Finally, use your favourite DVD-burning program to burn these files:
/backup.dat
/backup.key
/etc/cryptmount/cmtab
Test your backup
Before deleting these two files, test the DVD you've just burned by mounting it:
mount /cdrom
cryptmount testbackup
and looking at a random sampling of the files contained in /backup.
Once you are satisfied that your backup is fine, unmount the DVD:
cryptmount -u testbackup
umount /cdrom
and remove the temporary files:
rm /backup.dat /backup.key
Plupload is a reusable component which takes full advantage of the file upload capabilities of your browser and its plugins. It will use HTML5, Flash and Google Gears if they are available, but otherwise it can gracefully degrade to HTML4 if it needs to. Here's how it can be integrated within a Django web application.
(I have posted the sample application I will refer to and you may use it any way you like.)
Creating a basic upload form
The first step is to create a simple one-file upload form that will be used in the case where Javascript is disabled:
class UploadForm(forms.Form):
    file = forms.FileField()

    def save(self, uploaded_file):
        print 'File "%s" would presumably be saved to disk now.' % uploaded_file
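In a real application, save() would write the upload to disk instead of just printing. A minimal sketch of what that could look like (the destination directory is hypothetical; the object only needs Django's UploadedFile interface, i.e. a name attribute and a chunks() iterator):

```python
import os

def save_uploaded_file(uploaded_file, destination_dir='/tmp/uploads'):
    """Write an uploaded file to disk in chunks to keep memory use low."""
    if not os.path.isdir(destination_dir):
        os.makedirs(destination_dir)
    # basename() guards against a client-supplied name containing path parts
    path = os.path.join(destination_dir, os.path.basename(uploaded_file.name))
    with open(path, 'wb') as destination:
        for chunk in uploaded_file.chunks():
            destination.write(chunk)
    return path
```

Iterating over chunks() rather than reading the whole file at once is what lets Django handle uploads larger than available memory.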
Then you can add this form to one of your templates:
<form enctype="multipart/form-data" action="{% url plupload_sample.upload.views.upload_file %}" method="post">
    {% csrf_token %}
    <div id="uploader">
        {{form.file.errors}}{{form.file}}
        <input type="submit" value="Upload" />
    </div>
</form>
And create a new method to receive the form data:
@csrf_protect
def upload_file(request):
    if request.method == 'POST':
        form = UploadForm(request.POST, request.FILES)
        if form.is_valid():
            uploaded_file = request.FILES['file']
            form.save(uploaded_file)
            return HttpResponseRedirect(reverse('plupload_sample.upload.views.upload_file'))
    else:
        form = UploadForm()
    return render_to_response('upload_file.html', {'form': form}, context_instance=RequestContext(request))
Adding Plupload to the template
In order to display the right Javascript-based upload form, add the following code, based on the official example, to the head of your template:
<link rel="stylesheet" href="/css/plupload.queue.css" type="text/css">
<script type="text/javascript" src="/js/jquery.min.js"></script>
<script type="text/javascript" src="/js/plupload.full.min.js"></script>
<script type="text/javascript" src="/js/jquery.plupload.queue.min.js"></script>
<script type="text/javascript">
$(function() {
    $("#uploader").pluploadQueue({
        runtimes : 'html5,html4',
        url : '{% url plupload_sample.upload.views.upload_file %}',
        max_file_size : '1mb',
        chunk_size : '1mb',
        unique_names : false,
        multipart : true,
        headers : {'X-Requested-With' : 'XMLHttpRequest', 'X-CSRFToken' : '{{csrf_token}}'}
    });
    $('form').submit(function(e) {
        var uploader = $('#uploader').pluploadQueue();
        // Validate the number of uploaded files
        if (uploader.total.uploaded == 0) {
            // Files are still in the queue; upload them first
            if (uploader.files.length > 0) {
                // Once all files are uploaded, submit the form
                uploader.bind('UploadProgress', function() {
                    if (uploader.total.uploaded == uploader.files.length)
                        $('form').submit();
                });
                uploader.start();
            } else {
                alert('You must upload at least one file.');
            }
            e.preventDefault();
        }
    });
});
</script>
Pay close attention to the extra headers that need to be added to ensure that the AJAX requests will pass the Django CSRF checks:
X-Requested-With: XMLHttpRequest
X-CSRFToken: {{csrf_token}}
Adding Plupload to the view method
Now in order to properly receive the files uploaded by Plupload via AJAX calls, we need to revise our upload_file() method:
@csrf_protect
def upload_file(request):
    if request.method == 'POST':
        form = UploadForm(request.POST, request.FILES)
        if form.is_valid():
            uploaded_file = request.FILES['file']
            form.save(uploaded_file)
            if request.is_ajax():
                response = HttpResponse('{"jsonrpc" : "2.0", "result" : null, "id" : "id"}', mimetype='text/plain; charset=UTF-8')
                response['Expires'] = 'Mon, 1 Jan 2000 01:00:00 GMT'
                response['Cache-Control'] = 'no-store, no-cache, must-revalidate, post-check=0, pre-check=0'
                response['Pragma'] = 'no-cache'
                return response
            else:
                return HttpResponseRedirect(reverse('plupload_sample.upload.views.upload_file'))
    else:
        form = UploadForm()
    return render_to_response('upload_file.html', {'form': form}, context_instance=RequestContext(request))
The above includes the response (which is not really documented as far as I can tell) that needs to be sent back to Plupload to make sure it knows that the file has been received successfully:
{"jsonrpc" : "2.0", "result" : null, "id" : "id"}
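Rather than hard-coding the string, the same acknowledgement could be built with the json module, which guarantees valid JSON if you later need to vary the id field (a sketch, not what the code above does):

```python
import json

def plupload_ack(request_id='id'):
    """Build the JSON-RPC 2.0 acknowledgement body that Plupload expects."""
    # None serializes to JSON null, matching the hand-written response
    return json.dumps({'jsonrpc': '2.0', 'result': None, 'id': request_id})
```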
Adding support for multipart files
Our solution so far works fine except when uploading large files that need to be sent in multiple chunks.
This involves writing to a temporary file until all parts have been received:
@csrf_protect
def upload_file(request):
    if request.method == 'POST':
        form = UploadForm(request.POST, request.FILES)
        if form.is_valid():
            uploaded_file = request.FILES['file']
            chunk = request.REQUEST.get('chunk', '0')
            chunks = request.REQUEST.get('chunks', '0')
            name = request.REQUEST.get('name', '')
            if not name:
                name = uploaded_file.name
            temp_file = '/tmp/insecure.tmp'
            with open(temp_file, ('wb' if chunk == '0' else 'ab')) as f:
                for content in uploaded_file.chunks():
                    f.write(content)
            if int(chunk) + 1 >= int(chunks):
                form.save(temp_file, name)
            if request.is_ajax():
                response = HttpResponse('{"jsonrpc" : "2.0", "result" : null, "id" : "id"}', mimetype='text/plain; charset=UTF-8')
                response['Expires'] = 'Mon, 1 Jan 2000 01:00:00 GMT'
                response['Cache-Control'] = 'no-store, no-cache, must-revalidate, post-check=0, pre-check=0'
                response['Pragma'] = 'no-cache'
                return response
            else:
                return HttpResponseRedirect(reverse('plupload_sample.upload.views.upload_file'))
    else:
        form = UploadForm()
    return render_to_response('upload_file.html', {'form': form}, context_instance=RequestContext(request))
Note that I have used /tmp/insecure.tmp for brevity. In a real application, you do need to use a secure mechanism to create the temporary file or you would expose yourself to a tempfile vulnerability.
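One safer option is the standard library's tempfile module, which creates the file atomically with restrictive permissions. A sketch (in a chunked-upload scenario you would still need to persist the returned path between requests, e.g. in the session, which is left out here):

```python
import os
import tempfile

def create_upload_tempfile():
    """Create a private temporary file safely and return its path.

    mkstemp() creates the file atomically with mode 0600, so another
    local user cannot pre-create or symlink the predictable path that
    a fixed name like /tmp/insecure.tmp would allow.
    """
    fd, path = tempfile.mkstemp(prefix='upload-', suffix='.part')
    os.close(fd)  # reopen later in 'ab' mode as each chunk arrives
    return path
```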
The Mahara project has just moved to mandatory code reviews for every commit that gets applied to core code.
Here is a description of how Gerrit Code Review, the peer-review system used by Android, was retrofitted into our existing git repository on Gitorious.
(If you want to know more about Gerrit, listen to this FLOSS Weekly interview.)
Replacing existing Gitorious committers with a robot
The first thing to do was to log into Gitorious and remove commit rights from everyone in the main repository. Then I created a new maharabot account with a password-less SSH key (stored under /home/gerrit/.ssh/) and made that new account the sole committer.
This is to ensure that nobody pushes to the repository by mistake since all of these changes would be overwritten by Gerrit.
Basic Gerrit installation
After going through the installation instructions, I logged into the Gerrit admin interface and created a new "mahara" project.
I picked the "merge if necessary" submit action because "cherry-pick" would disable dependency tracking which is quite a handy feature.
Reverse proxy using Nginx
Since we wanted to offer Gerrit over HTTPS, I decided to run it behind an Nginx proxy. This is the Nginx configuration I ended up with:
server {
    listen 443;
    server_name reviews.mahara.org;

    add_header Strict-Transport-Security max-age=15768000;

    ssl on;
    ssl_certificate /etc/ssl/certs/reviews.mahara.org.crt;
    ssl_certificate_key /etc/ssl/certs/reviews.mahara.org.pem;
    ssl_session_timeout 5m;
    ssl_session_cache shared:SSL:1m;
    ssl_protocols TLSv1;
    ssl_ciphers HIGH:!ADH;
    ssl_prefer_server_ciphers on;

    location / {
        proxy_pass http://127.0.0.1:8081;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header Host $host;
    }
}
Things to note:
An HTTP to HTTPS redirection is not provided.
The HSTS header indicates to modern browsers that this URL should only be accessed via HTTPS.
Only strong SSL ciphers are enabled.
Before proxying requests to Gerrit, Nginx adds a few headers to identify the origin of the request. The Host header in particular must be present, otherwise built-in Gerrit redirections will fail.
Mail setup
To enable Gerrit to email reviewers and committers, I installed Postfix and used "reviews.mahara.org" as the "System mail name".
Then I added the following to /home/gerrit/mahara_reviews/etc/gerrit.config:
[user]
    email = "[email protected]"
to fix the From address in outgoing emails.
Init script and cron
Following the installation instructions, I created these symlinks:
ln -s /home/gerrit/mahara_reviews/bin/gerrit.sh /etc/init.d/gerrit
cd /etc/rc2.d && ln -s ../init.d/gerrit S19gerrit
cd /etc/rc3.d && ln -s ../init.d/gerrit S19gerrit
cd /etc/rc4.d && ln -s ../init.d/gerrit S19gerrit
cd /etc/rc5.d && ln -s ../init.d/gerrit S19gerrit
cd /etc/rc0.d && ln -s ../init.d/gerrit K21gerrit
cd /etc/rc1.d && ln -s ../init.d/gerrit K21gerrit
cd /etc/rc6.d && ln -s ../init.d/gerrit K21gerrit
and put the following settings into /etc/default/gerritcodereview:
GERRIT_SITE=/home/gerrit/mahara_reviews
GERRIT_USER=gerrit
GERRIT_WAR=/home/gerrit/gerrit.war
to automatically start and stop Gerrit.
I also added a cron job in /etc/cron.d/gitcleanup to ensure that the built-in git repository doesn't get bloated:
MAILTO=[email protected]
20 4 * * * gerrit GIT_DIR=/home/gerrit/mahara_reviews/git/mahara.git git gc --quiet
Configuration enhancements
To allow images in change requests to be displayed inside the browser, I marked them as safe in /home/gerrit/mahara_reviews/etc/gerrit.config:
[mimetype "image/*"]
    safe = true
Another thing I did to enhance the review experience was to enable the gitweb repository browser:
apt-get install gitweb
and to make checkouts faster by enabling anonymous Git access:
[gerrit]
    canonicalGitUrl = git://reviews.mahara.org/git/
[download]
    scheme = ssh
    scheme = anon_http
    scheme = anon_git
which requires that you have a git daemon running and listening on port 9418:
apt-get install git-daemon-run
ln -s /home/gerrit/mahara_reviews/git/mahara.git /var/cache/git/
touch /home/gerrit/mahara_reviews/git/mahara.git/git-daemon-export-ok
Finally, I included the Mahara branding in the header and footer of each page by providing valid XHTML fragments in /home/gerrit/mahara_reviews/etc/GerritSiteHeader.html and GerritSiteFooter.html.
Initial import and replication
Once Gerrit was fully working, I performed the initial code import by using my administrator account to push the existing Gitorious branches to the internal git repository:
git remote add gerrit ssh://username@reviews.mahara.org:29418/mahara
git push gerrit 1.2_STABLE
git push gerrit 1.3_STABLE
git push gerrit master
Note that I had to temporarily disable "Require Change IDs" in the project settings in order to import the old commits which didn't have these.
To replicate the internal Gerrit repository back to Gitorious, I created a new /home/gerrit/mahara_reviews/etc/replication.config file:
[remote "gitorious"]
    url = gitorious.org:mahara/${name}.git
    push = +refs/heads/*:refs/heads/*
    push = +refs/tags/*:refs/tags/*
(The ${name} variable is required even when you have a single project.)
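Gerrit substitutes ${name} with the project name when it pushes to each remote. The syntax happens to behave like Python's string.Template, which makes the expansion easy to illustrate (this is only an illustration of the substitution, not Gerrit's implementation):

```python
from string import Template

# The url value from replication.config, with ${name} as the placeholder
url_template = Template('gitorious.org:mahara/${name}.git')

# For the single "mahara" project, the pushed-to URL becomes:
url = url_template.substitute(name='mahara')
```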
Contributor instructions
This is how developers can get a working checkout of our code now:
git clone git://gitorious.org/mahara/mahara.git
cd mahara
git remote add gerrit ssh://username@reviews.mahara.org:29418/mahara
git fetch gerrit
scp -p -P 29418 reviews.mahara.org:hooks/commit-msg .git/hooks/
and this is how they can submit local changes to Gerrit:
git push gerrit HEAD:refs/for/master
Anybody can submit change requests or comment on them, but make sure you do not have the Cookie Pie Firefox extension installed, or you will be unable to log into Gerrit.