How to deploy your project pages through GitLab

Erik Huelsmann ehuelsmann at
Thu Dec 13 22:22:42 UTC 2018


On Thu, Dec 13, 2018 at 10:51 PM Raymond Toy <toy.raymond at> wrote:

> First, thanks for looking into this.

No problem. Thanks for the discussion. It sharpens the final solution.

> >> the public_html directory had directories holding other
> >> artifacts like the pdf and html versions of the user manual and other
> >> documentation and, more importantly, the binaries of the various
> >> snapshots and releases.
> >>
> >> How would this be handled?  I could check these items into the repo,
> >> but huge binary blobs in the repo that would just be copied out seems
> >> like not a good idea.
> >
> >
> > Correct. As in: adding the artifacts to a Git repository isn't really a
> great idea. I have 2 solutions: One using GitLab (preferred solution) and
> the other one based on the existing hosting setup on the filesystem.
> >
> > The preferred solution is to upload the files into GitLab using the
> following command:
> >
> >  $ curl --request POST --header "PRIVATE-TOKEN: <YOUR_PRIVATE_TOKEN>" \
> >      --form "file=@/project/cmucl/public_html/downloads/release/21d/cmucl-src-21d.tar.bz2" \
> >      "https://gitlab.common-lisp.net/api/v4/projects/201/uploads"
> >
> > You can find the "201" in the URL above on the project homepage, which
> > lists the "Project ID". The private token can be created on the account
> > profile page (if you need to create one: you need to give it "API" scope;
> > the others aren't required).
> >
> This adds a bit of burden to anyone who is creating binaries for
> cmucl.  I currently have a cron job that runs rsync to grab
> public_html to my local machine for my own backup purposes.  How would
> that work in this scenario?

It probably won't work too well: the uploaded content lives neither in the
/project directories nor in the directory from which GitLab Pages serves
your website. Regardless, I can provide you with the paths to sync, but the
mirror will still point to the GitLab instance for the hosted binaries.
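For a mirror or backup script, one workable pattern might be to record the
URL that GitLab returns on each upload and fetch that directly. A minimal
sketch, assuming project ID 201 on gitlab.common-lisp.net; the JSON response
below (including the hash and the project path in the final URL) is a
made-up example, not output from our instance:

```shell
# Upload (sketch; assumes $GITLAB_TOKEN holds a private token with "API" scope):
#
#   curl --request POST --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
#        --form "file=@cmucl-src-21d.tar.bz2" \
#        "https://gitlab.common-lisp.net/api/v4/projects/201/uploads"
#
# GitLab answers with JSON containing a "url" field; recording that gives a
# mirror something to fetch with plain curl instead of rsync. The response
# below is a fabricated example:
response='{"alt":"cmucl-src-21d.tar.bz2","url":"/uploads/0123abcd/cmucl-src-21d.tar.bz2"}'

# Extract the served path from the JSON (crude, but avoids extra dependencies):
upload_path=$(printf '%s' "$response" | sed -n 's/.*"url":"\([^"]*\)".*/\1/p')

# The path is relative to the project URL (an assumption about this instance;
# "cmucl/cmucl" is a hypothetical project path):
echo "mirror: https://gitlab.common-lisp.net/cmucl/cmucl$upload_path"
```

A mirror would then just loop over the recorded URLs with curl or wget
rather than rsyncing the server's filesystem.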

> >
> > The above is based on the info available in the GitLab API documentation
> > on file uploads.
> >
> > The other option is that we map /project/*/downloads to
> > https://common-lisp.net/project/*/downloads, where the default path
> > /project/*/downloads is a symlink to /project/*/public_html/downloads, in
> > order to make sure that all content in existing "downloads" directories
> > remains correctly served. This option isn't my preference because it still
> > requires people to have SSH access, whereas the first option means we can
> > simply re-use existing GitLab authentication mechanisms (from where I stand
> > as an admin, that is; as a user, you need to create the private token you
> > didn't need before).
> As someone who already has ssh access, this is easiest because it
> preserves what's already on disk. And for the cmucl mirrors, this probably
> preserves most of the existing behavior.
> But wouldn't we need to move public_html out of the way so GitLab
> Pages can serve the pages?

Yes. There's no avoiding moving public_html out of the way.

> But if that's done, then you can't just point your browser to
> https://common-lisp.net/project/cmucl/downloads/
> to see all of the files, right?

Well, I was planning to adjust the Apache configuration to pick up the
existence of /project/cmucl/downloads/ and serve those files on the URL
https://common-lisp.net/project/cmucl/downloads/; that way we can move
public_html, but keep serving what used to be served from
public_html/downloads/.
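To make the intent concrete, a minimal sketch of such an Apache fragment
(the paths and options are assumptions based on the discussion above, not
the actual common-lisp.net configuration):

```apache
# Hypothetical fragment: keep serving the on-disk downloads directory at the
# old URL while GitLab Pages takes over the rest of the site.
# First argument is the URL path, second is the filesystem path.
Alias "/project/cmucl/downloads/" "/project/cmucl/downloads/"
<Directory "/project/cmucl/downloads">
    Options +Indexes
    Require all granted
</Directory>
```

With something like this in place, the existing mirrors and rsync-based
backups of the downloads tree would keep working unchanged.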

> I do agree with you that uploading files to GitLab makes a lot of
> sense.  If I were starting from scratch without any existing mirrors
> and my rsync backup, that seems to be the way to go.

True. I think I'll document my preferred solution in the FAQ, while you can
use the solution that depends on the slightly adjusted Apache config I need
to put in place. Would that config work for you for the uploads while still
depending on GitLab to build your site?



More information about the clo-devel mailing list