Setting up an Asus Flip C302CA Chromebook for R Development

Don’t think of this as a real blog post; it is mostly a loose collection of notes that I took while getting my new Asus Flip C302CA set up with R and RStudio. I’ve had this up and running for just a few days now (and in fact wrote this post using RStudio on it) and I love it. I highly recommend it.

The steps below are not fully tested, so if you run into problems or I have missed something, let me know.


  1. Enter Developer Mode
    • Esc - Refresh - Power
    • Follow directions
    • Takes a while (~30 minutes)
  2. Download crouton
  3. Add crouton integration extension
  4. Create chroot
    • Open crosh - ctrl-alt-T
    • Start bash - shell
    • Install the xfce, touch, xiwi, and extension targets
    • sudo sh ~/Downloads/crouton -e -t xfce,touch,xiwi,extension
      • It’ll ask for a new username and password
      • Since we are encrypting the chroot (with -e), it will also ask for a passphrase. I’m certainly not a security expert, but don’t use the same one as your Google or new chroot password…
      • This takes a while (~15 minutes)
  5. You should now have a working ubuntu install with the xfce desktop available. Fire that up.
    • If you don’t have shell still open, get to that (ctrl-alt-T and shell)
    • type sudo startxfce4
    • Ta-da! Linux!
  6. Now we can start installing all the tools that we need from our xfce window.
    • get to a terminal
    • Install Git
echo "deb https://cran.rstudio.com/bin/linux/ubuntu xenial/" | sudo tee -a /etc/apt/sources.list
gpg --keyserver keyserver.ubuntu.com --recv-key E084DAB9
gpg -a --export E084DAB9 | sudo apt-key add -
sudo apt-get update
sudo apt-get install r-base r-base-dev
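One gotcha with appending to /etc/apt/sources.list like this: re-running the setup duplicates the repo line. Here is a minimal sketch of an append-once guard. A temp file stands in for sources.list and the CRAN repo line is my assumption, so it is safe to run anywhere:

```shell
# Temp file stands in for /etc/apt/sources.list so this sketch is harmless.
sources=$(mktemp)
repo='deb https://cran.rstudio.com/bin/linux/ubuntu xenial/'  # assumed repo line

# -qxF: quiet, whole-line, fixed-string match; append only when absent.
grep -qxF "$repo" "$sources" || echo "$repo" >> "$sources"
grep -qxF "$repo" "$sources" || echo "$repo" >> "$sources"  # second run is a no-op

grep -c "cran" "$sources"  # prints 1: the line was added exactly once
rm -f "$sources"
```

On the real system you would swap the temp file for /etc/apt/sources.list and wrap the echo in sudo tee -a, as above.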
  • Install RStudio
    • I like to live on the edge so I usually have a fairly recent daily build running. You can grab one from the RStudio dailies page.
    • I also delete the .deb since this is on a chromebook. Space will likely be at a bit of a premium.
sudo dpkg -i rstudio-1.1.201-amd64.deb
rm rstudio-1.1.201-amd64.deb
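The C302CA has an Intel Core m3, so the amd64 build in the file name above is the right one; if you are adapting these notes for an ARM Chromebook, check first. A quick sketch from inside the chroot:

```shell
# Print the machine architecture so you grab a matching .deb.
arch=$(uname -m)
echo "$arch"

case "$arch" in
  x86_64)        echo "use the amd64 build" ;;
  arm*|aarch64)  echo "look for an ARM build (or build from source)" ;;
  *)             echo "unexpected architecture: $arch" ;;
esac
```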
  • The following are the notes I had for which libraries I added. My notes were a bit of a mess, so this list may be incomplete or include more than you need.

Some of the basics (i.e. for devtools)

sudo apt-get install libxslt-dev libcurl4-openssl-dev libssl-dev

The spatial stuff. This also adds the ubuntugis PPA so that you can get the latest and greatest.

sudo add-apt-repository ppa:ubuntugis/ppa
sudo apt-get update
sudo apt-get install libgdal-dev libproj-dev
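Once that finishes, a quick way to confirm the GDAL headers are in place before trying rgdal is to look for gdal-config, which ships with libgdal-dev. This check is safe to run anywhere; it just reports what it finds:

```shell
# Sanity check: is the GDAL development stack on the PATH after the install?
if command -v gdal-config >/dev/null 2>&1; then
  echo "GDAL $(gdal-config --version) found"
else
  echo "gdal-config not found -- libgdal-dev may not have installed cleanly"
fi
```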
  7. Working with RStudio on your chromebook

Not a whole lot of details here, just some basic notes I had for myself. First, I am using a 64GB microSD card to give myself some room and I keep all of my projects stored on this card (as well as on GitHub). I just set up a symbolic link to it from my home folder. Something like the following should do the trick.

ln -s /var/host/media/removable/SD\ Card/ projects

With this you can get to the card more easily (e.g. cd ~/projects)
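If you want to try the symlink trick somewhere safe first, here is the same pattern with temp directories standing in for the SD card and home folder (the file name is just an example):

```shell
# Temp dirs stand in for the SD card mount point and the home folder.
sdcard=$(mktemp -d)   # pretend this is /var/host/media/removable/SD Card
home=$(mktemp -d)     # pretend this is ~

ln -s "$sdcard" "$home/projects"

# The link resolves back to the card, so files written through it land there.
readlink "$home/projects"
touch "$home/projects/analysis.R"
ls "$sdcard"          # prints analysis.R: the file really lives on the card

rm -rf "$sdcard" "$home"
```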

I am still playing around with the best way to fire up RStudio. There are two ways I am doing this: either firing up a separate desktop and using RStudio from there, or starting RStudio in its own window. I think I prefer the latter, but time will tell. You already know how to fire up the desktop; from there you can run rstudio from a terminal or find it in your applications menu. For RStudio in its own window, I added this:

alias rstudio="sudo startxiwi rstudio -F"

to my ~/.bashrc in the chromebook (not the chroot!) shell. Then I can fire up rstudio with ctrl-alt-T, then shell, then rstudio.
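To check that the alias took without rebooting anything, you can source the rc file in a fresh bash shell. The sketch below uses a throwaway rc file (so it is safe to run outside the Chromebook shell) and bash's expand_aliases, since aliases are off in non-interactive shells:

```shell
#!/bin/bash
# Throwaway rc file stands in for ~/.bashrc.
rc=$(mktemp)
echo 'alias rstudio="sudo startxiwi rstudio -F"' >> "$rc"

# Non-interactive bash ignores aliases unless expansion is switched on.
shopt -s expand_aliases
source "$rc"

type rstudio   # reports what the alias expands to
rm -f "$rc"
```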

Hopefully you should now be ready to roll with R and R development on your fancy new chromebook! See below for some additional links.

Some related links

Spatial Data Analysis in R: Lightning Demo!

At this year’s NEARC meeting I decided to give a lightning talk on using R as a GIS. As I was working on this I thought, “why not try a lightning demo?” That would be better than five minutes of slides on packages and commands. But, as anyone who has done a live demo will know, they often provide unexpected challenges. Add a 5 minute limit to that, and well, some level of failure is sure to occur. Because of this I have decided to put everything into a web page so that the attendees (and others) can access the full text and code of the demo at a later date.

I finally got quickmapr on CRAN!

A little over 7 months ago I posted about a package that I had been working on, quickmapr. That was the pre-release version; the one that I finished up today and submitted to CRAN is a bit more polished, plots rasters a bit quicker, and is, I think, ready for wider release. It is now available from CRAN.

The README on GitHub provides details plus some examples using a small dataset included with the package. I would be thrilled to get some feedback from people on the package: ease of use, suggestions for improvements, etc. I would be even more thrilled if you try it out on different datasets. Any thoughts, just add them as issues.

Download Shapefiles - Take 2

So back in 2013 I posted a little function I wrote for grabbing all the relevant files that make up a shapefile from a URL. Turns out it doesn’t play so well with Windows 7 or Windows 8 (HT: John Lewis). Below is a reprised version that at least works on Ubuntu 14.04 and Windows 7. I haven’t tested it beyond that, and suppressing the warnings to get httr::GET to not complain too much about FTP seems a bit unclean. Well, you get what you pay for.

For all this to run you’ll need RCurl, httr, sp, and rgdal.

download_shp <- function(shape_url, layer, outfolder = ".") {
  # Make sure the URL ends with a trailing slash
  if (length(grep("/$", shape_url)) == 0) {
    shape_url <- paste(shape_url, "/", sep = "")
  }
  # All of the extensions that might make up a shapefile
  shapefile_ext <- c(".shp", ".shx", ".dbf", ".prj", ".sbn",
                     ".sbx", ".shp.xml", ".fbn", ".fbx", ".ain", ".aih", ".ixs",
                     ".mxs", ".atx", ".cpg")
  # Figure out which of the possible component files actually exist
  xlogic <- NULL
  if (substr(shape_url, 1, 3) == "ftp") {
    # FTP: grab the directory listing and grep it for each file name
    xurl <- RCurl::getURL(shape_url)
    for (i in paste(layer, shapefile_ext, sep = "")) {
      xlogic <- c(xlogic, grepl(i, xurl))
    }
  } else if (substr(shape_url, 1, 4) == "http") {
    # HTTP: a HEAD request per file; status 200 means it is there
    for (i in paste(shape_url, layer, shapefile_ext, sep = "")) {
      xlogic <- c(xlogic, httr::HEAD(i)$status == 200)
    }
  }
  shapefiles <- paste(shape_url, layer, shapefile_ext, sep = "")[xlogic]
  outfiles <- paste(outfolder, "/", layer, shapefile_ext, sep = "")[xlogic]
  if (sum(xlogic) > 0) {
    for (i in 1:length(shapefiles)) {
      x <- suppressWarnings(httr::GET(shapefiles[i],
                                      httr::write_disk(outfiles[i],
                                                       overwrite = TRUE)))
      dwnld_file <- strsplit(shapefiles[i], "/")[[1]]
      dwnld_file <- dwnld_file[length(dwnld_file)]
      print(paste0("Downloaded ", dwnld_file, " to ", outfiles[i], "."))
    }
  } else {
    stop("An error has occurred with the input URL or name of shapefile")
  }
}

And to see that it works again:

#Download the NH State Boundaries
## [1] "Downloaded NHSenateDists2012.shp to ./NHSenateDists2012.shp."
## [1] "Downloaded NHSenateDists2012.shx to ./NHSenateDists2012.shx."
## [1] "Downloaded NHSenateDists2012.dbf to ./NHSenateDists2012.dbf."
## [1] "Downloaded NHSenateDists2012.prj to ./NHSenateDists2012.prj."
## [1] "Downloaded NHSenateDists2012.sbn to ./NHSenateDists2012.sbn."
## [1] "Downloaded NHSenateDists2012.sbx to ./NHSenateDists2012.sbx."
#Read shapefiles in SpatialPolygonsDataFrame
## OGR data source with driver: ESRI Shapefile 
## Source: ".", layer: "NHSenateDists2012"
## with 24 features
#Plot it

[Plot of the NHSenateDists2012 layer]

quickmapr: An R package for mapping and interacting with spatial data

I do a lot of GIS. That used to mean firing up any one of the esri products, but over the course of the last couple of years I have done that less and less and instead fired up R.

When I first started using R for my spatial analysis work I often was left struggling with viewing the results of my analysis and could only do so with a clunky workflow of pushing my sp or raster objects out to shapefiles or tiffs and then pulling those into Arcmap. In short, spatial data visualization was severely lacking in R.

Fast forward to now, and that has all really started to change. Most of the work in this space has been on incorporating the slew of javascript tools (e.g. D3, leaflet, Crosslet) for visualizing spatial data. This has resulted in some really cool packages that wrap those libraries.

These all result in great looking maps with nice interactivity; however, they all have two things in common. One, it is expected that your data are unprojected (i.e. longitude and latitude), and two, that the data are simple text or JSON (either GeoJSON or TopoJSON). This works for many use cases, but not for mine.

I usually start with small(ish) spatial data that are stored in GIS formats (e.g. shapefiles, esri rasters, file geodatabases, etc.) and are projected. I use rgdal or raster to pull those into R, do whatever it is I am doing to them, and get sp and raster objects as output. At this point all I want to be able to do is quickly visualize the resultant data (usually less than 3 or 4 layers at a time) and interact with it by zooming, panning, and identifying values interactively. I want to be able to do this without having to convert to JSON and without having to un-project the data. The result of this desire is quickmapr.

With quickmapr you set up a qmap object by passing as many sp and raster objects as you’d like. There are some very basic controls on draw order and color. There are several zoom functions, a pan function, an identify function (which also returns the selected sp object or raster value), and a (currently very clunky) labeling function. This package is still a work in progress and I am hoping to keep working on quickmapr and tweaking how it works. I would love feedback, so if you have thoughts, comments, complaints, etc., don’t hesitate to leave some comments here, or better yet, post issues on GitHub, or fork the repo and make changes yourself. I will try to get some contributing guidelines up in the not too distant future.