Monthly Archives: August 2015

Can’t keep a good bathymetry figure down

Wowee we wow. It’s been a couple of weeks of mad admin and bathymetry. It has consumed my consciousness. The team kelp jackets are coming along. A couple of snags here and there, but there is enough momentum behind it now, due to the many interested/ involved parties, that it will drag itself forward inexorably from now on. Knock wood… A couple of diving misadventures were planned and fell flat too, unfortunately. I haven’t blown bubbles for at least a couple of months now. Urrgg. The planning for the West Coast Trip started in earnest late last week and has almost finished itself already. So that is a positive result. There are a few bits that will require more direct pressure to come to fruition, so stay tuned on that. But a general quote on the cost exists. I hope SAEON is chipping in on this…

But let’s talk about bathymetry! Because who doesn’t love that topic?!? After having familiarised myself with the many different ways to plot bathymetry in R, I created the figure posted previously showing a bathy plot and ribbon together for South Africa. I have since learned how to automate this process for anywhere in the world and have applied this code to the four EBUS cells, namely the Benguela, Humboldt, Canary and California current upwelling cells. I think these look pretty sweet. They still need international borders and some more tweaking, but that can wait until the final polish is being applied before submission for publication. The ultimate goal is to combine the statistics that are behind these figures with measures of upwelling strength to author a paper on the interactions between shelf characteristics and upwelling strengths around the world. That is going to be tasty.
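The automation boils down to downloading bathymetry for an arbitrary bounding box and plotting it. A minimal sketch of that idea using the marmap package (illustrative only; the Benguela bounding box below is a rough guess, and the real figures involve more polish than this):

# A minimal sketch, not the final plotting code; the Benguela
# bounding box and colours here are illustrative placeholders
library(marmap)
library(ggplot2)

# Download ETOPO1 bathymetry for a given bounding box from NOAA
benguela <- getNOAA.bathy(lon1 = 10, lon2 = 20, lat1 = -35, lat2 = -25,
                          resolution = 4)

# Convert to a data frame for plotting with ggplot2
benguela_df <- fortify.bathy(benguela)

ggplot(benguela_df, aes(x = x, y = y)) +
  geom_raster(aes(fill = z)) +
  geom_contour(aes(z = z), breaks = -200, colour = "black") + # shelf edge
  scale_fill_gradient(low = "navy", high = "lightblue", name = "Depth (m)")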

I probably won’t get my greedy mitts on those data for a while yet, however. So in the meantime I will return to the previous chapter, as that is still hanging. The statisticians I consulted on my methodology have no strong opinions on the correctness of it… So I suppose that is as good as a green light… Everything has been computed, so I need to revisit the write-up, polish the most recent graphical additions and then push it out the door. I have a list of potential journals and need to get over my avoidance of learning the publishing process. It is unfortunately a mental barrier for me.

Bathymetry III

After a couple more half days plugging away at the issues I had failed to notice on the first pass of the bathymetry coding, I finally sorted out the bugs and the new bathymetry figure is looking tasty. It could use some tweaking here and there, but that will hold for another day.

For now I will focus on kelp research, as all of my other projects are currently on hold while I wait for my contacts to get back to me about this or that. I am hoping that I will see something in the kelp literature to give me an idea for the chapter (or two) in my thesis that must handle kelps, preferably in the context of climate change.

So ya, a pretty productive Tuesday all around.

Bathymetry again

The quest for a good bathymetry figure continues. Over the last couple of days I have put the aforementioned figure together, but it still needs a lot of work. The top panel shows the bathymetry off the coast of SA in delightful shades of blue at a fairly high spatial resolution. This figure takes my computer almost a minute to compile. That’s pretty hefty. The bottom panel shows the shelf width as calculated by my shelf width function. The width values are geolocated correctly along the coast, as can be seen by matching up the labelled cities on the upper panel with the labels on the x axis in the lower panel. One doesn’t need to look very closely, though, to see that these values are far off from where they should be… It turns out my shore normal transects aren’t so normal after all. I need to go back to the drawing board and actually plot the shore normal transects to see what they are doing, because whatever it is, it isn’t right.
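To make that drawing board concrete, the check is simply to draw each transect over the coastline and eyeball whether it leaves the shore at a right angle. A rough sketch of that diagnostic (the ‘transects’ data frame holding each site’s start point and bearing is hypothetical here, as are the column names):

# Rough diagnostic sketch: overlay each shore normal transect on the
# coastline; 'transects' is a hypothetical data frame with columns
# site, lon, lat and heading, and 'coastline' has lon/lat columns
library(geosphere)
library(ggplot2)

plot_transects <- function(transects, coastline){
  lines <- do.call(rbind, lapply(1:nrow(transects), function(i){
    # Points every 10 km for 300 km along the transect bearing
    pts <- destPoint(p = c(transects$lon[i], transects$lat[i]),
                     b = transects$heading[i],
                     d = seq(0, 300000, by = 10000))
    data.frame(site = transects$site[i], lon = pts[,1], lat = pts[,2])
  }))
  ggplot() +
    geom_path(data = coastline, aes(x = lon, y = lat)) +
    geom_path(data = lines, aes(x = lon, y = lat, group = site), colour = "red")
}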

So that is the next step in this process: straighten out the transects. This means that all of the statistics I calculated previously were not correct either, so I will need to start over from the beginning. But all of the code that comes afterwards will still be usable, so it’s not much of a setback really. The odd thing is that I used code I had previously written to create shore normal transects, which had worked quite well, so it is strange that it isn’t working now. It’s probably just some small hiccup somewhere. Also, it would probably help if I smoothed out the coastline shapefile before calculating the shore normal transects. That will be the first step in the redo (see the sketch below). And then I need to make a plan for when the shore normal transects are in a bay, and so don’t properly project out over the shelf edge. But that’s enough bathymetry for now. Going to do a bit more research on kelp to give the coding side of my brain a rest.
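The smoothing itself should be simple; a minimal sketch using a centred running mean over the coastline coordinates (the window width k is a placeholder to be tuned):

# Minimal smoothing sketch; the window width k is a guess to be tuned
smooth_coast <- function(coastline, k = 5){
  coastline$lon <- as.numeric(stats::filter(coastline$lon, rep(1/k, k)))
  coastline$lat <- as.numeric(stats::filter(coastline$lat, rep(1/k, k)))
  coastline[complete.cases(coastline), ] # filter() leaves NAs at the ends
}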


Bathymetry

The code for the bathymetry function I have had knocking around in my head has been brought into existence. It is still pretty rough, but it runs. It could probably be sped up a decent amount, as running this on ~600 lines of data takes about 12 minutes… Not great. Besides that, it’s also completely dependent on a coastline shapefile and a bathymetry .asc file saved on my computer, meaning one can only calculate shelf width and slope within the boundaries of the southern African coast. The ideal situation would be for this to be able to run anywhere in the world and to use downloading on demand to retrieve the bathy data. Or perhaps I could just get the bathy data for the whole world and save that as an .RData file… That could work actually. Hmm, something to look into.
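The .RData idea would look something like the sketch below, assuming the marmap package for the download (the file name and resolution are placeholders):

# Sketch of the download-once, load-forever idea; the file name and
# resolution are placeholders, and marmap is an assumed dependency
library(marmap)

if(!file.exists("bathy_world.RData")){
  bathy_world <- getNOAA.bathy(lon1 = -180, lon2 = 180, lat1 = -90, lat2 = 90,
                               resolution = 10)
  save(bathy_world, file = "bathy_world.RData")
} else {
  load("bathy_world.RData")
}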

Anyway, the code shows how to calculate the width of the continental shelf (calculated here as everything above -200 m) at a given point along the coast. This is done by drawing a shore normal transect from the point. Besides shelf width, this function also calculates how wide the shelf slope is (-200 to -1000 m) and uses this to derive the angle of the continental shelf. This last bit I am not 100% sure is calculated correctly, as the angles are exceedingly small, with the largest being only 0.2 degrees whereas the global average is 4.0 degrees. Though perhaps I am calculating this differently.
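As a quick sanity check on the units (the 20 km figure is just an illustrative slope width): a slope descending from -200 m to -1000 m over 20 km of horizontal distance should give atan(0.8 / 20) × 180/π ≈ 2.3°. Values orders of magnitude smaller than that point to a metres-versus-kilometres mix-up, or a tan() where an atan() should be, which is what the corrected angle line in the function below guards against.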


shelfWidthSlope <- function(site){
  # Requires: FNN (knnx.index), fossil (earth.bear), geosphere (destPoint),
  # plus the deg2rad() and gcd.hf() haversine helpers defined elsewhere
  # Find the site on the coastline and its nearest neighbour points
  coords <- data.frame(lon = site$lon, lat = site$lat)
  coords2 <- knnx.index(southern_africa_coast[,1:2], as.matrix(coords), k = 1)
  # The +/- offset on the next line controls how wide the spread of points
  # used to determine the shore-normal transect is; a wider spread gives a
  # more even, less sensitive transect (the offset of 2 is illustrative)
  coords3 <- data.frame(site = site$site,
                        southern_africa_coast[c(coords2 - 2, coords2 + 2), 1:2])
  coords3 <- coords3[2:1, 1:3]
  # Define the shore normal transect bearing
  heading <- earth.bear(coords3[1,2], coords3[1,3], coords3[2,2], coords3[2,3]) + 90
  if(heading >= 360){
    heading <- heading - 360
  }
  # Find the depths along this bearing for 300 km
  distances <- seq(from = 0, to = 300000, by = 1000)
  distances2 <- destPoint(p = coords, b = heading, d = distances)
  sitesIdx <- knnx.index(bathy[,1:2], as.matrix(distances2), k = 1)
  bathy2 <- bathy[sitesIdx,]
  bathy2 <- bathy2[complete.cases(bathy2$depth),]
  if(length(bathy2$depth) < 2){
    info <- data.frame(site = site$site, coords, shelf = NA, slope = NA,
                       depth_diff = NA, angle = NA)
  } else {
    bathy3 <- data.frame(sapply(bathy2[1:2], deg2rad), depth = bathy2$depth)
    # Find the point at which the depth first hits/ exceeds 200m
    shelf_end <- bathy3[bathy3$depth <= -200,][1,]
    # Calculate how far it takes to reach the 200m isobath
    shelf_width <- gcd.hf(bathy3$lon[1], bathy3$lat[1], shelf_end$lon, shelf_end$lat)
    # Find the point at which the depth first hits/ exceeds 1000m
    slope_end <- bathy3[bathy3$depth <= -1000,][1,]
    # Calculate how far it takes to reach the 1000m isobath
    slope_width <- gcd.hf(shelf_end$lon, shelf_end$lat, slope_end$lon, slope_end$lat)
    # From this calculate the shelf slope angle in degrees; depth is in m and
    # gcd.hf() is assumed to return km, hence the /1000, and atan() (not tan())
    # converts the depth:width ratio into an angle
    slope_depth <- abs(slope_end$depth - shelf_end$depth)
    slope_angle <- atan((slope_depth / 1000) / slope_width) * 180 / pi
    # Put it all together
    info <- data.frame(site = site$site, coords, shelf = shelf_width,
                       slope = slope_width, depth_diff = slope_depth,
                       angle = slope_angle)
  }
  return(info)
}
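Applying the function over many sites is then just a row-wise loop. A sketch, where ‘sites’ is a hypothetical data frame with site, lon and lat columns:

# Sketch of running the function over many sites; 'sites' is a hypothetical
# data frame with one row per site and columns: site, lon, lat
results <- do.call(rbind, lapply(1:nrow(sites), function(i){
  shelfWidthSlope(sites[i, ])
}))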

Side projects

It has been a week now with no word back from the statisticians. But I certainly haven’t been sitting still in the meantime. The figures I referenced in the previous post have been completed. In rough form for the moment, but the data and base code are there. When I have a better idea of how it will all tie together, I will polish them off.

After completing as much of my chapter as I could, I moved on to my list of side projects. First up was how to inset map figures within one another (a sketch of the general approach follows this paragraph). I also gave my whole workflow for coastal and country border mapping a major overhaul. This really helped to improve the quality of my land/ sea based figures. This was then applied to one of the figures that AJ is using for his upcoming publication. It looks real nice. In between these mapping projects I have also been organising the Team Kelp jackets. The logos are all done. Just waiting to hear back on quotes from two more companies.
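There are a few ways to do inset maps; a common ggplot2 + grid approach is sketched below (not necessarily the exact approach I used; the map region and inset coordinates are placeholders):

# A common inset-map pattern: print the main map, then print a zoomed
# copy into a grid viewport on top of it; coordinates are placeholders
library(ggplot2)
library(grid)

main_map <- ggplot(map_data("world", "South Africa"),
                   aes(x = long, y = lat, group = group)) +
  geom_polygon()

inset_map <- main_map + # zoomed-in copy serves as the inset
  coord_cartesian(xlim = c(17, 20), ylim = c(-35, -32)) +
  theme_void()

print(main_map)
vp <- viewport(x = 0.8, y = 0.8, width = 0.3, height = 0.3)
print(inset_map, vp = vp)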

The current side project I am working on is another AJ idea. It is important to know how wide/ steep a section of continental shelf may be, as this can have an effect on the local oceanography of an area. And this is a central aspect of the kelps and climate change question. So it is up to me to figure out how to calculate this in a very snazzy way that can be applied anywhere in the world. I have found a bathymetry database that can be accessed via a download function within an R script, so the first step is sorted. Now I just need to figure out how to load the .tiff file into R so that I can then calculate shore normal transects to find shelf widths and slopes for anywhere in the world, all wrapped up in one tidy function. It’s all rather exciting really, as no one has done this yet, for some reason.
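Loading the GeoTIFF should be straightforward with the raster package. A sketch, where the file name is a placeholder for whatever the download function returns:

# Sketch of reading a bathymetry GeoTIFF into R with the raster package;
# the file name is a placeholder
library(raster)

bathy_ras <- raster("bathy_download.tif")

# Convert to a lon/lat/depth data frame for use with the transect code
bathy_df <- as.data.frame(bathy_ras, xy = TRUE)
names(bathy_df) <- c("lon", "lat", "depth")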

Leaps and bounds

It has been a wild ride these past couple of weeks. After the CV PSE issue (see previous post) I spent a week digging deep into the literature to make a plan on how to move forward. I also contacted the author of the fishmethods package that contains the powertrend() function, who was of little help. I found some very good papers that changed the focus and approach of my chapter in good ways, regardless of what happens with the variance issue. Ultimately I have settled on a new set of methods that are more appropriate to the question at hand. But then it was time to get ready for my CapeR user group talk.

So the previous week I got a start on that while simultaneously trying to finish up all the new methodology, figures and corrections to my poor, limping, beat-up chapter. It is now barely a semblance of its previous self, but it will be stronger once the bandages are removed. Spending perhaps a bit too much time on the chapter, and not enough on my talk, I didn’t really finish my presentation until the car ride to the venue at UCT. And even then it was something of a compromise. One can view the CapeR slides here. Ultimately it went well and I made some rather valuable contacts.

The most relevant to this blog being a couple of statisticians, one of whom is a consultant for the stats department at UCT. Very exciting. I have summed up my issue for her and am waiting to hear back from her and/ or her colleague while she sees if anyone has previously dealt with this problem. Best case, they will get back to me by the end of the week with an answer about what the correct variance is, or at least set up a time to meet and go over the issue in more detail. Worst case, the power analysis of trend will be all wrong and I’ll need to again redo/ re-plan the majority of it. But at least I have thought of a couple of clever new figures in the meantime. They still need a bit of polish and I’ll have that wrapped up in the next couple of days. If the statisticians don’t get back to me by then, I will move on to the next chapter for now: comparing all of the different SST time series to the in situ time series.