Keto Burgers for Dinner πŸ₯šπŸ₯“


Very successful dinner! The burgers were seasoned and cooked sous vide at 130 F for three hours. I made some bacon, and used the rendered fat to cook down onions and garlic. Then, I cooked some egg yolks confit in truffle scented oil using a sous vide bath at 150 F for 1.5 hours. Finally, I seared the burgers in bacon fat, layered on the bacon and onions, followed by some smoked Gouda, and put them under the broiler. I served the burgers topped with the egg yolks.


LaCour family cookout in Atlanta


Fun visit with family at my sister's lovely home in Decatur. Hot dogs, hamburgers, s'mores, and cousins!


Thanks to W Insider Sara at The W San Francisco ❀️


What a nice welcome to the hotel! We had a fun email exchange before my arrival, and she surprised me with some treats in my room.


Going Serverless with Python WSGI Apps

1 min read

I've been writing web applications and services in Python since the late 1990s, and enjoy it so much that I created the Pecan web application framework way back in 2010. Configuring and deploying Python web applications, especially WSGI-compliant applications, is fairly straightforward, with great WSGI servers like Gunicorn and uWSGI, and excellent Apache integration via mod_wsgi. But for many use cases, creating and maintaining one or more cloud servers introduces unnecessary cost and complexity: security patches, kernel upgrades, SSL certificate management, and more can be a real burden.

Since the creation of AWS Lambda, "serverless" has become a pretty popular buzzword. Could Lambda provide a way to deploy Python WSGI applications that helps reduce cost, complexity, and management overhead?

It was a fun topic to explore, and I've published a blog post about running Python WSGI apps on Lambda!
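To give a sense of the mechanics, here is a minimal sketch of how a WSGI application can be invoked from a Lambda handler. The event keys below ("httpMethod", "path", "body") are simplified assumptions modeled loosely on the API Gateway proxy format, not the exact shape discussed in the post, and a production adapter fills in much more of the environ.

```python
import io


def simple_app(environ, start_response):
    """A minimal WSGI application, standing in for a real Pecan app."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from Lambda!"]


def lambda_handler(event, context):
    """Translate a simplified API Gateway-style event into a WSGI call."""
    environ = {
        "REQUEST_METHOD": event.get("httpMethod", "GET"),
        "PATH_INFO": event.get("path", "/"),
        "QUERY_STRING": "",
        "SERVER_NAME": "lambda",
        "SERVER_PORT": "443",
        "SERVER_PROTOCOL": "HTTP/1.1",
        "wsgi.version": (1, 0),
        "wsgi.url_scheme": "https",
        "wsgi.input": io.BytesIO((event.get("body") or "").encode()),
        "wsgi.errors": io.StringIO(),
        "wsgi.multithread": False,
        "wsgi.multiprocess": False,
        "wsgi.run_once": True,
    }
    response = {}

    def start_response(status, headers):
        # Capture the WSGI status and headers in the Lambda response shape.
        response["statusCode"] = int(status.split()[0])
        response["headers"] = dict(headers)

    body = b"".join(simple_app(environ, start_response))
    response["body"] = body.decode()
    return response
```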


Recipe: Keto-Friendly Instant Pot Thai Chicken Curry


  • 1 stalk fresh lemongrass, trimmed down, and grated
  • 4 cloves garlic, grated
  • 1 small piece of peeled ginger, grated
  • 2 tablespoons fish sauce
  • 3 tablespoons coconut aminos
  • 2 cups full-fat coconut milk
  • 6 boneless, skinless chicken thighs
  • 1 teaspoon kosher salt
  • ½ teaspoon freshly ground black pepper
  • 1 teaspoon ghee
  • 1 large onion, chopped
  • 4 tablespoons red curry paste
  • 1 kabocha squash, peeled, seeded, and cubed into large chunks
  • 1 bag riced cauliflower

Serves six.

Earlier this week, I had a strong craving for Thai red curry. Because I am currently on the Keto diet, I have to be extra careful about ordering from restaurants, which often use added sugars or carbohydrate-laden thickeners when preparing their meals. That meant preparing something on my own. I was short on time, so I wanted something that could be prepared quickly, which inspired me to use my Instant Pot electric pressure cooker. I found a good starter recipe on Nom Nom Paleo, and revised it a bit to end up with this recipe.

Set the Instant Pot to sautΓ© and add the ghee. Season the chicken thighs with kosher salt and freshly ground black pepper, then brown them thoroughly on both sides in batches. Don't overcrowd the pot, or the thighs will steam.

Remove the chicken, and add the onions to the pot, along with additional salt. Sauté until the onions are translucent, then add the coconut aminos and fish sauce, using the liquid to help scrape up the fond from the bottom of the pot. Add the garlic, ginger, and lemongrass, and cook for 1 minute. At this point, add the curry paste, and stir to coat the vegetables. Things should smell quite nice!

Add the kabocha squash and coconut milk, and stir to combine. Then, nestle the browned chicken into the liquid, seal the Instant Pot, and set to 15 minutes on high pressure. When finished cooking, quick-release the Instant Pot, remove the chicken, and chop it into large bite-sized pieces. Add the chicken (and juices) back to the pot, stirring to combine. Serve over lightly steamed cauliflower rice.



Freeing Myself from Facebook

5 min read

Ever since my discovery of the IndieWeb movement, I've wanted to free myself from Facebook (and Instagram) and their brand of surveillance capitalism. I want to own my own data, and be in control of how it is shared, and I don't want it to be used for advertising.

I've had this incarnation of a personal website for a few years, and have mostly been following the POSSE publishing model, publishing most forms of content on my website, and then automatically (or manually) syndicating that content to silos like Facebook and Twitter. But, much of my content still remains trapped inside of Facebook and Instagram.

Until now.

As of March 4, 2018, I've pulled the vast majority of my Facebook content into my website, and all of my Instagram photos into my website, paving the way for me to delete myself from Facebook (and potentially Instagram) by the end of 2018. What follows is a high-level overview of how I made the move.


Exporting Data from Facebook

While Facebook does offer an export feature, it's extremely limited: it includes only very low-resolution versions of your photos, and is generally very difficult to process programmatically. After some research, I discovered the excellent fb-export project on GitHub. Once installed, this tool will dump a huge amount (though not quite all) of your Facebook data into machine-readable JSON files.

Since my website is compatible with the Micropub publishing standard, I then needed to convert this Facebook-native JSON data into microformats2-formatted JSON. Enter granary, an amazing Swiss Army knife of IndieWeb tools by Ryan Barrett. Using granary, I whipped up a quick script that transforms the exported data into native microformats2-formatted JSON:
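The real script delegated the conversion to granary; as a simplified, hand-rolled stand-in for the shape of that transformation (the input field names are assumptions based on the Facebook export format), it might look like:

```python
def fb_post_to_mf2(post):
    """Convert one Facebook-export post dict into microformats2 JSON.

    A simplified stand-in for the conversion granary performs; the
    keys "message", "created_time", and "place" are assumptions based
    on the Facebook export format.
    """
    properties = {
        "content": [post.get("message", "")],
        "published": [post.get("created_time", "")],
    }
    if post.get("place"):
        # Keep only the human-readable place name, if present.
        properties["location"] = [post["place"].get("name", "")]
    return {"type": ["h-entry"], "properties": properties}
```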

Publishing Liberated Data

At this point, I had a directory full of data ready to publish. Sort of. Unfortunately, not all of the data was easy to translate, or even desirable to publish, on my website. As a result, I created another script that let me, on a case-by-case basis, publish a piece of content, skip it entirely, or save it to deal with later.
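The heart of that triage loop can be sketched as follows. The real script prompted interactively for each item; here a `decisions` mapping stands in for the prompt so the example is self-contained, and every name is hypothetical:

```python
def triage(entries, decisions, publish):
    """Walk exported entries and publish, skip, or defer each one.

    `decisions` maps an entry's index to "publish", "skip", or
    "later" (a stand-in for the interactive prompt the real script
    used). Entries with no decision default to "later".
    """
    deferred = []
    for index, entry in enumerate(entries):
        choice = decisions.get(index, "later")
        if choice == "publish":
            publish(entry)          # e.g. a Micropub POST in practice
        elif choice == "later":
            deferred.append(entry)  # saved to deal with later
        # "skip" silently drops the entry
    return deferred
```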

After running this script, I had a significant amount of my data copied from Facebook to my website. Huzzah!

Dealing with Photo Albums

Facebook has a "photo albums" feature, and I definitely wanted to get those memories onto my website. Again, I wrote a script that processes the exported data, selectively uploads all of the photos in an album to my website via Micropub, and then writes out microformats2 JSON that I could publish later.

Once I finished processing and uploading all of the photos for the albums I wished to copy over, I ran a simple utility script I keep around to publish all of the albums as new posts to my website.
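For illustration, an album post boils down to a microformats2 h-entry whose "photo" property lists the URLs the Micropub uploads produced. A hypothetical helper to build that JSON might look like:

```python
def album_to_mf2(name, photo_urls):
    """Build microformats2 JSON for a photo-album post.

    Hypothetical helper: "name" is the album title and "photo_urls"
    are the URLs returned by the Micropub media uploads.
    """
    return {
        "type": ["h-entry"],
        "properties": {
            "name": [name],
            "photo": list(photo_urls),
        },
    }
```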

Here are some of the results:

Notice, one of these dates all the way back to 2009!

Almost There

There are still quite a few photos and other types of posts that I haven't yet been able to figure out how to migrate. Notably, Facebook has strange special albums such as "iOS Uploads," "Mobile Uploads," and "iPhoto Uploads" that represent how the photos were uploaded, not so much a group of related photos. Unfortunately, the data contained in the export produced by fb-export isn't quite adequate to deal with these yet.

Still, I am quite pleased with my progress so far. Time to move on to Instagram!


Instagram has been slowly deteriorating as a service for years, so much so that I decided to completely stop publishing to Instagram earlier this year. It turns out, dealing with Instagram is a lot easier than Facebook when it comes to liberating your data.

Downloading My Data

After some research, I found instaLooter on GitHub, which allowed me to quickly export every single photo in its original resolution, along with nearly every bit of data I needed... except the photo captions. I ran instaLooter, and embedded each photo's unique identifier (which instaLooter refers to as the "code") in the filenames.

Getting Metadata and Publishing

I wrote a script that used granary to lookup the photo metadata and publish to my website via Micropub:
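The Micropub request itself was form-encoded; as a hedged sketch of what the script sent for each photo (the helper and its parameters are hypothetical), building such a body looks like:

```python
from urllib.parse import urlencode


def micropub_form_body(caption, photo_url, published):
    """Build a form-encoded Micropub request body for one photo.

    Hypothetical sketch: in practice this body would be POSTed, with
    an access token, to the site's Micropub endpoint.
    """
    return urlencode([
        ("h", "entry"),
        ("content", caption),
        ("photo", photo_url),
        ("published", published),
    ])
```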

Note, I used the non-JSON form of Micropub in this case, because Known's Micropub implementation doesn't properly handle JSON for photos yet.


It turns out that, with a little know-how and a lot of persistence, you can liberate much of your data from Facebook and Instagram. I feel well on track toward my goal of leaving Facebook (and maybe Instagram) entirely.