My First Book | LastPass Guide | Coming Soon

I’m writing a book! I started around July and figured it would take me between 6 and 12 months to complete. Turns out I made pretty good progress and will likely be finished in January or February. I plan to self-publish and sell it right here on b3n.org.

Book cover for my first book, LastPass Guide: A Step-by-Step Guide to Managing Your Passwords.

The book is called LastPass Guide (although I’m testing other titles). It’s a step-by-step guide that teaches people how to use the LastPass password manager. I’ve helped many people with LastPass and I know where most get tripped up. I often wished there was a guide I could point people to, so I finally decided to write one.

It is simple enough that a non-technical person could pick it up and not only become proficient in using LastPass, but also finish with a good foundation in security best practices. The book also covers security essentials, including some I’ve seen cyber-security experts overlook. I’ve had a few tech professionals review the book and tell me they’re changing their security practices as a result.

If you’re interested in progress updates, feel free to sign up for my newsletter. You’ll also get a sample download from the book.

Book Progress and What’s Left

The truth is I’ve never self-published, or published anything other than this blog, so I’m learning as I go. My to-do list is very different now than it was at the start. I’m also getting a lot of help and advice from books about self-publishing, and from family and friends. I’ve even had Eli proofreading for me.

Progress (so far):

  1. [x] Read several books about writing books
  2. [x] Decide whether to sell on Amazon or self-publish (decided to self-publish).
  3. [x] Write a first draft
  4. [x] Send a draft to my editors (family and friends) for feedback
  5. [x] Decide whether to get a new domain for the book or sell it on b3n.org (decided to sell it on b3n.org).
  6. [x] Pick a working title (“LastPass Guide”)
  7. [x] Inform LastPass’s marketing/legal team to make sure there won’t be an issue (crickets so far)
  8. [x] Design a book cover
  9. [x] Design a Coming Soon Landing Page
  10. [ ] Pick an eCommerce platform (leaning towards Gumroad or WooCommerce)
  11. [x] Review notes / advice from reviewers
  12. [ ] Second round of review / edits
  13. [ ] Run Google Ads A/B testing to test different titles (just started this yesterday).
  14. [ ] Determine Final Title
  15. [ ] Final Book Cover Design
  16. [ ] Third and “final” review / edits
  17. [ ] Photos
  18. [ ] Get testimonials (in progress)
  19. [ ] Setup eCommerce platform
  20. [ ] Build Better Landing Page
  21. [ ] Figure out how to use Facebook and Twitter to announce the launch, if I use those at all. May skip this since I’m not a huge fan of Facebook.
  22. [ ] Setup a discount and run some tests orders through to catch any issues
  23. [ ] Pre-Launch to email subscribers with Discount
  24. [ ] Remove Discount and Launch

Frequently Asked Questions

When will the book be released?

I’m targeting a release at the end of January or early February 2020.

Why didn’t you choose KeePass, Bitwarden, 1Password, [insert your favorite password manager here]?

LastPass is in a fairly unique position: it is ubiquitous, fully featured, well audited and monitored by security firms, and has reasonably priced plans and security measures that make it suitable for individuals, families, small businesses, and enterprises. Some reviewers have asked why I didn’t base the guide on KeePass. While KeePass may be more secure since it is offline, it is missing four key features most people will want: a dead man’s switch, automatic sync, easy browser integration, and sharing.

Can I get a discount?

During pre-launch there will be early release pricing for a few days before the book is released to the masses. In exchange for the discount, I ask that you watch for problems in the ordering process and let me know if there’s an issue.

Are you planning to do coupon codes or future promotions?

No. While I am trying to learn some marketing strategies, I’m very much against marketing tactics designed to pressure people into buying before they’ve had a chance to think about it. Other than the initial launch I don’t plan to run time-based promotions. I don’t ever want someone to buy the book at full price and then find out it’s on sale at half that price a day later.

Will there be an affiliate program?

Not at launch due to time constraints, but if there is interest we can do something post-launch. Probably at 50/50 revenue sharing. Shoot me an email if you’re interested.

Why Aren’t You Selling This on Amazon?

A couple of reasons:
1. I want buyers of the book to be my customers. When you sell on Amazon, buyers are not your customers. This is the main reason I chose to self-publish.
2. This book includes a lot of screenshots and graphics, and Kindles are just awful at rendering those. I’m afraid it would get poor ratings just because of the way it formats on the Kindle platform. This book works much better as a PDF, where I have control of the formatting and design. This is not to say I’m not a fan of Kindles; this just isn’t the best book for them.

Will it just be an eBook or are you going to sell a paper version?

Just an eBook. That’s the best format for three reasons:
1. Technology changes, so I’d rather be able to send out updates as needed, which you can’t do with a physical copy.
2. I’m not set up to do fulfillment. I’d have to charge something like $200 a book to make it worth the effort.
3. It’s easier to fix typos and mistakes with eBooks.

Aren’t you going to blog about some cool tech stuff soon?

Yes, several posts are in the works, including my first guest post.

Well, that’s all for now. Hopefully I’ll have a progress update in January.

Cloudways Managed WordPress Hosting

Save Time Managing WordPress

Last week I moved b3n.org from DigitalOcean to Cloudways Managed WordPress Hosting. Why? Well, there is nothing wrong with DigitalOcean; they’ve been fantastic.

But my problem is I hardly had time to maintain the technology stack. A few weeks ago I was in the process of adding a couple of WordPress sites. This isn’t difficult, but it’s tedious. You have to create user accounts, modify NGINX site files, set up SSL cert automation, configure Varnish and Redis for caching, install WordPress itself, and set all that up for security, auto-updates, caching, etc. Then a year from now I’m going to have to migrate everything to a new host when Ubuntu 16.04 goes EOL (End of Life) for security updates. As I was working on this I thought to myself… What am I doing!?

Logos of Apache, PHP, MariaDB, redis, WordPress, Memcache, Varnish, NGINX, and Let's Encrypt

Before: On DigitalOcean I spent a lot of time on research, testing, and setup, plus several hours a month maintaining the OS, technology stack, security updates, and performance tuning necessary to run WordPress.

After: Now I host WordPress on Cloudways and they take care of it for me. When I want a new WordPress instance or to make a change I push a button on a web interface. Done.

What did that time savings cost me? It cost me dearly. My monthly hosting went from $5 to $10.

Before finding Cloudways I had a bit of a journey. I started by looking into hosting options… and decided I wanted managed hosting. This is mostly because I feel like I’ve done a much better job at tuning WordPress than shared hosting providers I’ve used in the past.

Managed Hosting vs. Shared Hosting

Managed hosting typically differs from shared hosting in the service level they offer. I say typically because many managed hosting providers fall short, and many shared hosting providers excel in these areas. But in general Managed Hosting providers are better at:

  • Automated backups
  • Multiple environments (Dev/Stage/Prod) and migration between them
  • Performance tuning
  • Caching and CDN
  • Security updates
  • Guaranteed or dedicated resources (CPU, memory, I/O, bandwidth)
  • Monitoring
  • Self-healing
  • Better control of when core components get upgraded (PHP, MySQL, MariaDB, etc.). This is useful because if you want to take advantage of the latest version of PHP like 7.3 you can, but if you have a plugin that isn’t compatible you can stay on an older version.

Managed Hosting Options

I had my shortlist. SiteGround, Bluehost, WPEngine, etc. Note that I am not looking at the cheaper shared hosting, but at their managed hosting plans.

All looked like they’d be great, but what irked me is they want you to pre-pay several years in advance to get the advertised price. I am used to hourly billing with DigitalOcean. Technology changes fast, so I want flexibility. I don’t ever want to be locked into a situation where I’ve prepaid 2 years of hosting.

The other concern is the affordable plans had monthly visitor limits, bandwidth limits, or limits on the number of WordPress installs. Most were under what b3n.org needs, which would push me into the $100+ plans. Maybe my DigitalOcean droplet isn’t so bad after all!

So back to Google searching… I came across Cloudways. What’s the best way I can describe Cloudways? The DigitalOcean of WordPress.

What Separates Cloudways

What makes Cloudways unique is that when you deploy WordPress, you’re not just getting a managed WordPress application. You’re getting your own cloud server, and you can install as many WordPress instances under it as you want at no additional cost. So the hierarchy is:

  • Server
    • WordPress site 1
    • WordPress site 2
    • etc…

If you run out of capacity you can scale horizontally (deploy more servers) or vertically (more cores, memory, and SSD space).

Logos of DigitalOcean, Linode, Vultr, AWS, and Google Cloud Platform that Cloudways allows you to deploy to.

Cloudways doesn’t have their own infrastructure. Rather they partner with DigitalOcean, AWS, Google Cloud, Linode, and Vultr so you pick the underlying cloud vendor. So when you deploy a server on Cloudways you’re actually getting a managed cloud server.

Features I like from Cloudways

  • You can choose your desired cloud provider based on your needs.
  • Price is affordable ($10/month for a small DO droplet)
  • Per hour billing (no pre-paying years in advance).
  • Unlimited sites and WP instances; you can scale up as needed.
  • Choose any location you want
  • Staging Environments!
  • WordPress migration (mine migrated over flawlessly) from your old server
  • 24/7 Support … now when my server has trouble I don’t have to call myself.
  • Linux, Apache, NGINX, SSL cert automation, Varnish, Redis, security updates, and all of that stuff I used to maintain myself is now taken care of for me! :-)
  • Monitoring and Auto-healing can correct problems proactively.
  • There are a lot of checks for best practices and server health. I temporarily disabled the Breeze cache plugin and got an email the next day telling me it was still disabled. Similarly there are checks for load and performance.
  • You can choose which version of PHP and MariaDB to run on.
  • And now when Ubuntu 16.04 LTS goes EOL…. I don’t care!
  • WordPress Instances come pre-optimized (have Breeze caching plugin installed, Memcached, etc.).
  • It’s not limited to WordPress so Drupal and other PHP applications are supported as well.

Where Cloudways Could do Better

  • I’m a bit unclear on what happens when the server I deployed goes EOL for security updates. I can’t imagine they would upgrade it autonomously since that would be risky. I’m guessing it would fail a health check and I’d get a notification to upgrade? It’s something I’ll have to keep an eye on, but it could be made clearer. If the solution is to deploy a new server and move your WordPress instances over, that can be done with a few clicks from the web interface.
  • The Cloudways interface is not snappy. It can take a few seconds to bring up monitoring metrics.
  • Where are floating IPs?! With DigitalOcean I can get a floating IP that I can assign to one droplet and then reassign it to another droplet. With Cloudways it looks like moving to another server would require DNS changes.

Conclusion

In the chart below I have:

  • IaaS (Infrastructure as a Service)
  • PaaS (Platform as a Service)
  • SaaS (Software as a Service)

Cloudways would fall under PaaS. They manage everything that WordPress runs on (PHP, MariaDB, Varnish, Apache, NGINX, etc.). They do step into the SaaS world a bit, since they will automatically deploy optimized WordPress instances for you with things like caching pre-configured, but for the most part you’re still managing WordPress yourself.

Chart showing IaaS, PaaS, and SaaS.  Cloudways falls under PaaS

All in all, Cloudways Managed Cloud Hosting seems to be a decent offering. One side benefit is they’re just better at performance tuning than I am. On DigitalOcean, where I was maintaining the platform myself, b3n.org was able to handle a sustained load test of 150 clients per second; on Cloudways it handles over 1000 clients per second.

My First 3D Printer! Ender 3 Pro

Eli Assembling Ender 3 Pro

I’m not sure exactly how it started. It might have been when Eli and I were trying in vain to find Lego Technic sets with lots of gears, or when Kris and I were discussing purchasing learning aids for school… and I started to realize we could 3D print this stuff!

Just with the things we buy for school each year, a 3D printer will pay for itself in 2 years.

What is 3D Printing?

3D printing is also known as Additive Manufacturing (AM). Instead of injection molding, items are created by printing layer on top of layer. Injection molding is fine for mass production, but for small quantities it doesn’t make sense because molds aren’t cheap to make. 3D printing can use a variety of methods and materials. I use PLA (polylactic acid): the plastic is fed to the printer and heated to the point of melting, then comes out a nozzle where it cools and solidifies. The nozzle is controlled by X, Y, and Z axis stepper or servo motors, allowing it to be positioned anywhere in the print area.

Octopus with articulating legs… the 3D printer can print the leg segments in place interlocked. I don’t think this is possible in traditional manufacturing.

Of course, I know very little about 3D printing so I turned to my coworker, Brad, who has designed and printed out prototype aircraft components and has actually flown them. I asked him for the best quality budget 3D printer. He has a few of the larger fancier Creality printers and told me the next one he would likely buy for himself for smaller prints was a little Creality Ender 3 Pro. One thing I’ve learned: if the expert is willing to buy something for himself, that’s what you want.

The Ender 3 Pro comes with all the tools needed for someone new to 3D printing: Allen keys, wrench, screwdriver, pliers, SD card and USB adapter, nozzle cleaning needle, blade, etc. The Pro version adds a few features that I think make it worth the extra cost over the normal Ender 3: it is a bit more sturdy, has a better (quieter) PSU, can resume printing after a power failure, and has a magnetic flexible print bed which eliminates the need for glue or hairspray to get prints to stick. The 3D prints adhere very well during printing and peel right off when done. I hardly ever need to print with rafts or support structures. I don’t even print a brim.

It arrived at noon on Friday. Eli couldn’t wait, so he and Kris mostly had it together before I got home. We finished the assembly. I didn’t level the bed or anything; I popped in the SD card that came with the machine, selected the cat model that was already on the card, and it started printing, and printing, and printing…. okay, so it took a long time. So we all went to bed.

Next morning I woke up to hearing, “It finished!” We had a cat! Which Eli promptly painted. …here’s our first print:

Not bad for a first try.

For our second print we decided to print something simple like the Eiffel tower. I found one on ThingiVerse and opened it up in the Creality Slicer (a slicer is a program that converts 3D models into a gcode file that the printer understands) that came with the printer. It took me 3 tries.

This was my last print using the Creality Slicer. I had to go crazy on the rafts and support structure, but this isn’t needed with the default Ender 3 profiles that come with Cura.

Our first attempt ran for an hour or two, then one of the 4 legs fell over. I tried again with a raft, but it still fell! Then I made huge rafts and a support structure and it worked! But the print came out stringy. I was using Creality Slicer since it came with the printer. Then I remembered Brad told me to try Cura. So I downloaded that… and it was a night and day difference (even though Creality Slicer is based on Cura). I told Cura what printer I had and it came pre-loaded with sane defaults for everything from print speeds to retraction. Now that I’m printing with Cura, I need no support structures and get no stringing. I’m guessing most of the difference was in the default profiles.

Print Workflow

Business Card Holder from ThingiVerse

What does a 3D print workflow look like?

1. Go to Thingiverse and search for a 3D object. Thingiverse is a huge library of 1,500,000 3D-printable models. I’ve found everything from Craftsman Versatrack-compatible bike hangers to spare parts for my car. Download the STL file (essentially a CAD file).

2. Open the STL file with Cura (free and open source), which is a slicer that converts the object into instructions the 3D printer can understand. Cura has well-tuned default profiles for the Ender 3. The instructions are output as a .gcode file. I popped one open and it is a text file with line-by-line instructions for the printer: go to these X, Y, Z coordinates at these speeds at this temperature, and so on (see the peek at a gcode file after this list). Essentially you copy this to an SD card, insert it into the printer, and print the object.

3. The printer will pre-heat the bed and PLA, then start printing. I would say we have a success 9 out of 10 times. Sometimes I won’t have the bed quite level or the temperature won’t be right for the specific PLA brand/color I’m using (even different colors print at different temperatures). But you can save those color profiles in Cura so once you have it dialed in it should work going forward. Generally if the first layer succeeds the print will be a success.

4. When done, let it cool for 30 seconds, bend the magnetic bed and the print peels off.
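
Out of curiosity, here’s the kind of thing you’ll see near the top of a sliced file (model.gcode is a hypothetical file name; the commands are standard Marlin-style gcode):

```
head -n 5 model.gcode
# Typical output:
#   M140 S60              ; heat the bed to 60C
#   M104 S200             ; heat the nozzle to 200C
#   G28                   ; home all axes
#   G1 Z0.2 F3000         ; move the nozzle to first-layer height
#   G1 X50 Y50 E5 F1500   ; move to X50,Y50 while extruding 5mm of filament
```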

Can you Design and Print Your Own 3D Models? Yes!

Gears Eli designed in Tinkercad

Just about everyone has asked me this question. You can.

Tinkercad (by Autodesk) is a free web CAD designer that makes it simple to design 3D objects. The very first thing Eli designed in Tinkercad was a set of gears.

Stepper Motor Noise

Okay, so one issue I had with the Ender 3 Pro is the noise the stepper motors make. The best way I can explain it is the printer sounds like R2D2 and C3PO are arm wrestling, and you hear it throughout the entire house. I ended up swapping out the control board for one with silent stepper drivers. Once I did that, the only noise you hear is the fans. Much better. We have it near the kitchen and I’d say it isn’t silent. The fan is noisier than a typical computer fan but not nearly as loud as the dishwasher.

Motherboard

Infill Patterns

In Cura, you can choose from a number of infill patterns. Each has its advantages: some are designed to be stronger, some print faster, some save on material. One of the huge advantages of 3D printing is you can pick a pattern and density to provide the strength needed for a particular use. This greatly reduces the amount of plastic needed to fill in a part. Here are the infill patterns in Cura:

Infill Patterns in Cura

Left to right the infill patterns are:

  • Gyroid
  • Cross 3D
  • Cross
  • Zig Zag
  • Concentric
  • Quarter Cubic
  • Octet
  • Cubic-Subdivision
  • Cubic
  • Tri-Hexagon
  • Triangles
  • Lines
  • Grid

Infill Patterns Test
Cubic-Subdivision Infill Pattern

I usually use Lines for quick prints. If I have a larger shape that needs to bear stress in multiple directions I’ll use either Gyroid (a 3D pattern found in creation) or Cubic-Subdivision, which uses more density around the perimeter and less in the middle (like bones).

Getting Started in 3D Printing

Here’s what I bought to get started.

  1. Creality Ender 3 Pro 3D Printer. 3D printer along with an essential set of tools.
  2. Silent Stepper Drivers Motherboard Upgrade
  3. Gizmo Dorks PLA Filament 1.75mm, 200g 4 Color Pack. I wanted to try a few different colors. These were easy to start with.
  4. Hatchbox PLA 1 kg spools in various colors.

One thing I’d say about 3D printing is that at my budget it is not quite there when it comes to ease of use. There was nothing Kris, Eli, and I couldn’t figure out and get working, but it took us a bit of time to get the bed leveled and the temperature settings dialed in. If you want something closer to “hit print and it just works” then you may want to pay a little more and get a Prusa Mini. It has auto bed leveling and a network interface, which makes it much more user-friendly. But you will pay quite a bit more for those features.

The Future

3D printing is the future. In the home it is going to replace the need to run to the store to get something small, and allow for 3D printing small parts to repair items instead of tossing them. Just like printers moved from businesses to homes, so will the ability to manufacture small plastic items. 3D Printing isn’t instant, but it’s already faster than Amazon Prime. And if it saves me from having to make a trip to Spokane to find some part it’s worth it.

For manufacturing, it greatly reduces tooling costs. Injection molding will still be used for items that are mass-produced, but 3D printing lowers the tooling costs for smaller runs and one-off items. Not to mention the agility: a factory of general-purpose 3D printers can start printing something else at a moment’s notice to meet new demands and market changes.

Transformer Pumpkin parts. I am amazed the printer can do those overhangs.

Things We’ve Printed (So far)…

  • Cat
  • Eiffel Tower
  • Pumpkin
  • Octopi to hand out as prizes
  • Pumpkin Transformer
  • UniFi USG and Switch mini racks
  • 3D Topography Maps of the 7 Summits
  • Craftsman Versatrack compatible bike hook
  • Drawers to store tools for the Ender 3
  • Impossible 3D shapes
  • Jig for drilling axles in a Pinewood Derby car
  • 3D Luther Roses for Reformation Day prizes
  • Business Card Holder
  • Gears
  • Benchy Boat
  • Lego compatible bricks
  • Carabiner

Mount St. Helens

Mount St. Helens

Next May will be the 40th anniversary of the Mount St. Helens eruption, which occurred on May 18th, 1980. At the time geologists knew very little about volcanoes or the possibility of a lateral blast… it killed 57 people, most in areas outside the restricted zone. It is the most devastating eruption to occur in U.S. history. There have been many volcanic eruptions, but this one happened in an area near modern western civilization, so it was well studied and documented.

We got to meet Paul Taylor over at the Mount St. Helens Creation Center and he took us on several excursions.

If I wrote about everything we did there this post would be too long, so here are the highlights.

East Side Excursion

Tree killed by heat from the volcano blast

The green trees have all grown since the eruption. Following the landslide, the pyroclastic flow blasted out at speeds up to 670 mph, knocking down 230 square miles of trees. But the tree above was far enough away that it wasn’t knocked down. It was just killed by the heat.

Tree uprooted with main root ball intact

All over the place… and I mean everywhere… we could see trees knocked over by the blast, uprooted with their root balls torn out. We saw miles and miles of devastation like this. One photo doesn’t do it justice.

Uprooted trees from Mt. St. Helens
More uprooted trees, notice the fallen trees in the background

In the picture below, looking from a distance I thought there was ice on the right side of the lake. But through my binoculars it’s not ice! Those are logs! We took a hike to take a closer look…

Spirit Lake
Hike to Spirit Lake
Kris and Eli going down to Spirit Lake

Hiking down to Spirit Lake

Boy under a fallen tree
Eli under a fallen tree over the path

Family in front of floating logs on Spirit Lake
Ben (me), Kris, and Eli in front of the floating logs on Spirit Lake

The landslide off Mount St. Helens rushed into Spirit Lake at 110+ mph, sloshing the water out of the lake and onto a hillside with thousands of trees… trees which had just been sheared or uprooted, many with root balls intact, by the initial blast of superheated volcanic ash and gas seconds earlier. The landslide raised the lake level by 200 feet; then the water returned to the lake, taking an avalanche of logs with it. The logs I photographed have been floating on the lake for nearly 40 years.

Floating Logs on Spirit Lake

Sonar and divers confirm that many of the logs have sunk and sit in various positions at the bottom of Spirit Lake. Since the logs sank at different times, they are buried in different layers of sediment… and they’re spread out all over the lake as if they were a forest. All of this from one event nearly 40 years ago. You’ll notice that this looks very similar to the Yellowstone Petrified Forest, where trees are found in different layers of sediment, often with the root ball attached but no roots.

Trees shown in different layers at Mt. St. Helens Spirit Lake
Trees sunk to the bottom and sit vertically. As more sediment accumulates the trees could be mistaken for growing in place at different time periods for different layers, but we know none of the trees grew here. Graphic Credit: Theresa Valentine / US Forest Service

What’s happening in Spirit Lake doesn’t fit the evolutionary narrative of millions of years. This Yellowstone Park article https://www.yellowstonepark.com/things-to-do/yellowstones-petrified-forest still claims that trees found in multiple layers indicate forests from different time periods spanning tens of thousands of years… and of course that it all happened 50 million years ago. The evidence doesn’t bear this out. We know the petrified forest doesn’t need 50 million years to form. We know it can happen very quickly… because we are seeing how it happens today!

Mount St. Helens Creation Center

If you’re in the Mount St. Helens area, you should stop by the Mount St. Helens Creation Center in Castle Rock, WA. There you may find biblical creationist Paul Taylor, who is more than happy to answer questions about a variety of topics. There’s a video presentation area where talks are given, comfy chairs and a couch to sit in, complimentary coffee, a few book displays featuring his books as well as those of other creationists, some free brochures, and a number of exhibits and displays. We were only there an hour in total but could have easily spent half the day. The center was quite busy before Paul had to shut it down for the excursions.

Benjamin Bryan and Paul Taylor at the Mount St. Helens Creation Center
Paul Taylor and Ben
Eli and Solar System
Eli Found a Solar System display…
Visitors at the Mount St. Helens Creation Center
Visitors at the Mt. St. Helens Creation Center
Seating Area at the Mount St. Helens Creation Center
Sitting area
Noah's Ark Model at the Mount St. Helens Creation Center
Model Ark
Keynote Speech by Paul Taylor for International Creation Day 2018

Lava Tubes

Between excursion days we explored Ape Cave and the lava tubes.

Lava Tubes
Eli Descending a Lava Tube
Lava Tube
Why is it taking you so long, dad?
Lava Tube
Oh good, he made it.
Lava Tube Map
Ape Cave

West Side Excursion

This is the most popular excursion.

Near Mount St. Helens

This area is a national monument, so there is no reforesting or replanting by humans; everything you see is completely natural, which shows how quickly plants recover. A lot of mud flowed through here.

Castle Lake
Castle Lake

Castle Lake didn’t exist before 1980… it was created by the eruption.

Mud from Mt. St. Helens cresting the Johnston Ridge

The landslide was traveling so fast and so far that it crested this ridge (the smaller hills are from the landslide). In this photo we are walking on Johnston Ridge near the observatory, about 5 miles from the volcano.

Lupine Flowers

Lupine is important to the area’s recovery. This flower helps other plants grow in volcanic areas by fixing nitrogen from the air and then sharing it with other plants through its roots.

Canyons with Rock Layers
Canyons with Layers much like the Grand Canyon

For as long as I can remember, secular scientists have claimed that layers of rock such as you see in the Grand Canyon must have formed over millions of years. This is not observational science, but an assumption made to fit the evolutionary narrative. From Johnston Ridge with a pair of binoculars I can see the layers. If nobody had been there to observe Mount St. Helens, evolutionary scientists today would say the rock layers here took millions of years to form, just like they say about the Grand Canyon. But at Mount St. Helens we were there to observe it, so we know how old these layers are. Twenty-five feet of stratified material was laid down by the volcano in a very short time. How long did it take to form these 200 (conservatively) layers? 3 hours.

Even today, secular scientists will still defend a millions-of-years timeline for the Grand Canyon, e.g. https://geology.com/articles/age-of-the-grand-canyon.shtml. When I was at the Johnston Ridge Observatory I didn’t see any displays or comments discussing the rapid formation of these layers. It’s one of the most fascinating features here, but the only display you’ll find on it is at the Creation Center.

Graphic showing Mount St. Helens rock layers compared to Grand Canyon

Paul Taylor Conference at Kootenai Church May 2020

Paul Taylor is coming to speak at Kootenai Community Church in May of 2020 (which coincides with the 40th anniversary of the eruption). If you’re up in North Idaho it will be well worth your time to attend. I expect registration for the conference will open sometime in 2020, so watch the main website for registration if you’re interested (you can also leave a comment saying you’re interested and I’ll email you when registration opens).

If you’re interested in the Excursions contact the Mount St. Helens Creation Center to book one: https://mshcreationcenter.org/visit/excursions/

Final Thoughts

One person on the excursion with us had been there right before the eruption and had taken pictures just months earlier. It was also neat to talk to some of the locals who witnessed the event and hear how it personally impacted them. I even talked to some people in North Idaho who remember a strange cloud interrupting a sunny day and covering the area with ash. It’s fascinating to listen to all their stories.

This was a great trip, fun for our family and a good learning experience. It was also enjoyable because I didn’t check work email the entire time (I did take my laptop just in case, and my coworkers knew they could text or call for anything critical). They didn’t have to contact me once (thanks to everyone who worked hard to make that happen).

During the excursion Paul reminded us that 2 Peter 3 tells us in the last days that scoffers will deny two events: Creation and the Flood.

For they deliberately overlook this fact,
that the heavens existed long ago,
and the earth was formed out of water
and through water by the word of God,
and that by means of these the world that then existed
was deluged with water and perished.

2 Peter 3:5-6

You’ll notice from 2 Peter 3 that overlooking Creation and the Flood isn’t a result of ignorance. It is deliberate. It is not that there isn’t compelling evidence. Evidence is staring the evolutionary scientist in the face. Observational science won’t change them because they already know the truth and suppress it. Rather, it is an issue of lack of belief in Jesus Christ. So while the evidence found at Mount St. Helens has value and confirms the position of biblical creationists, evidence is not what we base our beliefs on. Rather, we base our beliefs on God’s Word. Evidence is not the means that will transform an unbeliever into a believer. Rather God has chosen to use the power of the gospel for that task.

Jupiter – First Attempt at Stargazing with a Telescope

On clear nights when Eli was a toddler I would often take him outside to look at the stars. I told him the names of a few stars; he asked me to tell him the names of all the stars. I found out quickly that young eyes are better for stargazing: he could see a lot more of them than I could. This evening we got a chance to look at Jupiter and two of its moons using a telescope!

Right now it’s close enough you can see Jupiter’s moons with a good pair of binoculars.

Jupiter and two moons
Jupiter and 2 moons

Finding Jupiter was the easiest part. Early evening it was right where it should be.

Jupiter

The most difficult part was pointing the telescope at it. We borrowed a telescope (thanks Sean!) and after lots of random fiddling with the various undocumented knobs I finally figured out which ones did X, another X, Y, Z, Y again, another Y, a yawing Y, and some sort of arc, and got it pointed towards Jupiter!

The earth of course is rotating, so I had to re-align the telescope every minute or so. I think it would be a great idea for someone to make a telescope with built-in gyroscopes so it stays pointed in a particular direction.

Setting up a Telescope
There’s a lot of knobs on this telescope. Just tell me when you see something in the scope!
Looking through a telescope
Kris looking through the scope while Eli looks through the spotter scope.
Jupiter and a moon using Pixel Night Mode
Pixel Night Mode

Eli looking through telescope
Trying to move away from the light pollution.
Jupiter and two moons
Another shot of Jupiter and two moons.
Jupiter and moon positions

I’m guessing the moons we saw were Europa and Callisto since Io and Ganymede would have been transiting Jupiter at the time we were looking.

Jupiter and 2 moons

The heavens declare the glory of God,
and the expanse proclaims his handiwork.


Psalm 19:1-6 ESV

I switched to Duplicati for Windows Backups and Restic for Linux Servers

So long, CrashPlan! After I used it for 5 years, CrashPlan decided, with less than a day’s notice, to delete many of the files I had backed up. Once again, the deal got altered. Deleting files with no advance notice is something I might expect from a totalitarian leader, but it isn’t acceptable for a backup service.

Darth Vader altering the deal
I am altering the deal. Pray I don’t alter it any further.

CrashPlan used to be the best offering for backups by far, but those days are gone. I needed to find something else. To start with I noted my requirements for a backup solution:

  1. Fully Automated. I am not going to remember to do something like take a backup on a regular basis. Between the demands from all aspects of life I already have trouble doing the thousands of things I should already be doing and I don’t need another thing to remember.
  2. Should alert me on failure. If my backups start failing, I want to know. I don’t want to have to check on the status periodically.
  3. Efficient with bandwidth, time, and price.
  4. Protect against my backup threat model (below).
  5. Not Unlimited. I’m tired of “unlimited” backup providers like CrashPlan not being able to handle unlimited and going out of business or altering the deal. I either want to provide my own hardware or pay by the GB.

Backup Strategy

Relayed Backups

This also gave me a good opportunity to review my backup strategy. I had been using a strategy where all local and cloud devices backed up to a NAS on my network, and those backups were then relayed to a remote backup service (formerly CrashPlan). The other model is a direct backup. I like this a little better because, living in North Idaho, I don’t have a good upload speed; several times my remote backups from the NAS would never complete because I didn’t have enough bandwidth to keep up.

Now if Ting could get permission to run fiber under the railroad tracks and to my house I’d have gigabit upload speed, but until then the less I have to upload from home the better.

Direct Backups

Backup Threat Model

It’s best practice to think through all the threats you are protecting against. If you don’t do this exercise you may not think about something important… like keeping your only backup in the same location as your computer. My backup threat model (these are the threats which my backups should protect against):

  1. Disasters. If a fire sweeps through North Idaho burning every building but I somehow survive, I want my data. So I must have at least one backup in a geographically separate area from me. We can also assume that all keys and hardware tokens will be lost in a disaster, so those must not be required to restore.
  2. Malware or ransomware. Must have an unavailable or offline backup.
  3. Physical theft or data leaks. Backups must be encrypted.
  4. Silent Data Corruption. Data integrity must be verified regularly and protected against bitrot.
  5. Time. I do not ever want to lose more than a day’s worth of work, so backups must run daily and must not consume too much of my time to maintain.
  6. Fast and easy targeted restores. I may need to recover an individual file I have accidentally deleted.
  7. Accidental Corruption. I may corrupt a file or accidentally overwrite it and not realize it until a week later, or even a year later. Therefore I need versioned backups so I can restore a file from points in time going back several years.
  8. Complexity. If something were to happen to me, the workstation backups must be simple enough that Kris would be able to get to them. It’s okay if she has to call one of my tech friends for help, but it should be simple enough that they could figure it out.
  9. Non-payment of backup services. Backups must persist on their own in the event that I am unaware of failed payments or unable to pay for backups. If I’m traveling and my credit card gets compromised, I don’t want to lose my backups.
  10. Bad backup software. The last thing you need is your backup software corrupting all your data because of some bug (I have seen this happen with rsync), so it should be stable. Looking at the git history I should see minor fixes and infrequent releases instead of major rewrites and data corruption bug fixes.

Raspberry Pi and 4TB drive on wooden shelf
Raspberry Pi 4TB WD Backup

My friend Meredith had contacted me about swapping backup storage. We’re geographically separated, so that covers local disasters. So that’s what we did: each of us set up an SSH/SFTP server for the other to back up to. I had plenty of space in my Proxmox environment, so I created a VM for him and put it in an isolated DMZ. He had a Raspberry Pi and bought a new 4TB Western Digital external USB drive that he set up at his house for me.

Duplicati Backup Solution for Workstations

For Windows desktops I chose Duplicati 2. It also works on Mac and Linux, but for my purposes I only evaluated Windows.

Duplicati screenshot of main page

Duplicati has a nice local web interface. It’s simple and easy to use. Adding a new backup job is simple and gives plenty of options for my backup sets and destinations (this allows me to back up not only to a remote SFTP server, but also to any cloud service such as Backblaze B2 or Amazon S3).

Animation of setting up a duplicati backup job

Duplicati 2 has status icons in the system tray that quickly indicate any issues. The first few runs I was seeing a red icon indicating the backup had an error. Looking at the log it was because I had left programs open locking files it was trying to back up. I like that it warns about this instead of silently not backing up files.

Green play icon
Grey paused icon
Black idle icon
Red error icon

Green=In Progress, Grey=Paused, Black=Idle, Red=Error on the last backup.

Duplicati 2 seems to work well. I have tested restores and they come back pretty quickly. I can back up to my NAS as well as to a remote server and a cloud server.

There are two things I don’t care for about Duplicati 2:

  1. It is still labeled Beta. That said it is a lot more stable than some GA software I’ve used.
  2. There are too many projects with similar names. Duplicati, Duplicity, Duplicacy. It’s hard to keep them straight.

Other considerations for workstation backups:

  • rsync – no GUI
  • restic – no GUI
  • Borg Backup – Windows not officially supported
  • Duplicacy – license only allows personal use

Restic Backup for Linux Servers

I settled on Restic for Linux servers. I have used Restic on several small projects over the years and it is a solid backup program. Once the environment variables are set, it’s one command to back up or restore, which can be run from cron.

Screenshot of restic animation

It’s also easy to mount any point-in-time snapshot as a read-only filesystem.
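
As a rough illustration, here’s the shape of it (the repository host and paths are placeholders, not my actual setup):

```
# Point restic at a repository once via environment variables
# (hypothetical SFTP host/path -- e.g. the server at Meredith's).
export RESTIC_REPOSITORY="sftp:backup@example.org:/backups/$(hostname)"
export RESTIC_PASSWORD_FILE="/root/.restic-password"

restic backup /etc /home /var/www             # one command to back up
restic snapshots                              # list point-in-time snapshots
restic restore latest --target /tmp/restore   # one command to restore

# Mount all snapshots as a browsable read-only filesystem:
mkdir -p /mnt/restic && restic mount /mnt/restic
```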

Borg Backup came in pretty close to Restic; the main reason I chose Restic is its support for backends other than SFTP. The cheapest storage these days is object storage such as Backblaze B2 and Wasabi. If Meredith’s server goes down, with Borg Backup I’d have to redo my backup strategy entirely. With Restic I have the option to quickly add a new cloud backup target.

Looking at my threat model there are two potential issues with Restic:

  1. A compromised server would have access to delete its own backups. This can be mitigated by storing the backup on a VM that is backed by storage configured with periodic immutable ZFS snapshots (see the sketch after this list).
  2. Because restic uses a push model instead of a pull model, a compromised server would also have access to other servers’ backups, increasing the risk of data exfiltration. At the cost of some deduplication benefits, this can be mitigated by setting up one backup repository per host, or at the very least by creating separate repos for groups of hosts (e.g., one restic repo for Minecraft servers and a separate one for web servers).
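
For the first mitigation, here’s a minimal sketch of what I mean (the dataset name tank/backups is hypothetical; this would run from cron on the storage host, not on the backed-up server):

```
# Snapshots are read-only, so a compromised client that deletes or
# encrypts the live restic repo still can't touch them.
zfs snapshot tank/backups@daily-$(date +%Y-%m-%d)

# Prune: keep only the 30 most recent snapshots of this dataset.
zfs list -H -t snapshot -o name -s creation tank/backups \
  | head -n -30 \
  | xargs -r -n 1 zfs destroy
```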

Automating Restic Deployment

Obviously it would be ridiculous to configure 50 servers by hand. To automate this I used two Ansible Galaxy roles. I created https://galaxy.ansible.com/ahnooie/generate_ssh_keys which automatically generates SSH keys and copies the key IDs to the restic backup target. The second role, https://galaxy.ansible.com/paulfantom/restic, automatically installs and configures a restic job on each server to run from cron.

Utilizing the above roles, here is the shape of the Ansible playbook I used to configure restic backups across all my servers; it sets each server to back up once a day at a random time. The sketch below is illustrative (role variables for the backup target, schedule, and paths are configured per each role’s documentation):
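
```
# Install the two roles from Ansible Galaxy:
ansible-galaxy install ahnooie.generate_ssh_keys paulfantom.restic

# restic-backups.yml -- a skeletal playbook applying both roles.
# Role variables are omitted here; see each role's README for the
# exact variable names.
cat > restic-backups.yml <<'EOF'
- hosts: all
  become: true
  roles:
    - ahnooie.generate_ssh_keys   # generate SSH keys, copy IDs to the backup target
    - paulfantom.restic           # install restic and set up the cron job
EOF

ansible-playbook -i inventory restic-backups.yml
```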

Manual Steps

I’ve minimized manual steps but some still must be performed:

  1. Backup to cold storage. This means archiving everything to an external hard drive and then leaving it offline. I do this manually once a year on World Backup Day and also after major events (e.g. doing taxes, taking awesome photos, etc.). This is my safety net in case online backups get destroyed.
  2. Test restores. I do this once a year on World Backup Day.
  3. Verify backups are running. I have a reminder set to do this once a quarter. With Duplicati I can check in the web UI, and a single Restic command gets me a list of hosts with the most recent backup date for each (see below).
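
For the quarterly Restic check, the command is along these lines (the flag name varies by restic version: --latest 1 on current releases, --last on older ones):

```
# Show only the most recent snapshot for each host and path:
restic snapshots --latest 1
```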

Cast your bread upon the waters,
for you will find it after many days.
Give a portion to seven, or even to eight,
for you know not what disaster may happen on earth.

Solomon
Ecclesiastes 11:1-2 ESV

Dr. Jason Lisle Conference

When Eli was 4, Kris and I needed to keep him occupied while we packed our house up ahead of a move… so I found a video by Answers in Genesis and put it on. Eli discovered an interest in the solar system. He liked it.

A lot.

For the next several years he studied the planets.

Made drawings of planets

Created the asteroid belt…

Made the solar system out of balls…

Made the solar system out of dinosaurs…

Represented it with Legos…

Another Lego Solar System…

Drew Orbits.

His interest in astronomy has not waned, so we were excited when we found out Dr. Jason Lisle was coming to our church for a conference. It was held last Friday and Saturday and was one of the best conferences I’ve attended.

Dr. Lisle is a Christian astrophysicist. He has held positions at Answers in Genesis, the Institute for Creation Research (ICR), and now the Biblical Science Institute. One of my favorite papers by him is about the Anisotropic Synchrony Convention. I’ve also enjoyed his debate with Hugh Ross and read a number of Jason Lisle’s books over the years.

His sessions covered a foundation on Genesis, Astronomy, Science, Fractals, and a few Q&A sessions (he answered Kris’s questions about the multiverse and Kuiper Belt). The conference was recorded so if you’re interested you can watch it below:

Eli’s second love is math, so his favorite part was the session on how God thinks about numbers, which spent a good deal of time on fractals on Day 2.

Earlier this year, while attending a home school curriculum conference, I was reminded that there is no subject that can be taught from a neutral perspective. You are either for God or against God. Even math. You can teach it from a secular perspective, which ignores God and either avoids the question of where math comes from or tries to come up with some explanation for why math exists and even works. Or you can teach it from the worldview that all things, including abstract things like math, are created by God and therefore have beauty and reflect His nature. By studying math we are discovering what God thinks about numbers, and we would therefore expect to find beauty in numbers. There is no neutral position. Dr. Lisle’s presentation made this more apparent.


“I was merely thinking God’s thoughts after him.” – Johannes Kepler

How to Get Longer Life Out of Your Dell Laptop Battery

In 2015 I bought Dell Latitude E5450 laptops for myself and Kris. One year later her battery was fine; mine, however, lasted 60 seconds on a full charge. I attribute this to Kris often using her computer on battery rather than having it plugged in all the time, while mine was always in the docking station, constantly charging.

60 seconds of run-time

I lived with a bad battery for 3 years… 60 seconds is enough to run from one outlet to the next without having to power down… which is really all I need. Although I’ll admit 120 seconds would be nice!

Battery Swelling Issue

A couple weeks ago I noticed a crack near my touchpad… and a bulge. My laptop was growing! Or rather, the battery was expanding! The battery pack is about 175% the height of what it should be!

That Dell battery pack on the left is a little swollen….

I quickly waited a few months, then decided that even though the battery was still giving me my 60 seconds, this could be a safety or fire risk, and my laptop might break if it swelled much more. So out of prudence I bought a new Dell G5M10 battery. After installing it I went into the BIOS and noticed settings to change how Dell manages the battery! You can opt for faster charging, more run-time, or more longevity.

Here are the battery life settings.

Charge Time, Run Time, or Lifespan. Pick any 1, sometimes 2.

  • ExpressCharge – Faster charging. This was the default! The problem is that the faster you charge a battery, the sooner it wears out. This makes sense for people on the road who don’t have a lot of time to recharge, but it doesn’t make sense if you’re almost always on AC power like me. This setting probably has a high charge stop up to maximum capacity (100%?) and a high charge start (95%?) so that it’s always ready. I’m not an expert in batteries, but I believe batteries naturally lose power over time, so each time it drops 5% of its charge it charges back up to 100%… those constant charge cycles cause a lot of wear, not to mention the battery is being held at full charge, which causes it to degrade faster. This setting gives you the best performance, but you’re pushing the limits.
  • Standard – The same as ExpressCharge as far as I can tell, but with a slightly slower charge. Other than that it’s still going to wear the battery out fast.
  • Primary AC User – Designed to extend the battery lifespan for laptops that are usually plugged in. I assume this does a few things: it probably slows down the charge rate, sets the charge stop to a lower value like 70%, and sets the charge start to around 50% (I’m completely guessing at these numbers). This reduces the number of charge cycles needed to maintain the battery and generally charges the battery to levels suitable for long-term storage instead of maximum performance, giving you the best lifespan at the cost of run-time. If you want longevity at the cost of run-time, this is the setting you want.
  • Adaptive – This is what the default should be! It’s a trade-off between the two: it optimizes battery settings based on how you typically use the computer. If you’re running on AC power all the time it will act more like Primary AC User, but if it sees you using the battery a lot it will start behaving like ExpressCharge.
  • Custom – You can also set custom charge start/stop values.

Dell BIOS Settings for Battery Maintenance

Optimizing Battery for both performance and longevity depending on the time of day

If you have a fixed schedule, you can tell your Dell laptop what times of day you need more run-time; outside of those hours it will maximize longevity. This drives your battery hard only when you might need the run-time and goes easy on it the rest of the time.

Dell BIOS Settings for Battery

Well, I’ll be changing my BIOS setting to Primary AC User.

And with my brand new battery I’m liking the new 4-hour run-time. Nowadays I walk from outlet to outlet instead of running.

How to Get Longer Life Out of your ThinkPad Battery

If you use a ThinkPad read this KB on How to Increase Your Battery Life by changing the Battery Maintenance settings.

Running Chrony NTP under a VMware Guest

Here’s a quick guide to running an NTP (Network Time Protocol) server using Chrony with an (optional) GPS receiver on a VMware ESXi guest running Ubuntu 18.04.  I should note this is experimental and something I set up in my homelab temporarily.  For production environments I would run NTP on physical hardware, not VMware.

Create and Configure VM

Be sure to disable guest tools time synchronization by editing the VM settings and unchecking Synchronize guest time with host.

Disable VM Tools Time Synchronization

Set the CPU shares to High… we want the NTP server to have priority if there is processor contention.

High CPU Shares

Install Chrony

I diversified between Ubuntu’s, NTP.org’s and NIST’s time server pools.
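
A minimal sketch of that part of the setup (the pool hostnames below are representative examples, not necessarily the exact ones I used):

```
sudo apt install chrony

# Append diversified time sources to /etc/chrony/chrony.conf:
sudo tee -a /etc/chrony/chrony.conf <<'EOF'
pool ntp.ubuntu.com    iburst maxsources 2
pool 0.pool.ntp.org    iburst maxsources 2
server time.nist.gov   iburst
EOF
```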

That’s it. After restarting the chrony service (service chrony restart) you should be able to get time reports by running:
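
```
chronyc tracking      # overall clock performance and current offset
chronyc sources -v    # per-source reachability, offset, and jitter
```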

Why You Shouldn’t Run an NTP Server in a VM Guest

VMs can’t keep accurate time

I’ve generally found that VMs keep great time inside of VMware.  One thing that helps is setting the CPU shares to high so your time server always has priority.  I ran Chrony in a VM for several weeks and compared it with Chrony on a Raspberry Pi.  Both were acceptable, and both had a smaller standard deviation than public NTP servers over the internet, but the VM had a much smaller standard deviation than the Pi.  That tells me VMs running on better hardware may track time better than lesser physical hardware under certain conditions, and a local NTP server in a VM can be more precise than grabbing time off the internet.

VMs can become out of sync during snapshots, suspend, failover, etc.

I ran a suspend test and this is true.  I paused a VM, waited 10 seconds, then resumed it.  It reported the wrong time to NTP clients for several minutes before it corrected itself from external NTP servers.  Here’s a screenshot of my NTP server being 11 seconds off after a pause!

Chrony after VMware Suspend

This is a valid reason to run an NTP server on physical hardware.  However, I think it is possible to run an NTP server under VMware with the following precautions:

  1. Your NTP servers under VMware should never be paused.  That means they should be excluded from failover (instead of failover, it’s better to configure multiple NTP servers for your clients to connect to, since it’s better for an NTP server to be down than to report the wrong time).
  2. Have multiple NTP servers.  At least three. You’ll notice in the screenshot above Chrony (running on a separate physical machine) flagged the server as not being accurate.  This way if one of your VMs gets paused chrony will switch to another time-source automatically.
  3. Set makestep 1 -1 in the chrony.conf file (this tells chrony to step the clock on any difference greater than one second, which allows for faster correction after a resume).

GPS Receiver

This is not really related to VMware, but I had a GPS receiver so I thought I’d see how it works with Chrony….

GlobalSat GPS Receiver

I have a GlobalSat BU-353S4 USB GPS receiver.  This isn’t the best GPS receiver for accuracy.  For me it’s accurate to within a few hundred milliseconds, which is good enough for my experimental purposes but worse than just grabbing time off the internet.  Serious time-keepers will want something faster than USB and more accurate than a cheap GPS receiver can provide.

Configure gpsd
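
The gpsd side boils down to pointing gpsd at the USB device (the device path /dev/ttyUSB0 is an assumption; check dmesg for yours):

```
sudo apt install gpsd gpsd-clients

# In /etc/default/gpsd -- tell gpsd where the receiver is:
#   DEVICES="/dev/ttyUSB0"
#   GPSD_OPTIONS="-n"    # poll the GPS even when no clients are attached
sudo service gpsd restart

cgps -s    # sanity check: confirm the receiver has a satellite fix
```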

Install Chrony
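
On the chrony side, the relevant piece is a refclock line that reads the time gpsd publishes to shared memory. A minimal sketch (how I arrived at the offset value is explained next):

```
sudo tee -a /etc/chrony/chrony.conf <<'EOF'
# Read the time gpsd publishes to shared-memory segment 0:
refclock SHM 0 refid GPS offset 0.250 delay 0.2
EOF
sudo service chrony restart
```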

So, how did I get the values on the refclock line…

The way I came up with my offset of 0.250 was by initially setting the offset to 0.0, restarting chrony, and running chronyc sources -v several times, taking note of the offset.  I’d get numbers like +249ms, +253ms, +250ms, etc.

Since my GPS is off by about 250ms I set the offset to 0.250.  Now it’s usually not off by more than 100ms.

Chrony Sources

The ±100ms variance is not a problem when combined with other sources. If GPS were my only time source for a short period without access to the NTP pools, I’d be better off tolerating drift than accepting its high variance. But if I had no internet for several months, or an air-gapped network, then time via GPS would probably be better than nothing (though a better GPS receiver should be used in those scenarios).

For most networks running chrony in a VM and using a GPS is unnecessary.  It’s better to keep it simple.  I just use the NTP service on my pfSense router and set all the clients to that.

Don’t forget to watch your clocks adjust themselves next Sunday!

Programming Management & Leadership Books

There are plenty of books on managing people, but there are few books targeting the management of software development, and even fewer aimed at people who got promoted into leadership positions with no management training.  I’ve read countless books looking for resources in that area.  I can find plenty of books about how to manipulate people or promote yourself (and I’ve had plenty of training to that effect), but those are not the books I’m looking for.

I want real authentic leadership and practical management.  Below you will find the best of what I’ve found over the last four years. And unlike some “Best Books for Programming Managers” and “Top 10 books on Leadership” lists you’ll find online… I actually read every book listed below. 

I should also note that even if you aren’t in a position of management these books should be beneficial.  Whether you have the position or not, everyone has the opportunity to lead.

Managing the Unmanageable

Managing The Unmanageable Book

“Most successful programming managers are former programmers: They can quickly grasp whether a developer is on track through the most informal of conversations, without having to ferret out the assessment through long strings of questions that can feel pestering.”



Managing the Unmanageable By Mickey W. Mantle and Ron Lichty (2012)

Managing the Unmanageable is the comprehensive handbook to gain a variety of insights and a tool set to manage software development teams.  I didn’t find it lacking coverage on any topic.

It rightly points out how managing programmers is like managing artists–programming is a creative job so you can’t manage that the same way you would manage most other jobs.

It goes over how to build relationships with and manage HR, your boss, other departments, etc.: how to define developer levels, how not to do incentives (which can often be more demotivating than motivating), job descriptions, how to conduct interviews, how to build culture and motivate developers, and more.  The vastness of topics is unmatched by any other management book I’ve read.  It may only devote a few pages to some subjects, but I haven’t found an area it doesn’t cover at all, and even where it doesn’t go into great depth it references sources for further study.

I think this is the best resource for a new manager to get a comprehensive overview of every topic related to managing programmers.  What I really like about the book is that, drawing on the authors’ experience, it anticipates and provides guidance on a lot of challenges I’ve had to deal with; reading it helped me proactively plan how to handle those situations.

For me, reading Managing the Unmanageable is like sitting down at a coffee shop with some seasoned managers and listening to their experience and wisdom.  I still use it as a reference book today.

Peopleware

Peopleware Book on Productive Projects and Teams

“The major problems of our work are not so much technological as sociological in nature.” 

“Most managers are willing to concede the idea that they’ve got more people worries than technical worries.  But they seldom manage that way.  They manage as though technology were their principal concern.  They spend their time puzzling over the most convoluted and most interesting puzzles that their people will have to solve, almost as though they themselves were going to do the work rather than manage it.”



Peopleware: Productive Projects and Teams (3rd Edition) by Tom DeMarco & Timothy Lister (originally published in 1987; I read the 3rd edition published in 2013)

Peopleware, as its title suggests, is all about the people aspect of managing software developers.  It’s not a generic management book; most of it only applies to managing creative and intellectual workers.  It covers why programmers are distinct from, and must be managed differently than, other types of jobs such as accountants or manufacturing workers.  The book covers topics like the importance of allowing time to think on the job, giving teams a sense of elitism to increase productivity, creating environments where teams can naturally form and jell, the importance of an interruption-free office environment, and why the surest way to improve productivity is to focus on quality.

I learned that environmental factors can make a 10-to-1 difference in a programmer’s performance.  A large section deals with the work environment: office design, layouts, how bad cubicles are, the importance of natural light, office size, privacy, etc.  This is a timeless classic.  It would benefit any manager, executive, head of HR, architect, or programmer (even if you aren’t in a management position, this book will help you manage yourself).

The Mythical Man-Month

The Mythical Man-Month

“Why is programming fun? What delights may its practitioner expect as his reward? First is the sheer joy of making things. As the child delights in his mud pie, so the adult enjoys building things, especially things by his own design. I think this delight must be an image of God’s delight in making things, a delight shown in the distinctness and newness of each leaf and each snowflake.”

The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition (2nd Edition) by Fred Brooks (originally published in 1975, I read the 20th Anniversary edition published in 1995)

This is a collection of essays about managing and organizing large software projects. Most important is Brooks’ observation that adding more manpower to a late software project will make it even later. My favorite observation of his is that the most productive teams are smaller because of communication overhead; you only get fractional gains by increasing the size of large teams. Although pre-Agile, many of his ideas influenced Agile project management. He was well ahead of his time. This is a classic.

“Adding manpower to a late software project makes it later.”

The Conviction to Lead

The Conviction to Lead

“Whenever Christian leaders serve, in the church or in a secular world, their leadership should be driven by distinctively Christian conviction.”

“Leadership is all about putting the right beliefs into action, and knowing, on the basis of convictions, what those right beliefs and actions are.  This book is written with the concern that far too much of what passes for leadership today is mere management.  Without convictions you might be able to manage, but you cannot really lead.”

The Conviction to Lead: 25 Principles for Leadership That Matters By Albert Mohler, 2014

This was not an easy find. I read fluffy leadership book after fluffy leadership book… and finally read Mohler’s book at my dad’s recommendation.  It has far more substance on leadership than anything else I’ve read.  Where others give you mechanics, tools, and methods, Mohler gives you conviction and motivation based on well-grounded beliefs.  It is not written just to pastors, nor just to leaders of Christian institutions (although this appears to be the main focus), but also to Christians who happen to be leaders in secular organizations, and that’s quite rare for a book on leadership written by a devout Christian.

Mohler’s book is practical because it provides the foundation for why and how Christians should be leading and the basis for leading in a secular world.  I would say the book is primarily written to C-level, but almost all of it I was able to apply to a smaller realm for lower levels of management if I limited the scope to my area of influence.  This is a good book for any Christian in a position of leadership.