b3n.org – Benjamin Bryan | Self-hosted from North Idaho
https://b3n.org/

30 Years of Quicken
Sat, 27 Jan 2024 – https://b3n.org/30-years-of-quicken/

Well, I have now been using Quicken for thirty years, spanning three operating systems. I can’t think of any other software I’ve used this long. I started with Quicken for DOS, moved to Quicken for Windows, and eventually landed on Quicken for Mac.

Quicken comes from an era when software was useful, comprehensive, and required reading the manual. But people today don’t read instructions. Modern software is overly simple, laser-focused on a few things, and all two of its features are discoverable. Quicken, as powerful as it is, would not be written today.

Quicken for DOS

My personal finance tracking started when Dad brought home an old computer with a monochrome amber monitor. It had QBasic (with two games: Gorilla and Nibbles), WordPerfect for DOS (which is the greatest program ever written to this day), and Quicken for DOS; I also purchased my favorite program of all time: Microsoft Flight Simulator. Anyway, Dad showed me how to maintain a ledger. I set up two accounts:

  1. Cold Hard Cash (my wallet)
  2. Bank of Dad (self-explanatory)

The basic theme of stewardship is that we are responsible before God for how we use the goods, services, and resources that are at our disposal. That means that a Christian steward is to be careful not to be wasteful with them. We need to measure the value of things we buy.

— R. C. Sproul, How Should I Think about Money, 1st Ed
(free digital version available at Amazon or Ligonier)

Quicken for Windows

I switched from DOS to Quicken for Windows in 2000 (give or take), and it went along nicely for the next 14 years. Quicken is so comprehensive that it has tools to handle every scenario Kris and I have come across:

  1. Cash management (categorizing transactions, monitoring spending, billpay, etc.)
  2. Automatic downloads and auto-reconciliation from Financial Institutions
  3. Expense Tracking.
  4. Budgeting.
  5. Investing – Brokerage accounts (Capital gains tracking, lot tracking for tax loss harvesting, etc.)
  6. Multiple Currencies
  7. Retirement Planning (I don’t ever plan to retire since I don’t see a biblical precedent for it. God intended us to work. But I think it’s wise to invest for it, since one never knows how capable he will be in the future).
  8. Business transactions/invoicing clients.
  9. Buying a house/home loan/looking at early payoff what-if scenarios
  10. Tracking assets (cars, houses, etc.), and depreciating assets.
  11. Becoming an accidental landlord
  12. Itemized taxes, importing data into Turbotax.
  13. Memorizing transactions based on rules (which is mostly automatic). There’s very little manual data entry as most things are automated.
  14. Tracking medical bills and matching them with health insurance reimbursements…

Now, we must pause here and take a moment to rant about health insurance. The health industry is disorganized. Sometimes you don’t get invoices until a year after a service you thought was all paid for, from the sub-sub-sub-contractor of your doctor, and in the meantime your employer switched health insurance providers. Sometimes you’ll get duplicate invoices, and if you’re not keeping track you’ll pay them twice. Sometimes you’ll get past-due statements out of the blue from a collection agency, having never received an invoice, and you’re left wondering who these people are and why you owe them money.

The statements have no valid medical or diagnosis codes, and the insurance won’t reimburse you unless you Fax those over. Yes, I said Fax, because that is the only secure method to send medical codes other than snail mail. So you sign up for a Fax service. But you don’t know the medical codes. The provider gave you codes that they insist are good, but the insurance says they’re the wrong codes. The biller doesn’t know what codes the insurance is looking for, but that’s not their problem; you’re the “responsible payer,” so they still want their money. So you pay them so they don’t ruin your credit, figuring you’ll just have the insurance pay you back later. You offer to help by writing the proper codes on the invoice and re-faxing it to the insurance company, but your insurance doesn’t want to tell you what the correct codes are, because that’s a secret. Nah na na Nah na na. You Google and find a few codes, but after a few months the insurance sends you a “Nope, that’s not the right code” letter. You begin to suspect that only one person in the world knows the codes, and he’s dead.

So you arrange a 4-way conference call between yourself, your insurance, your provider, and the sub-sub-sub-contractor. If they won’t tell you the secret codes, maybe they’ll at least share them with each other. If they wanted to be secure, they could even Fax the codes like secret agents. But the representatives are all in outsourced call centers in foreign countries, speaking broken English and reading off scripts to each other. Finally, you realize you’re not the customer here at all, because you’re not paying for the insurance directly; your employer is. So you talk to your HR department, who goes on the warpath. They find out that even though the doctor is in-network, and the sub-contractor is in-network, the sub-sub-sub-contractor that the sub-sub-contractor hired is not in-network. You should have asked. Your fault. Pay up. Wait a minute… it shouldn’t matter, because (1) you’re on the PPO plan and (2) you went way over your max OOP (out-of-pocket). It should be covered. So you tell HR that. “Oh, you were on the PPO plan? Yep, you were.”

Anyway, a few weeks later HR gives you good news: they convinced the insurance company to reimburse you. Yay! That’s one of 14 bills your insurance needs to pay out, but getting one covered is a good start! So you wait for a check. Four months later, you finally get the check. You open the check. You look at the check. The check insults you. Why is it only $25.00? That covers 0.008% of the bill. You call them up–yes, it was somehow coded as a doctor’s visit, and that’s the agreed-upon rate.

At this point you start adding up how much you’ve paid in premiums over your lifetime and realize that if you had simply invested your health insurance premiums in the S&P 500 instead, you’d not only be able to pay for this out of pocket, you’d have an extra couple million dollars, which is well over the maximum insurance lifetime payout anyway. But you decided to pay the premiums, and that means you really need insurance to cough up a little more than $25. Insurance has literally one purpose, and it is not meeting that purpose. So you warm up your fax machine, get on the phone, and start the process all over again. Despite being a little upset at your health insurance for making you jump through all these hoops, you’re just thankful to have Quicken to keep all this organized. Quicken’s robust AR and AP capabilities are useful not only to small business owners but also to medical patients.

Quicken won’t stop medical bills from making you go broke, but it does a fantastic job at tracking how broke you are.

Nothing is more important than your health… except for your money.

– Ferengi Rules of Acquisition #23

I don’t know what happened to quality, but it appears to me that after Intuit acquired Mint, Quicken lost focus, made too many changes, and introduced simplicity in a way that made things slower and more complex. It’s like they tried to add features from Mint… but Quicken is not Mint, and the two methods of finance management are incompatible. Quicken 2014 is the buggiest release ever made. While I am a huge fan of Quicken, there is an eight-year span, from 2014 to 2022, during which I would not have recommended it very highly; but I don’t know of any alternatives. I still managed to make it work for us until I hit some limit in 2021…

The Great Quicken Crash of 2021

In 2021, Quicken kept corrupting my data. I would have a good file, but after I added a few transactions, it would corrupt the entire file, and Quicken would crash and not be able to open the file again. With decades of data, maybe I went past the 10,000 GDI object limit, or maybe some internal counter overflowed a signed 32-bit integer (2,147,483,647) somewhere. Who knows? I returned to older backup files, and I deleted older transactions to make the file smaller, but every time I’d sync up my latest transactions, it would crash.

I gave up for a while and tried to use Mint (now defunct). I’d say Mint is good at tracking spending, but it is not that good at overall finance management, and it couldn’t do basic investment tracking. Flying blind isn’t great, but Mint was better than nothing.

Quicken for Mac

Sixteen of the thirty-eight parables of Jesus deal with money. One out of ten verses in the New Testament deals with that subject. Scripture offers about five hundred verses on prayer, fewer than five hundred on faith, and over two thousand on money. The believer’s attitude toward money and possessions is determinative.

― John F. MacArthur Jr.

After switching back to Macs earlier last year, it occurred to me that Quicken for Mac might be more robust. It turns out a different team writes the Mac version. I bought Quicken for Mac and imported all our Quicken for Windows data from the last good backup in 2021. I synchronized transactions to catch up (the very operation that crashed Quicken for Windows), and it worked! I then imported the last two years of lost data from Mint (unfortunately, Mint doesn’t track everything correctly, so I had to make a few balance adjustments; it’s annoying to have a gap, but I don’t think I’m missing too much data), and we’re back in business!

What I like most about Quicken for Mac is that it’s not an Electron app. My hat’s off to any app that is not built on Electron.

Secondarily, it’s the most comprehensive finance app I know that doesn’t crash or slow down when loaded with decades of personal financial history.

Quicken is fully Mac native and insanely fast. Certain financial transactions that would hang Quicken for Windows for several seconds are instantaneous on the Mac version. The backend is no longer a proprietary file but a standard SQLite database–I was able to open it right up in an SQLite DB browser.
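Since the backend is plain SQLite, you can inspect it with the stock sqlite3 command-line tool. A minimal sketch, assuming you’ve located the data file inside Quicken’s document package (the source path below is a hypothetical placeholder); always work on a copy, never the live file Quicken has open:

# Copy the data file out first (the source path here is hypothetical)
cp "$HOME/Documents/Quicken.quicken/data" /tmp/quicken-copy.db

# List the tables, then dump the full schema of the copy
sqlite3 /tmp/quicken-copy.db '.tables'
sqlite3 /tmp/quicken-copy.db '.schema'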

A few thoughts on Quicken for Mac vs. Windows.

  1. 🍎 The Mac version has not crashed even once. More importantly, it hasn’t corrupted my data. If you’ve been using Quicken for Windows for 30 years and keep having trouble with it crashing and being unstable, give the Mac version a try.
  2. 🍎 The Mac version is faster.
  3. 🍎 For brokerage accounts, I ran into precision limits on Windows; rounding sometimes caused the number of shares in Quicken and at my brokerage to become mismatched, which is quite annoying when you’re doing lot tracking, because over time you drift away from what you need to report on taxes. The Mac version has more digits of precision–I tested to 8 digits after the decimal without issue (incidentally, that’s enough to track cryptocurrencies like Bitcoin). On Windows I constantly had to add placeholders due to rounding issues; so far, not a single investment transaction has gone out of sync on Mac, and I haven’t had to enter a single placeholder to correct a holding.
  4. 🍎 The Budgeting feature on Mac is slow when doing calculations (but it’s even slower on Windows), and the interface is clunky. You can’t edit a budget directly from the budget screen but have to go into edit mode (why?). I’d say budgeting is very cumbersome on Mac and Windows–2012 was the last version of Quicken with a robust budgeting tool. Quicken for DOS puts both to shame.
  5. 🪟 The Mac version does not have great support for business. There is no Rental edition, and they only just added a Business edition, which doesn’t have a lot of features yet. This is not a huge problem because you can still create categories mapped to items on Schedule C and E, and you can easily create tags to track multiple businesses.
  6. 🪟 The Windows version allows you to create custom Asset classes–can’t do that on Mac.
  7. 🪟 A large feature missing from the Mac version is Invoicing. I use the invoicing and receivables feature quite a bit, so I will miss this. I can still maintain a separate Quicken for Windows file to generate invoices, so it’s not a big deal but a bit annoying.
  8. 🪟 I can’t seem to find a mass Find/Replace feature on Mac. Windows is a lot better here.
    • 🍎 …But since the Mac version is backed by a SQLite database, you can just write a SQL query to do it (see the sketch after this list).
  9. 🍎 E-Billing. I never could get E-Billing to work right in Windows, but it works fine in Mac, even bringing in PDFs of the bills.
  10. 🪟 Scheduled transactions often double-up from downloaded transactions (Windows did better at matching them automatically), but on Mac you can drag the transactions into each other to match.
  11. 🍎 Mobile sync is more robust on Mac. Every time I did a sync to mobile on Windows, it would complain about corrupted transactions (I suspect from there not being enough decimal precision on brokerage accounts). So far, I have had no warnings syncing to Quicken for iOS.
  12. 🪟 On Mac there are too many sync connectors for each financial institution, with no documentation on how they differ. On Windows it is pretty straightforward, but on Mac I’ll sometimes see 3 or 4 connectors for the same bank, and you just have to try them until you find one that works. A description of what they do would be nice.
  13. 🪟 The projected balance on Mac works fine for one account at a time (this works for me because I mainly care about the balance in the main account), but Windows can do projected balance and cash flow management for multiple checking accounts on one screen.
  14. 🪟 The scheduled transfers (Bill payment prediction) on Mac can’t predict credit card payments based on the current balance–a standard feature in Windows. Now, it will bring in the statement balance for most banks, so it’s only a minor issue for the few banks that don’t sync that info to Quicken.

Overall, between the two, I would pick the Windows version (even running it under Parallels) for the extra business features (especially invoicing), but since I literally can’t move past the year 2021 in the Windows version without it crashing, I’m using the Mac version and maintaining a separate Windows Quicken file just for invoicing. The Mac version of Quicken lacks some features of its Windows counterpart but is a more stable program.

Quicken is the worst personal finance management tool, except for all the other ones that have been tried.

– Winston Churchill

(Well, he would have said that if he were still alive today)

Amazingly, Quicken has survived several acquisitions and the disaster of merging with Mint’s culture (though not without a lot of damage to its reputation). In recent years, Eric Dunn took over leadership of Quicken, and he seems to have done a good job of restoring Quicken’s culture and bringing focus back to the quality of the product. For a while I wasn’t sure Quicken even had a roadmap other than to make things slower and buggier with each release, but I appreciate that he has been providing good direction and communicating the overall roadmap to the user base.

Quicken (now Quicken Classic) is the most comprehensive personal finance program ever written. Hopefully, it will be around another 30 years.

Investing as a Christian in a World of Woke Proxy Voting
Sat, 04 Nov 2023 – https://b3n.org/investing-as-a-christian-in-a-world-of-woke-proxy-voting/

Over the last few years, anti-Christian activists have influenced asset managers and are pushing ungodly agendas. If you invest in Mutual Funds or ETFs, your Fund Managers may be voting on your behalf, against your values.

I wonder if Christian shareholders are aware their votes are pushing Marxist, social-gospel resolutions. If we voted as a bloc for board members based on Biblical values, perhaps the blatant ungodliness we’ve seen at Bud Light, Target, Ford, Disney, etc., would not exist at all.

Proverbs 27:23 LSB – Know well the condition of your flocks, and pay attention to your herds.

Over the last five years, as I learned this was becoming a larger issue, I’ve been thinking of some good ways to invest from a Christian perspective.

A quick search reveals faith-based funds: The Timothy Plan, Global X by SEI, and Inspire. These funds try to avoid “sin” stocks by screening out companies that promote abortion, homosexuality, cross-dressing, alcohol (which is not even a sin), tobacco (again…), etc. But I’m less interested in avoiding “sin” stocks; my investment strategy is to own the market. What I want to know is: for the funds you hold, are the fund managers using your proxy vote to support Biblical values or ESG values? I couldn’t find much information on this until earlier this year.

Pension Politics recently released a report on how fund companies proxy vote. The report grades fund managers from A to F: the closer to A, the more the fund manager voted against ESG proposals; the closer to F, the more it voted for them. Looking through the report, I took note of some of the larger players:

  • PRIMECAP – A (perfect voting score)
  • Dimensional – A
  • Vanguard – A (of the largest three fund managers, Vanguard has the highest grade)
  • Fidelity – A
  • Blackrock – C (for all the bad press Blackrock gets, they aren’t the worst)
  • JP Morgan – C (no surprise there)
  • Charles Schwab – D (Chuck, what happened?!)
  • State Street – D

Remember the “Christian” Faith-Based funds I mentioned earlier? I looked through all the fund families highlighted on the Faith Driven Investor site. I couldn’t find information on all of them, but of the ones in the report:

  • The Timothy Plan got an F.
  • SEI, which runs the Global X Catholic Values fund, got a D-.
  • GuideStone – F.
  • Praxis Mutual Funds (Everence) – F-.
  • Knights of Columbus – F-.

Dave Ramsey, who is promoted in many churches, recommends SmartVestor Pro advisors, who sell Alger funds…

  • Alger – F- (worst possible score voting for every woke proposal).

As an aside, many of the Alger funds have enormous fees; if you are ever tempted to buy such a fund, you may want to read Where Are the Customers’ Yachts? by Fred Schwed Jr. and Enough by John C. Bogle.

So, while faith-based investing funds appear to be pious in their avoidance of “sin” stocks, they vote to turn the companies they do hold into social justice companies.

There is one exception (Update 2023-11-05: make that two exceptions), Inspire and CrossMark Global Investments:

  1. Inspire got an A grade with a perfect voting record. They aim to be Biblically responsible, and they do vote according to Biblical values. If the 0.35% net ER (Expense Ratio) were a bit lower, I’d consider holding their BIBL fund.
  2. CrossMark Global Investments also got a perfect A. Now, some of the share classes of their funds have load fees, and the fund expenses are high (see my note on Alger); the Inspire ETF expenses are more reasonable.

Next time you’re about to invest, read the report from Pension Politics to see how your fund manager votes.

In the past, I preferred to hold index funds from Vanguard, Schwab, Fidelity, State Street, and Blackrock without considering how they vote on my behalf; I have since adjusted my IPS (Investment Policy Statement) to prefer fund managers that are more likely to vote the way I would.

Finally, while it is wise and good stewardship to verify your fund managers are representing your values, the fact that woke resolutions can even be successful is a symptom of a much larger problem, and that is simply that people are ungodly. Unless the hearts of men change towards God, which is only possible through the Gospel, any other form of resistance to ungodly shareholder resolutions is futile.

I’ve assumed the reader already understands the dangers of Wokism and ESG; this has been well defended and documented elsewhere, so I didn’t triple the size of this post to do it here. But if you are interested, see MacArthur’s article on The Injustice of Social Justice and listen to A Biblical Theology of Climate Change on the Just Thinking podcast.


22182 Notes. Evernote to Apple Notes (Fail) to DEVONthink.
Sat, 12 Aug 2023 – https://b3n.org/22182-notes-evernote-to-apple-notes-to-devonthink/

I’m looking for a scan of a blue note, and I simply can’t find it in Evernote.

Evernote versions 4 and 5 were the prime of Evernote. Kris and I used it extensively as our document management system, document knowledge management, and notetaking application. It was a great way to organize things. Every note, every important piece of paper, we scanned into Evernote.

I’ve been a huge fan of Evernote, but over the last six years the product has become unusable. I suffered through it, but I really want to find my blue note. So I moved everything to Apple Notes. Apple Notes couldn’t handle my 22182 notes without running out of storage, so I moved everything to DEVONthink (leaving a few thousand in Apple Notes), and that seems to be working.

DEVONthink

If Evernote had simply stopped making their program worse and instead focused on making version 4 or 5 robust, it would still be the most loved notetaking application today.

Application Tip #1. Don’t destroy yourself.

All the ways Evernote Destroyed Themselves since versions 4 and 5

  1. Removed the Atlas feature. Now we can’t visually see where we created notes (I know where I scanned that blue note, so if I had this feature I could zoom in on the location).
  2. Moved from native C# and Objective-C apps to Electron. Electron is the worst possible way to write a program. The one thing that made Evernote successful was that they released native desktop apps when everyone else was releasing webapps. They took their biggest competitive advantage and shredded it. I do not like Electron.

Application Tip #2. You may think rewriting your app in Electron is a good idea. You are wrong.

  3. Performance on desktop and mobile got significantly worse (because of Electron). 20,000 notes is not that big, and when I have to wait 3 minutes for the app to respond while trying to take a quick note on the go, the moment is gone and lost.
  4. Removed the thumbnail view–if I’m looking for that rectangle-shaped blue document, I could easily find it via thumbnail. Snippets are not thumbnails.
  5. Capped exporting notes to 100 at a time (used to be unlimited).
  6. Can’t select more than 100 notes at once (used to be unlimited).

  7. Removed the local copy of the remote database. So… combined with the 100-note export cap, just how are we supposed to back up Evernote in under 600 clicks?
  8. Removed all of the OS integration. This prevents people from printing directly to Evernote or doing manipulation via AppleScript.
  9. Removed API access.

  10. Added a chat–why does a notetaking app need chat?
  11. Used RC4 (I am not making this up) for in-note “encryption”.
  12. Never implemented encrypted notebooks.
  13. Never allowed versioned notes to persist across notebooks.

  14. Removed the offline cache. Evernote now takes forever to load.
  15. And finally… after making all those bad changes, I just got a notice that my subscription is increasing from $70/person/year to $170/person/year. We’re already pressured by inflation; it’s hard to justify a huge increase like that. Evernote argues they’ve added new features to justify the cost, but there have been zero new features that I want, and many of the features I did want have been taken away!

Evernote to Apple Notes Migration (Fail)

Apple Notes iOS

So, I attempted to migrate all our notes to Apple Notes. The new Evernote client only lets you export 100 notes at a time, and I was not going to manually create 220 export files, so I downloaded an old, unsupported version of Evernote that lets you export everything at once. I created an export file per notebook and imported each into Apple Notes. The notes all imported perfectly, which surprised me. I imported about 4,000 notes per day and then let Apple Notes catch up.

After importing, my laptop and phone got hot from trying to process all of those notes. For each set of 4,000 notes, my laptop, which normally lasts several days, ran out of battery within a few hours, and I had to charge my phone (which normally lasts 2 days) multiple times per day. After a week or so, and a few reboots, Apple Notes finally settled down and it was looking good.

Application Tip #3. Not every single note needs to exist on the mobile device.

But I ran into one problem when I had a final batch of notes to import: Apple Notes is not storage efficient. My 30GB Evernote database became 60GB, and my iPhone ran out of space. Apple Notes claimed it was taking up 418GB (I don’t think that’s right). My 64GB iPad had no hope at all, and my MacBook also started running out of space. I left my phone in this state for several days to see if it would clear… but it never could sync all the data.

Apple Notes out of space on iPhone

One of the big problems with Apple Notes is that there is no way to exclude certain folders from syncing to your mobile devices. If I could have brought just a few thousand notes to the mobile devices, that would have been enough to have access on the phone, and the rest would have been fine on the MacBook.

Rule #1 for Apple Notes. Always buy the devices with a lot of storage capacity.

I think Apple Notes could have handled my 22,000 notes if I had more storage, but I would have to upgrade 2 MacBooks, 2 iPads, and 2 iPhones to the 1TB models. That starts to get a little expensive.

That said, it has good OCR, and you can scan a document using your phone. There is no good way to scan from my ScanSnap into Apple Notes, though I could have automated it. There is also not really a good web clipper.

DEVONthink

Ultimately I decided to leave only 2,000 notes in Apple Notes and move the rest to DEVONthink. DEVONthink Pro is $200 for macOS (a 2-device license, which is perfect for me and Kris) and $50 for the mobile app (with family sharing). This is actually cheaper than Evernote, considering DEVONthink is a perpetual license; I expect to pay for a discounted upgrade every 5-7 years. It’s certainly going to be less than $170/person/year.

DEVONthink

I had run across DEVONthink nearly a decade ago, so I decided to take a second look and saw the product has improved significantly. I also like to support local businesses; DEVONthink is located in Coeur d’Alene, Idaho.

Import Process

DEVONthink

Application Tip #4. If people keep having to download the legacy version of your app to do basic things, like export their data, you are doing something wrong.

The import process is to install the legacy version of Evernote and let it fully sync to your computer; then DEVONthink will simply bring in every note perfectly. Every piece of metadata is preserved: geotagging, source URL, created/modified dates, etc. I found no mistakes at all.

It took about half a day for DEVONthink to process all the new notes–indexing, running OCR, generating thumbnails, probably doing some AI stuff, etc. But once complete, it is fast.

A few notes on DEVONthink

  • DEVONthink Pro comes bundled with ABBYY FineReader OCR, so I set it up to automatically OCR every PDF that comes in.
  • Sync: supports iCloud (CloudKit), Dropbox, CloudMe, or WebDAV. I started with CloudKit but found you can’t share it with your wife, so I ended up setting up sync against a WebDAV share on our TrueNAS server.
  • Mobile devices can search the entire database (including OCRed documents) without pulling down the entire database, though you can sync the entire database if you want. I set mine to keep the last 100 opened notes on the device so it doesn’t take up much space.
  • For backups and versioning, the entire database is stored locally on macOS, so it is still backed up to iCloud and Time Machine. It’s also easy to back up via Cloud Sync using TrueNAS’s tools and to version using ZFS snapshots (a snapshot sketch follows this list).
  • DEVONthink databases are actually available as a filesystem in Finder and indexed in Spotlight.
  • Tags on the macOS filesystem are available in DEVONthink, and tags in DEVONthink are available to macOS. So if you tag a note in DEVONthink, it shows up in a Finder search for that tag.
  • DEVONthink has its own database, but you can also add folders from the macOS filesystem (so files act like they’re in DEVONthink without moving them), which means I can access my filesystem from DEVONthink as well. This is actually incredibly convenient with the AI features.
  • AI Classification. Any new document coming in can be automatically filed. The AI learns your filing method and is pretty accurate. In fact, it files things better than I do because I sometimes forget about the folders. The AI Classification can also be used to file documents on the filesystem.
  • AI finds similar documents. Open any document and the DEVONthink AI finds similar or related documents (even if no keywords are shared). I’m impressed at how good it is.
  • Annotations are not as good as Evernote’s. In Evernote, the PDF was always embedded in a note. You can create a new note and link it, but that’s a bit tedious. DEVONthink annotations work but are not as visible as Evernote’s.
  • No automatic versioning. DEVONthink has no document versioning, and certain actions cannot be undone. That said, since it’s all stored in Time Machine, you can get to previous versions, but it’s not as good as Evernote’s versioning, which is well integrated and easy to restore from.
  • It is fast compared to Evernote or Apple Notes. DEVONthink is instant when creating notes.
  • There’s no mobile document scan. There is the camera to take a photo, but that’s not very good for documents. You can do a scan from iOS Files and add that to DEVONthink, but that’s cumbersome compared to the Evernote or Apple Notes document scan, which uses the camera and then processes the image as a document.
  • Multiple database support–so you can separate personal and work, or different major projects. I ended up creating one for me and Kris, one for work, and one that just references my filesystem documents.
  • Automatic geotagging works perfectly. It also imported all the geotags from Evernote, and now I have an Atlas view again, so I can zoom in on a location and see the notes I created there.
DEVONthink Atlas
  • The notes themselves are not as good as Apple Notes or Evernote. You can create a note, but some features I’d expect, like checklists, don’t exist in DEVONthink. Ultimately I decided to use Apple Notes for most notes (especially if they have action items) and DEVONthink for documents and more reference-type notes (no action items).
  • You can import Apple Notes into DEVONthink which makes it a great way to archive them.
  • Can import / archive emails from Apple Mail or Outlook.
  • Extremely fast. I can load all 22,000 notes and scroll through them with no lag.

Application Tip #5. Always pre-generate all the thumbnails to allow for fast scrolling

  • Encryption. Data is encrypted before it leaves for cloud storage, so everything is encrypted with a key before being uploaded to the WebDAV server. Additionally, the DEVONthink databases can be encrypted locally with a key, so if for some reason you can’t enable FileVault and Advanced Data Protection, you can still fully encrypt the database.
  • Onboard PDF editing. It is so nice to be able to edit a PDF (rotate or flip a page, re-order pages, delete a blank page, etc.). This is a feature lacking in Evernote.
  • All of the features are local. There is no cloud service you are relying on. Even AI processing is all done using Apple’s onboard processors. It puts Evernote’s model of being entirely cloud-based with no encryption to shame.
  • One other drawback to DEVONthink is that it’s not really meant to be a multi-user application. You can share it with a handful of users… but if I had more than 5 users, I’d be looking at something else with better revisioning/recovery and per-note or per-folder access controls.
  • The Evernote clipper is bar none the best web clipper. DEVONthink’s web clipper isn’t terrible, but it could be a lot better.
  • DEVONthink is fairly complex. I’d say it has a steeper learning curve than Evernote, but in the long run it will save you time.
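As promised in the backups bullet above, here is a minimal sketch of the ZFS snapshot versioning on TrueNAS; the dataset name tank/devonthink is a placeholder for wherever your WebDAV sync store actually lives:

# Take a dated, read-only snapshot of the dataset holding the sync store
zfs snapshot tank/devonthink@$(date +%Y-%m-%d)

# List snapshots to confirm
zfs list -t snapshot -r tank/devonthink

# Browse a snapshot read-only (no rollback needed to recover a file)
ls /mnt/tank/devonthink/.zfs/snapshot/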

Overall it is much better than Evernote, but I’d like to see five features added:

  1. Checklists in notes.
  2. A better way to do annotations or embed PDFs into notes, like Evernote does. Grouping things together is not the same.
  3. Support for iCloud CloudKit database sharing with multiple users.
  4. A better web clipper.
  5. A better camera document scanner.

But those are fairly minor. I mean, just scrolling through the thumbnails (which Evernote took away) is like flipping through a file folder. I’m a visual person; I don’t always know what I named a note or what keywords to search for, but I remember the shape and color, so it’s nice to quickly scan through my notes visually.

And look! There’s the blue note I was looking for.

DEVONthink notes

Bethany Ruth Bryan – Sep 16, 1990 to June 23, 2023
Thu, 06 Jul 2023 – https://b3n.org/bethany/

We ran after her down the hall and she broke into a run, giggling trying to get away from us.

Bethany and her Mardi Gras beads

She cried, she laughed, and she loved to be chased. My siblings and I played with her not too differently than we’d play with each other. She never grew up the way most of us do. Mental issues like hers are less noticeable when you’re young and only manifest themselves gradually. From my perspective as a kid, there was no sudden realization that she was different; by the time I realized it, it was already normal.

My sister, Bethany, passed away at 32. She was number 4 of 5. She was mentally handicapped, never learning to talk, but she was a lot of fun.

Noelle, Jon, Bethany, Sarah, Ben

For entertainment, she liked to hold something and shake it, her favorite being Mardi Gras beads. We had a couple of big, soft, blue rocking chairs, and when she was little, she’d stand on the chair and make it rock. This didn’t work out well when she got taller. At first it only happened when she got excited, but it became more and more common that we’d find her lying in the chair, tipped over backward! I don’t remember for sure, but I think Mom and Dad had to get rid of the rocking chairs so she wouldn’t hurt herself.

As we all got older, her condition became more apparent, and there wasn’t much that doctors or specialists could do. I sometimes wondered why this happened to Bethany in our family and not to someone in another family. I wrongly assumed it was random chance, a result of living in a fallen world. Now I firmly believe her condition was no random event of the cosmic dice, but rather God’s sovereign will. The reason for this I probably won’t know on this side of eternity. But we can look to the man born blind as an example and simply conclude that God has a purpose and will use this for His glory, even if, like the parents of the man born blind, we don’t know why for much of this life:

John 9:3 ESV – It was not that this man sinned, or his parents, but that the works of God might be displayed in him.

She became less stable in walking. She became afraid of floor transitions and took giant steps over mere shadows. She couldn’t run anymore and often lost her balance and fell. She wouldn’t walk in some areas unless she was led by hand.

Bethany and Noelle

Even though she could not communicate with words, she still communicated. She recognized our voices, responded to touch, and would walk into the kitchen when she was hungry, and we would hear Bethany humming all day. While she would sometimes enjoy commotion, and even relished the sound of her younger sister crying, she was an introvert and often just needed to sit on a couch with one other person in the back room.

Her condition impacted me in three ways:

First, as Bethany got older, her care limited the things we could do as a family outside the home, simply because of the effort needed. Family excursions became less frequent. Someone always had to watch Bethany, whether it was going to church, traveling, or running errands. Sometimes it was rough, but other times I enjoyed taking her on short walks or just sitting on the couch as she hummed.

Second, she helped teach me responsibility. Sometimes I’d have to miss a weekday Bible study or some activity with my friends because I was on Bethany duty. I didn’t feel sorry for myself at all. I just knew it was my job to take care of her, and I was happy to be of service; my siblings often volunteered to help as well.

Mom and Bethany

Third, I saw the sacrifice of my parents, especially my mom, in taking care of her. The importance of human life is not in our intelligence or capability but in our being created in the image of God. I saw my parents treat her with respect and dignity. She was just as important as the rest of us. In one sense, you could say that Bethany limited the ministries we could be involved in; in another way, this ministry to Bethany was more important than anything else we could be doing because this was the ministry God entrusted to us.

Dad, Mom, Bethany, and Pops

I do not say this to be sentimental or comforting, but I say it based on a biblical view of soteriology. It would not be consistent with God’s character as revealed throughout Scripture if Bethany were anywhere other than in the very presence of God right now, singing praises to Him. This is not based on her own merit or holiness, for she was born a sinner. However, her condition meant she could not understand good and evil or comprehend the Gospel; she never matured to a state of responsibility where she could respond to the gospel and believe in Christ. Rather, her salvation is based on God’s sovereign grace, the blood of Christ covering her sin.

Bethany is now in the presence of our Lord, Jesus Christ. The first person she ever spoke to clearly, is her Savior. I am honored and blessed to be counted as one of her family. I will miss her.

Bethany & Pops

Psalm 139:23-24 LSB:

Search me, O God, and know my heart!
    Try me and know my thoughts!
And see if there be any grievous way in me,
    and lead me in the way everlasting!

P.S.
– Mom’s blog posts with memories of Bethany
– Bethany’s Obituary
– Video of Bethany’s Memorial Service (service by Wayne Reynolds and thoughts from Jon, myself, Sarah, and a number of friends)
– Torie’s Poem
– Jon’s Poem
– Noelle’s painting of Bethany:


Engineering WordPress for 10,000 Visitors per Second
Sat, 17 Jun 2023 – https://b3n.org/engineering-wordpress-for-10000-visitors-per-second/

Here’s how I configured my WordPress server to handle huge traffic spikes. It’s easier than you think.

For those who have seen smoke coming from your server in the garage, you know.

But for those who haven’t, here’s a bit of history as I remember it:

In the 1990s, Slashdot used to link to interesting websites and blogs. If you wrote about using Linux as a NAT router for your entire university, assembled a huge aircraft carrier out of Legos, built a nuclear reactor in your basement, or made a roller coaster in your backyard, you’d end up on Slashdot.

Slashdotted. The problem is that Slashdot became so popular and drove so much traffic to small websites that it started crashing sites just by linking to them! It was similar to a denial-of-service (DoS) attack. This is the “Slashdot Effect.”

Twenty years later, a number of sites are known to generate enough traffic to take a site down: Drudge Report, Reddit, Twitter, etc. This is notoriously known as the “Internet Hug of Death.”

There are plenty of hosting providers that will charge $2,000/month to handle this kind of load. But I’m a bit thrifty; it’s simple and inexpensive to engineer for this kind of traffic.

Small traffic spike from Hacker News

Here are the four steps I took to handle traffic spikes:

Step 1. Get a fast physical server

Although I think step 4 alone would let you get away with a Pi, it doesn’t hurt to have a fast server. I have this site configured in a VM with 4 cores on a modern Xeon CPU and 8GB of memory, which seems to be plenty, if not overkill. The physical host has 28 cores and 512GB of memory, so I can vertically scale quite a bit if needed. Very little traffic actually hits the server because I use Cloudflare, but I like it to be able to handle the traffic in case Cloudflare has problems. I run the server on Ubuntu 20.04 LTS under Proxmox.

Step 2. Use an efficient web server

I learned the hard way that Apache runs out of CPU and memory when handling large concurrent loads. NGINX or OpenLiteSpeed are much better at serving a large number of simultaneous requests. I use OpenLiteSpeed, because it integrates well with the LiteSpeed Cache WordPress plugin. I believe LiteSpeed Cache is the most comprehensive WordPress caching system that doesn’t cost a dime.

Step 3. Page caching.

Use a page cache like WP-Rocket, FlyingPress, or if you’re cheap like me, LiteSpeed Cache to reduce load. This turns dynamic pages generated from PHP into pre-generated static pages ready to be served.

Now, just these three steps are enough to handle a front-page hit on Reddit, Slashdot, Twitter, or Hacker News. Such sites can generate around 200 visitors per second (roughly 12,000 visitors per minute) at peak. But it’s better to overbuild than regret it later, which brings us to step 4…

A general rule of thumb is to overengineer websites by 100x.

Step 4. Edge Caching

A final layer to ensure survival is to use a proxy CDN such as Cloudflare, QUIC.cloud, or BunnyCDN. These take the load off your origin server and serve cached dynamic content from edge locations. I use Cloudflare. Cloudflare has so many locations that you’re within 50ms of most of the population–this site gets cached at these locations:

I configured Cloudflare to do page caching, using Cache Rules instead of Page Rules, following the CF Cache Rules Implementation Guide (I don’t use Cloudflare Super Cache, but their guide works fantastically with LiteSpeed Cache). This allows you to cache dynamic content while making sure not to cache data for logged-in users.
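The heart of that configuration is one pair of rules: bypass the cache whenever the visitor carries a WordPress login or comment cookie, and mark everything else eligible for cache. As a sketch, the bypass rule’s expression looks roughly like this (written in Cloudflare’s rules language; verify the field names against their current docs):

# Cache Rule 1 (action: Bypass cache) -- logged-in users and commenters
(http.cookie contains "wordpress_logged_in") or (http.cookie contains "comment_")

# Cache Rule 2 (action: Eligible for cache) -- a catch-all for everything else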

A Warning about CDNs — I’ve tried to use CDNs to optimize and host images in the past, but they seem to have problems delivering images under heavy load. So I host images myself and use ShortPixel’s optimizer to pre-generate and store multiple optimized copies of each image. This seems more reliable for my scenario. Cloudflare still caches the images; they’re just not generated on the fly.

Reserve Caching. I enabled an additional layer, Cloudflare Cache Reserve, which saves items evicted from the cache into R2 storage as a layer-3 cache. This is pretty inexpensive–my monthly bill ends up being a little over $5 for this service–and it takes a huge load off my origin server.

As a result, I see a 95% cache hit ratio. And if it misses, it’s just going to hit the LiteSpeed cache, and if that misses, Memcached–so there’s minimal load on the server.
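To verify which layer actually served a request, the response headers will tell you. A quick sketch using curl (swap in your own URL); cf-cache-status is set by Cloudflare, and x-litespeed-cache is set by LiteSpeed Cache:

# "cf-cache-status: HIT" means the response came from Cloudflare's edge
curl -sI https://example.com/ | grep -i cf-cache-status

# On a Cloudflare miss, "x-litespeed-cache: hit" means the origin answered
# from the LiteSpeed page cache instead of running PHP
curl -sI https://example.com/ | grep -i x-litespeed-cache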

There are actually quite a number of caching layers in play here:

  1. The Browser cache (if the visitor has recently been to my site)
  2. Cloudflare T2 – nearest POP (Point of Presence) location to visitor
  3. Cloudflare T1 – nearest POP (Point of Presence) location to my server
  4. Cloudflare R2 – Reserve cache
  5. LS Cache – LiteSpeed cache on my server
  6. Memcached – object cache (for database queries)
  7. ZFS ARC – Level 1 cache in memory on my server
  8. ZFS L2ARC – Level 2 cache on SSDs on my server

If all those caching layers are empty, it goes to spinning rust.

Browser --> CF T2 --> CF T1 --> CF R2 --> origin --> LSCache --> Memcached --> ZFS ARC --> ZFS L2ARC --> spinning rust.

Load Testing

loader.io lets you test a website with a load of 10,000 clients per second. I tested my setup, and you can see the load was handled gracefully with a 15ms response time.

The egress rate is 10.93Gbps (ten times faster than my fiber connection).

I could probably optimize it a little more, but this already qualifies as 100x overengineered, and we’re way past the 80/20 rule. Good enough.
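For a smaller-scale sanity check that doesn’t require a third-party service, ApacheBench works from any shell. A sketch (use your own URL, and only against servers you own):

# 10,000 requests, 100 concurrent; check the "Requests per second"
# and "Time per request" lines in the summary
ab -n 10000 -c 100 https://example.com/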

To handle a hug of death, you’ll want:

  1. Beefy Hardware
  2. Modern Webserver
  3. Page Caching
  4. Edge Caching

Ecclesiastes 1:7 ESV –
All streams run to the sea,
but the sea is not full;
to the place where the streams flow,
there they flow again.

2023-06-01 06:18:00
Thu, 01 Jun 2023 – https://b3n.org/2023-06-01-061800/

Genesis 1:27 ESV –
So God created man in his own image,
    in the image of God he created him;
    male and female he created them.

Supermicro Fan Speed Script
Sat, 15 Apr 2023 – https://b3n.org/supermicro-fan-speed-script/

2U Supermicro servers are my go-to. They are much quieter than 1U servers, but the fans still spin at 8800RPM. The IPMI fan modes available are Full (9000RPM), Heavy IO (6000RPM), and Optimal (supposed to auto-adjust). Unfortunately, the Optimal fan mode seems to have a floor speed of 4500RPM, which is too loud. Even though my servers are in the garage, I sometimes work on projects there and can’t have it that noisy!

In the past, I’ve replaced Supermicro fans with low RPM Antec or Noctua fans, but since I moved my homelab to the garage I don’t need it to be that quiet.

Here’s my script, which tries to keep the noise below that of a jet engine while keeping the CPU temperature below 50C. To avoid hunting, I set a minimum and maximum CPU temperature range of 40-50C. The logic is simple:

  1. Set FAN speed mode to Full (this allows manual control of the fans).
  2. If the CPU is above 50C, bump the speed up by 10%
  3. If the CPU is below 40C, drop fan speed by 1%
  4. Repeat steps 2 and 3 every minute

    ⚠️ WARNING 1: Using this script could overheat and damage your CPU and other components; your server may release the magic smoke, burn through crystals like you’re traveling at Warp 10, or catch fire due to using this script. I just wrote it yesterday, so there may be bugs. I advise you not to use this script. If you go against my advice, keep a close eye on your temperatures.

    ⚠️ WARNING 2: This script was designed for a Supermicro X10 Motherboard. Yours may have a different fan configuration, in which case you will want to add or change the ipmitool commands.
Supermicro 2U Server with Four Fans

# apt install python3
# apt install ipmitool

# vim /usr/local/bin/fan_control.py

#!/usr/bin/env python3
import os
import subprocess
import time
import syslog
import re

# Set your desired temperature range and minimum fan speed
MIN_TEMP = 40
MAX_TEMP = 50
MIN_FAN_SPEED = 5  # Sets an initial fan speed of 5% 
current_fan_speed = MIN_FAN_SPEED

# IPMI tool command to set the fan control mode to manual (Full)
os.system("ipmitool raw 0x30 0x45 0x01 0x01")
time.sleep(2)

# Get the current CPU temperature
def get_cpu_temperature():
    temp_output = subprocess.check_output("ipmitool sdr type temperature", shell=True).decode()
    cpu_temp_lines = [line for line in temp_output.split("\n") if "CPU" in line and "degrees" in line]

    if cpu_temp_lines:
        cpu_temps = [int(re.search(r'\d+(?= degrees)', line).group()) for line in cpu_temp_lines if re.search(r'\d+(?= degrees)', line)]
        avg_cpu_temp = sum(cpu_temps) // len(cpu_temps)
        return avg_cpu_temp
    else:
        print("Failed to retrieve CPU temperature.")
        return None

# Set the fan speed
def set_fan_speed(speed):
    global current_fan_speed
    current_fan_speed = speed

    # Convert the speed percentage to a hex value
    hex_speed = format(speed * 255 // 100, "02x")


    # Set the fan speed for both fan zones (0x00 and 0x01; system and
    # peripheral zones on typical X10 boards)
    os.system(f"ipmitool raw 0x30 0x70 0x66 0x01 0x00 0x{hex_speed}")
    time.sleep(2)
    os.system(f"ipmitool raw 0x30 0x70 0x66 0x01 0x01 0x{hex_speed}")
    time.sleep(2)


    # Log the fan speed change to syslog
    syslog.syslog(syslog.LOG_INFO, f"Fan speed adjusted to {speed}%")

    # Print the fan speed change to console
    print(f"Fan speed adjusted to {speed}% - {hex_speed}")

# Set initial minimum fan speed
set_fan_speed(MIN_FAN_SPEED)

while True:


    cpu_temp = get_cpu_temperature()

    # Guard against a failed reading so the comparisons below never see None
    if cpu_temp is None:
        time.sleep(60)
        continue

    # Print the current CPU temperature to console
    print(f"Current CPU temperature: {cpu_temp}°C")

    if cpu_temp > MAX_TEMP and current_fan_speed < 100:
        # Increase the fan speed by 10% to cool down the CPU
        new_fan_speed = min(current_fan_speed + 10, 100)
        set_fan_speed(new_fan_speed)
    elif cpu_temp < MIN_TEMP and current_fan_speed > MIN_FAN_SPEED:
        # Decrease the fan speed by 1% if the temperature is below the minimum threshold
        new_fan_speed = max(current_fan_speed - 1, MIN_FAN_SPEED)
        set_fan_speed(new_fan_speed)

    # Wait for 60 seconds before checking the temperature again
    time.sleep(60)

# vim /etc/systemd/system/fan_control.service :

[Unit]
Description=Fan Controller Service
After=network.target

[Service]
Type=simple
User=root
ExecStart=/usr/bin/python3 /usr/local/bin/fan_control.py
Restart=on-failure

[Install]
WantedBy=multi-user.target

Set file permissions:
# chmod 755 /usr/local/bin/fan_control.py
# systemctl daemon-reload
# systemctl enable fan_control.service
# systemctl start fan_control.service
# systemctl status fan_control.service

Bugs: I’ve come across one bug where the script won’t spin down both fan zones, but killing the script and running it a second or third time seems to work. I suspect this comes from sending IPMI commands too fast, so I’ve added sleep delays, but because it’s an intermittent issue, who knows if that fixed it.

Improvement ideas: it would be great to add temperature monitoring for the other components (HDDs, etc.). But in my observation, if I can keep the CPU cool, the rest of the components are okay.

To watch the status, check IPMI or tail syslog:

Temperature readings all green
Fan Speed readings around 3000RPM
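Besides the IPMI web interface, you can spot-check fans and temperatures from a shell with the same ipmitool queries the script relies on:

# Current fan speeds (RPM) as reported by the BMC
ipmitool sdr type fan

# Current temperature sensors
ipmitool sdr type temperature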

# tail -F /var/log/syslog | grep -E 'speed|temp'

Fan speed adjusted to 4% - 0a
Current CPU temperature: 38°C
Fan speed adjusted to 3% - 07
Current CPU temperature: 39°C
Fan speed adjusted to 2% - 05
Current CPU temperature: 39°C
Fan speed adjusted to 1% - 02
...
Current CPU temperature: 48°C
Current CPU temperature: 50°C
Current CPU temperature: 52°C
Fan speed adjusted to 11%
Current CPU temperature: 51°C
Current CPU temperature: 48°C
Current CPU temperature: 46°C

That was set up last night.

When I woke up this morning, the fans were running at a pleasant hum of 1800 RPM.

He who blesses his friend with a loud voice early in the morning, It will be counted as a curse to him. – Proverbs 27:14 LSB

BackBlaze B2 vs AWS S3 Intelligent Tiering | NAS Backups
Fri, 31 Mar 2023 – https://b3n.org/b2-vs-s3-nas-backup/

Good morning. Today is March 31st. 🌐 It’s World Backup Day. This is a good day to review your backup strategy. I thought I’d share a cloud backup experiment I’ve been running for 9 months. I hope you find it helpful.

I needed a cloud service for my offsite backups… the two prominent services are BackBlaze B2 and Amazon S3. I wasn’t sure which would be cheaper. But there’s one way to find out.

For the last 9 months I’ve been backing up my TrueNAS data to both BackBlaze B2 and AWS S3.

About the NAS Data

My TrueNAS Data that I’m backing up to the cloud consists of the following:

  • 1.39 TB Archive data (mostly a collection of data archived in .7z compressed files organized by year, with the last couple of years uncompressed)
  • 1.65 TB SMB share (my Automatic Ripping Machine target is here, plus lots of small documents and files).
  • 0.46 TB NFS share for Proxmox Backup Server. This is lots of small backup files that can reproduce VM and container block storage, and it probably changes frequently.

BackBlaze B2 Target

BackBlaze is a simple $5/TB/month with no ingress fees (here’s the B2 Pricing page) and small egress fees to restore. Pretty straightforward.

There are also transaction fees (based on the number of API calls), but these are pretty minimal when using the rclone --fast-list option, so you can almost ignore them.
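TrueNAS drives rclone for you, but if you were running it by hand, a sync along these lines would keep the API-call count down (bucket name and paths here are hypothetical):

# fewer, larger list calls means fewer billable B2 transactions
rclone sync /mnt/tank/archive b2:my-backup-bucket/archive --fast-list --transfers 8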

Amazon AWS S3 Target

Amazon S3 is not that simple… here’s the S3 pricing page. You’ve got different storage tiers:

  • S3 Standard: $23/TB/month
  • S3 Standard-Infrequent Access: $12/TB/month
  • S3 Glacier Instant Retrieval: $4/TB/month
  • S3 Glacier Flexible Retrieval: $3.60/TB/month
  • S3 Glacier Deep Archive: $0.99/TB/month
  • S3 Intelligent-Tiering, which automatically moves your data into the best tier based on usage patterns.

Egress from S3 is atrocious; in addition to retrieval fees, data transfer fees to download your data to a location outside Amazon are $90/TB. However, the likelihood of losing two local copies is slim, and if I had a catastrophic event that took out my laptop and local backups, $90/TB would be small potatoes in the grand scheme of things.

S3 Glacier Deep Archive Tier

The Glacier Deep Archive tier at $0.99/TB/month is cheaper than BackBlaze’s $5, but there is a minimum storage commitment of 180 days (change or delete a file early and you’ll pay the remainder upfront) and a retrieval delay of up to 12 hours. Plus, the API fees can add up quickly (especially with lots of small files).

At first, I thought I’d back up my archive data (which rarely changes) to Glacier Deep Archive. I got a huge bill: the Deep Archive tier charges $0.05 per 1,000 requests (PUT, COPY, POST, LIST). When you have a lot of files, this gets expensive fast, so I shut it down. I keep most older years compressed, but I like to keep the last couple of years uncompressed, and that alone can end up costing hundreds, if not thousands, of dollars per month in list requests.

But you can use Intelligent-Tiering, which has more reasonable request charges and still lets you lifecycle files into the Glacier Deep Archive tier after 180 days, bypassing the Deep Archive list charges.
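From the AWS CLI, that archive policy looks roughly like this; a sketch with a hypothetical bucket name and configuration ID (the console exposes the same setting):

aws s3api put-bucket-intelligent-tiering-configuration \
  --bucket my-nas-backup \
  --id deep-archive-after-180d \
  --intelligent-tiering-configuration '{
    "Id": "deep-archive-after-180d",
    "Status": "Enabled",
    "Tierings": [
      { "Days": 180, "AccessTier": "DEEP_ARCHIVE_ACCESS" }
    ]
  }'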

Nine-Month Experiment

Intelligent-Tiering automates storage tiers and smooths out your costs on S3 a bit, and I was curious how it compared to B2. My theory was that BackBlaze would be cheaper initially, but AWS would become less expensive within six months as things were moved into Deep Archive. For my data set, it turns out the ROI for using AWS S3 is so far out that it’s probably not worth it.

Storage Policy Configuration

  1. For AWS, I set a policy on the Intelligent-Tiering configuration to move files to Deep Archive after 180 days.
  2. For both AWS and BackBlaze, I enabled versioning and expired non-current versions after 30 days (see the sketch below). This gives me 30 days’ worth of immutable backups to help protect against corrupted data due to a bug, mistake, or compromise of the TrueNAS system.
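On the S3 side, the version-expiration piece is a standard lifecycle rule applied with aws s3api put-bucket-lifecycle-configuration; a minimal sketch (the rule ID is hypothetical, and B2 exposes an equivalent “keep prior versions for this many days” lifecycle setting):

{
  "Rules": [
    {
      "ID": "expire-old-versions",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": { "NoncurrentDays": 30 }
    }
  ]
}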

TrueNAS Configuration

In June of 2022, I set up TrueNAS’s Cloud Sync Tasks (which use rclone under the hood) to back up to both B2 and S3 once a week, on staggered days.

Results

Month     AWS S3   BackBlaze B2   AWS Running   BackBlaze Running
Jun-22     31.11           2.34         31.11                2.34
Jul-22     37.92          11.53         69.03               13.87
Aug-22     30.65          12.21         99.68               26.08
Sep-22     23.69          11.65        123.37               37.73
Oct-22     19.97          13.26        143.34               50.99
Nov-22     18.04          12.55        161.38               63.54
Dec-22     16.81          12.82        178.19               76.36
Jan-23     16.48          13.95        194.67               90.31
Feb-23     15.72          14.75        210.39              105.06

AWS S3 Intelligent-Tiering vs BackBlaze B2 USD costs with NAS data

9 Month Running Cost: $210.39 (AWS) vs $105.06 (B2).

Even at nine months, the monthly cost for S3 Intelligent-Tiering is still higher than B2’s, and the running cost for S3 is double that of B2. At some point, perhaps after 5 or 10 years, there may be an ROI for using S3 instead of BackBlaze B2, but perhaps not.

The monthly cost for S3 hasn’t even dropped below B2’s yet, and it may never.
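Some back-of-the-envelope arithmetic (my own rough estimate, not a measured result): the nine-month running gap is $210.39 - $105.06 = $105.33, and by Feb-23 the monthly difference is only about a dollar ($15.72 vs $14.75) while B2’s bill keeps growing with the data set. Even if S3’s monthly cost eventually came in, say, $3 cheaper than B2’s, recovering $105.33 would take $105.33 / $3 ≈ 35 more months, roughly three additional years.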

Any savings with S3 would be negated the first time I have to restore. And if I do have to restore, it will take hours to unfreeze files from Glacier Deep Archive. Also, if I ever decide to reorganize my file structure, it’s going to get expensive, resetting everything back to S3 Standard pricing for six months.

Now, enough of my data changes to keep a good portion of it in the S3 Standard tier. I could change how my data is stored and optimize it for lower S3 storage costs, but optimizing takes time, and if I factored in the value of that time, it just wouldn’t be worth it. The simple option of using B2 is the best choice in my situation.

Other Considerations

A few other things you may want to consider.

  • AWS replicates S3 data across three Availability Zones within a region, which I assume means three physically separate facilities. BackBlaze stores “multiple copies” of your data, but as far as I can tell, there’s no promise they’re spread across multiple zones. I think S3 is more resilient to disaster than B2.
  • BackBlaze seems to be less woke. I haven’t seen any posts or social media promoting a leftist agenda, as I see all over the place from Amazon. I appreciate a company with the discipline to stay focused on its mission.
  • Potential egress costs. AWS egress fees are excessive and designed for vendor lock-in. BackBlaze has reasonable egress fees.
  • BackBlaze is simple. If something were to happen to me, just about anyone could figure out how to restore data from B2 using TrueNAS. AWS is also a very standard setup, but not everyone will know how to restore files that have been lifecycled into deep archive.
  • BackBlaze requires less commitment. If you change your mind or want to re-organize your files, it’s more forgiving than AWS.
  • AWS is not entirely honest with its uptime status. I experienced an AWS outage for a while, and nothing on the AWS status page indicated they were down, but there were plenty of comments on Reddit and HN.

Your mileage may vary; the cost depends on the type of data being stored and how often it changes. In my case, B2 seems to be the simpler and better option.

Oh, and Happy World Backup Day!

A prudent man sees evil and hides, But the simple pass on, and are punished. – Proverbs 22:3 LSB

Apple Silicon MacBooks Review
https://b3n.org/apple-silicon-macbooks-review/ (Sat, 25 Mar 2023)
Our Dell Laptops were EOL (End of Life) for BIOS security updates. This time, I refreshed with MacBooks.

I used to use Macs, so they’re not new to me. But a decade ago, I ran into issues with a lemon MacBook and Apple being unable to provide support in North Idaho; it takes all day to drive to the nearest Apple store and back. After they had me drive down to Spokane twice, eating up two Saturdays, I decided to use Dell. With next-day business support, Dell will at least send a tech to your house, which is a lot better than driving to Spokane, although Dell seems to be having trouble staffing that service now. Of course, things weren’t all rosy with Dell: after I had them come out to replace a motherboard, hackers got into the Dell support database and tried to scam me.

Anyway, our local Best Buy can now service Macs, and has a generous return policy, so support in Idaho is no longer an issue.

Advanced Data Protection

I considered Macs this time mainly because of my theme to simplify and secure, combined with Apple’s recent announcement of iCloud Advanced Data Protection: an optional feature that gives you the benefits of the cloud, but in such a way that you control the encryption keys and Apple has zero knowledge of your data. Well, most of it. I won’t say Apple is perfect; there was the CSAM scanning proposal, which Apple decided not to implement after backlash, but the fact that they even considered it is concerning. Still, of the major consumer cloud providers (Google, Amazon, Microsoft, and Apple), Apple is the only one that has put real effort into end-to-end encryption and privacy.

Of Amazon Alexa, Google Assistant, Apple Siri, and Microsoft Cortana, only Siri can be configured to use on-device inference, sending no data to the cloud.

According to the Washington Post, the FBI is Deeply Concerned about Advanced Data Protection:

“This hinders our ability to protect the American people from criminal acts ranging from cyber-attacks and violence against children to drug trafficking, organized crime and terrorism,” an FBI spokesperson told the publication. “In this age of cybersecurity and demands for ‘security by design,’ the FBI and law enforcement partners need ‘lawful access by design.'”

Well, hindering a government’s ability to spy on its citizens is not a bad thing.

I moved Kris to the M2 MacBook Air and myself to the M1 MacBook Pro (there’s a newer M2 MacBook Pro now). I equipped both with 16 GB of memory and a 512 GB SSD. Despite the Air being thinner and fanless, the performance difference between the two isn’t perceptible. I mainly opted for the Pro to drive multiple external monitors.

Come to think of it, I can’t recall the fan on my MacBook Pro ever engaging.

My thoughts on the current Macs

After using them for several months, here are my observations:

  1. The M1/M2 ARM processor is a RISC (Reduced Instruction Set Computer) processor. Since I was a kid, I’ve been fond of the RISC strategy and was disappointed when almost the entire industry went CISC (Complex Instruction Set Computer), so I’m happy it’s back; it’s a throwback to the PowerPC. And it yields impressive battery life: both the M2 MacBook Air and M1 MacBook Pro get an unexaggerated 15-20 hours per charge. This is the first modern laptop where I’ve been happy without a dual-battery hot-swap system (those of you who had a ThinkPad with Power Bridge, an elegant laptop design made for a more civilized age, know what I mean).

    Anyway, on the MacBooks, Kris and I get multiple days out of a single charge.

Not once have I had to interrupt my work to find an outlet and put my MacBook on the charger.

  2. Mac OSX has Intel emulation (Rosetta 2) for older Intel programs, so they run fine, but at the cost of drawing energy like an Intel CPU. In practice, I’ve only run across one program that still requires Intel.

  3. Windows 11 for ARM running on Parallels Pro works better virtualized than Windows did on my last bare-metal Dell laptop. I can pass USB and Bluetooth devices to it, and Microsoft has officially sanctioned this configuration. I activated Windows 11 Pro ARM using an old Windows 8 Pro retail key from a desktop I had decommissioned. Parallels also supports hypervisor features such as snapshots and linked clones, and you can run Windows apps alongside Mac OSX apps.

    I was surprised to find ARM binaries for just about every Windows program, including the Microsoft Office suite. For the few programs that require Intel, Windows has its own emulation, so even x86/AMD64 applications can run under Windows ARM in Parallels (at the cost of consuming energy like an Intel chip). In practice, I’ve found very few applications that aren’t compiled for ARM.

  4. Too Cold. 🧊 All my Dell laptops make nice lap warmers in the winter. But when it’s -10°F outside and you get a fire going and then grab your laptop, you expect something warmer than a freezing-cold metal slab. You can’t warm your hands by the laptop vents because the CPU isn’t hot and the fan’s not blowing. The problem is the ARM processor is too efficient; it doesn’t waste heat. Apple should put in heating coils.

  5. Durability. I think Dell Latitudes have a much better chance of surviving a drop than a MacBook. MacBooks are built like an AR-15, designed to be taken care of; Dells are assembled like an AK-47, which can take a little more abuse.

  6. Polish. Apple pays attention to the little things; I first noticed that you can open the screen with one hand. If you use multiple Apple devices, the integrated experience is second to none. An iPad becomes a second monitor for a MacBook; an iPhone can become an extra webcam; you can read and respond to SMS messages from your computer and hand off Safari browsing between iPhone, iPad, and MacBook. The clipboard is synchronized across all your devices, which makes MFA authentication easier.

    I love composing SMS messages on the computer instead of my phone.

  7. I like the 16:10 display, even if it has a notch; the notch isn’t noticeable in dark mode. It feels like the screen is designed to use the space maximally. 16:10 is also handy when running Windows virtualized in 16:9 fullscreen mode, since it lets you keep the OSX menu bar on top.

  8. The keyboard is terrible. It’s a little nicer than a Dell Latitude’s but worse than a ThinkPad’s. I normally use a mechanical keyboard, so I think all laptop keyboards are bad.

  9. The audio is decent; I enjoy how full it sounds. FaceTime calls also work well with the built-in microphone, with no need for an external headset or microphone for noise cancellation.

  10. Docking into multiple external monitors works. I had read that the low-end M1 MacBook Pro can’t support three external monitors, so I was considering the higher-end one, but it was super pricey. Then Nick told me his worked with three monitors. Indeed, I installed the DisplayPort drivers, docked the M1 MacBook Pro right into my Dell docking station, and all three screens worked. Maybe the limitation only applies when not using DisplayPort. 🤷‍♂️

  11. Bluetooth devices work well. On several occasions, my Dell Latitude’s audio quality degraded while on a call, and I often had to switch to my iPhone to continue MS Teams calls until I rebooted. I don’t necessarily think this is a Windows problem; it’s likely some Dell/Windows Bluetooth driver compatibility issue, or perhaps the CPU can’t keep up. The MacBook works perfectly (possibly because it has a newer Bluetooth chip).

Backups (watch out for Optimized settings)

My rule for backups is to have at least three copies: the original on the MacBook, one offsite, and at least one immutable. I have two backup destinations for our MacBooks: (1) everything is synced to iCloud, and (2) I back up to my TrueNAS server using Time Machine. TrueNAS gives us immutable snapshots and robust versioning.
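Time Machine can be pointed at a NAS share from System Settings, or from the terminal with tmutil; a minimal sketch, assuming a hypothetical TrueNAS SMB share and user:

# the -a flag appends a destination rather than replacing the existing one
sudo tmutil setdestination -a "smb://ben@truenas.local/TimeMachine"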

There are two terrible default settings I changed for Time Machine backups: Apple Photos has an option to store only optimized photos locally, and iCloud has an option to optimize Mac storage. Both save local drive space by keeping infrequently accessed data in iCloud only and pulling it down to your Mac when needed. The problem is that Time Machine can’t back up what isn’t on the disk. I think it’s too risky for the only copy of some of my data to be the one in iCloud, so I turned both settings off to allow Time Machine to do full backups.

Alternative: I haven’t come close to running out of space, but if I do, I’ll probably set up a virtual Mac in a Proxmox VM with a large drive, with iCloud and Photos optimization off, for the purpose of backups, while our laptops keep the optimized Apple Photos and iCloud settings.

Security

  • OSX has built-in antimalware (similar to Windows Defender).
  • I was impressed that when doing things on the command line, OSX asks whether I want to give the Terminal access to certain areas, especially my user folder.
  • Overall, OSX asks for permission when an app wants to access data in a location it wouldn’t usually use.
  • Local SSD storage is AES encrypted.
  • With Advanced Data Protection, most iCloud data is encrypted in such a way that not even Apple can access it. The private keys are stored on your devices, and you can set up a recovery code and recovery contacts in the rare event that you lose all your devices.
  • You can associate YubiKeys with your iCloud account, requiring MFA for specific actions such as adding a device to your account.
  • The fingerprint reader key is a fast way to unlock your screen; pressing it also instantly locks the screen (a feature my last Mac was missing).
  • Passkeys are a great alternative to MFA.

Optimizing MacBook Battery Longevity

The MacBook is intelligent about maintaining the lifespan of its battery. Lithium-ion batteries last longest when charged part-way and wear out faster the closer they sit to full or empty. So when you plug your MacBook in for the night, OSX charges it to 80% and then holds off until right before you need it (it learns your schedule). For me, OSX seems to aim for a full charge by 6 am.

The battery is rated for 1,000 charge cycles. There’s no reason to wait for a complete discharge before charging it up; that’s not how battery cycles work. Running the battery down to 75% and then charging it to 100% is equivalent to a quarter cycle.

I’ve found that a 50% charge is more than enough to get me through a day, so I won’t even put it on the charger for the night unless the battery is below 70%.
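As rough arithmetic (my own estimate, not an Apple spec): if a typical day takes the battery from 100% down to 70%, that’s 0.3 cycles per day, so 1,000 rated cycles work out to about 1,000 / 0.3 ≈ 3,300 days, or roughly nine years of daily use.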

What about Linux?

Around 20% of visitors to this site come from Linux. I know some of you are disappointed I didn’t go with Linux, and so am I. I’ve tried Ubuntu/Kubuntu, Fedora, Fedora KDE, and Debian, and I just run into little issues from time to time: the audio driver skips or crashes, or the network card randomly quits working after sleep or docking/undocking.

Some of the software hasn’t kept up. KMail crashed on me several times, and Evolution was very slow; it took several days to do a full IMAP sync, crashing repeatedly, and was so sluggish it was practically unusable. Compare that to Apple Mail (even Outlook is not that slow). Systemd sometimes hangs on shutdown for several minutes waiting on something.

Despite Linux being my platform of choice for servers, I think it’s a little less stable (at least on my hardware) than it should be to use as a daily driver. But I do run Linux inside Parallels.

Overall, I think MacBooks provide a good computing platform. The strongest feature of the Apple ecosystem is the ability to use an integrated cloud provider while still keeping most of your data end-to-end encrypted.

RISC architecture is going to change everything.

eof

Faster Cloudflare Worker for Plausible Analytics | ~Zaraz
https://b3n.org/cloudflare-worker-for-plausible/ (Sat, 18 Mar 2023)
Shave 100ms off the Plausible Analytics POST response, making it as fast as Cloudflare Zaraz GA4

I moved over to Plausible Analytics from JetPack Stats. I’d never been happy with (1) the extra 3rd-party DNS lookup, (2) the large JavaScript payload, (3) cookies, and (4) sending data to a 3rd party.

Cloudflare has an excellent solution for Google Analytics GA4 by proxying requests through Zaraz. I tried it; it’s fast and easy to implement. But GA4 still sets cookies and sends data to a 3rd party, which didn’t quite meet my goals.

I figured I could use Zaraz with self-hosted Plausible Analytics. Plausible is a web analytics tool designed with privacy in mind. I never complied with the EU cookie consent banner anyway (I’m not in the EU), but I wish all sites would go cookie-free. Not because I don’t like cookies. 🍪 I do. But so they can get rid of that cookie consent banner!

As it turns out, Plausible doesn’t support Cloudflare Zaraz. But the Plausible documentation describes a workaround to proxy Plausible through Cloudflare using Cloudflare Workers. This saves the extra DNS lookup, but the problem is the worker still waits on the origin server: I measured ~140ms from the Eastern US.

(Screenshot: Plausible Cloudflare Worker response timing)

Now, it’s not render-blocking, so this isn’t horrible, but it’s the slowest item on my site. Visitors from the US West get it fast, but it’s 100-200ms from the US East and probably 500-600ms from outside North America.

But we can make the Plausible worker as fast as the Zaraz GA4 solution.

The problem is the communication goes like this:

async function postData(event) {
    const request = new Request(event.request);
    request.headers.delete('cookie');
    return await fetch("https://plausible.io/api/event", request);
}
  1. 👩‍💻 Client → CF Worker: Here’s the POST data. Waiting for a response.
  2. ☁️ CF Worker → Origin: Here’s the POST data. Waiting for a response.
  3. 💾 Origin → CF Worker: “202: ok”
  4. ☁️ CF Worker → Client: “202: ok” (143ms response time)

But we only care about getting the POST data; the response doesn’t matter. We can change the worker to send a 202 before it knows the response from the origin server.

I was going to rewrite the function myself, but instead I asked ChatGPT to do it for me.

(gave it the original worker script)

ChatGPT made one trivial mistake: the return status should be 202.

async function postData(event) {
    const request = new Request(event.request);
    request.headers.delete('cookie');

    const response = new Response('OK', { status: 202 });

    event.waitUntil(async function () {
        await fetch("https://plausible.io/api/event", request);
    }());

    return response;
}
  1. 👩‍💻 Client → CF Worker: Here’s the POST data. Waiting for a response.
  2. ☁️ CF Worker → Client: “202: ok” (9ms response time)
  3. ☁️ CF Worker → Origin: Here’s the POST data. Waiting for a response.
  4. 💾 Origin → CF Worker: “202: ok”
(Screenshot: asynchronous-response Plausible Cloudflare Worker timing)

Essentially, a Cloudflare Edge location can respond in 9ms instead of waiting 143ms to go all the way to Sandpoint and hit the origin server.
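To check the improvement yourself, you can time the endpoint with curl; a sketch, assuming the worker is routed on this domain (the JSON fields mirror Plausible’s event payload, to the best of my knowledge, so treat them as an approximation):

curl -s -o /dev/null -w 'total: %{time_total}s\n' \
  -X POST https://b3n.org/api/event \
  -H 'Content-Type: application/json' \
  -d '{"n":"pageview","u":"https://b3n.org/","d":"b3n.org"}'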

Here’s the complete modified Cloudflare Worker Script:

const ScriptName = '/js/script.js';
const Endpoint = '/api/event';

const ScriptWithoutExtension = ScriptName.replace('.js', '')

addEventListener('fetch', event => {
    event.passThroughOnException();
    event.respondWith(handleRequest(event));
})

// Route requests: serve the analytics script, or accept event POSTs.
async function handleRequest(event) {
  const pathname = new URL(event.request.url).pathname
  const [baseUri, ...extensions] = pathname.split('.')

  if (baseUri.endsWith(ScriptWithoutExtension)) {
      return getScript(event, extensions)
  } else if (pathname.endsWith(Endpoint)) {
      return postData(event)
  }
  return new Response(null, { status: 404 })
}

// Serve plausible.js from the edge cache, fetching from the origin on a miss.
async function getScript(event, extensions) {
    let response = await caches.default.match(event.request);
    if (!response) {
        response = await fetch("https://plausible.io/js/plausible." + extensions.join("."));
        event.waitUntil(caches.default.put(event.request, response.clone()));
    }
    return response;
}

// Respond 202 immediately; forward the event to the origin in the background.
async function postData(event) {
    const request = new Request(event.request);
    request.headers.delete('cookie');

    const response = new Response('OK', { status: 202 });

    // waitUntil keeps the worker alive to finish the origin POST
    // after the response has already been returned to the client.
    event.waitUntil(async function () {
        await fetch("https://plausible.io/api/event", request);
    }());

    return response;
}

I can’t think of any downside to this. If the POST is not successful, the client isn’t going to care anyway.

A millisecond is worth a fortune. — Eric Kirzner
