
Hello everyone, today I've decided to post a curated list of some of the more interesting and impactful advances and stories in tech from the last month or two. For each one I'll provide a brief summary along with a link to the source article, so you can read the full story and verify that this isn't fake news. Leave a comment below if you like this kind of content; if not, I'll return to more traditional single-topic blog posts going forward. The posts below are in no particular order.
Bitcoin Power Consumption Jumped 66-Fold Since 2015, Citi Says
Key points:
- Increased power consumption will result in higher scrutiny, especially in relation to climate change.
- The bitcoin network uses around 143 terawatt-hours annually (for scale, this is greater than the entire annual electricity consumption of Argentina).
- Bitcoin mining has exploded in China where fossil fuels are still the main source of energy generation.
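As a quick sanity check on the headline numbers, the 66-fold growth implies the network was using only a couple of terawatt-hours back in 2015. Both input figures below come from the points above; the 2015 value is just the arithmetic consequence:

```python
# Back-of-the-envelope check on the figures above:
# 143 TWh/year now, a 66-fold increase since 2015.
annual_twh_2021 = 143           # Citi's estimate for the Bitcoin network
growth_factor = 66              # 66-fold increase since 2015
annual_twh_2015 = annual_twh_2021 / growth_factor
print(f"Implied 2015 consumption: ~{annual_twh_2015:.1f} TWh/year")
# Implied 2015 consumption: ~2.2 TWh/year
```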
Cloudflare Pages is now Generally Available
Key points:
- Cloudflare has released a build-and-deploy platform for static sites, closer in spirit to Netlify or Vercel than to a general-purpose CI server like Jenkins.
- In essence, you link a Cloudflare Pages project to a GitHub repository; every push triggers a build and redeploys your site on Cloudflare.
- With Cloudflare Pages you get Cloudflare's built-in security features and hosting on their network, which spans over 100 countries.
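The push-to-deploy flow above can be sketched in a few lines. This is purely an illustrative model (the `Repo` class and hook names are made up, not Cloudflare's or GitHub's actual API): a repository notifies registered hooks on every push, the way GitHub notifies Cloudflare Pages, which then rebuilds and publishes the site.

```python
# Illustrative model of push-triggered deployment (not a real API):
# hooks registered on the repo fire on every push, standing in for
# the GitHub webhook that kicks off a Cloudflare Pages build.
class Repo:
    def __init__(self):
        self.commits = []
        self.hooks = []          # callables invoked on every push

    def push(self, commit_msg):
        self.commits.append(commit_msg)
        for hook in self.hooks:
            hook(commit_msg)     # notify each listener, like a webhook

deploys = []
repo = Repo()
repo.hooks.append(lambda msg: deploys.append(f"built and deployed: {msg}"))

repo.push("Update homepage")
print(deploys[-1])  # built and deployed: Update homepage
```

In the real setup there is no code to write at all: you configure the build command and output directory once in the Cloudflare dashboard, and pushing to your production branch does the rest.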
Linux repo hits 1,000,000 commits
Key points:
- Nothing really, I just think this is cool.
GPT-3’s free alternative GPT-Neo is something to be excited about
Key points:
- The “Transformer” family of models scales remarkably well: performance keeps improving as you add more training data and computing power.
- The GPT-3 architecture is publicly documented, but OpenAI has not released the trained model or its training data.
- The EleutherAI team created an open-source dataset called The Pile, an 825GB collection of text designed for training large language models.
- GPT-Neo outperforms GPT-3 models trained with the same amount of data on top benchmarks.
There is a huge amount of content out there, and finding interesting articles in among the listicles can be an arduous task. In my next blog post I'll dive deeper into where I source my news and share more interesting articles.