trying new software
I haven't been motivated to write anything lately, but I guess I can give an update on what software I am currently trying or going to try:
- neovim, to replace vim. I chose it because the codebase and development are supposed to be cleaner and less dependent on one person pulling in patches. liking it so far; it also has a few small features I've been looking for in vim, namely the ability to resize panes using the mouse. this may have already been possible in vim, but it has never worked for me.
- neomutt, saw it when I was looking up mutt and chose it because it offers some plugins built-in. once I configure it I may replace seamonkey with that and a different internet browser. first issue I see with mutt/neomutt is lack of mouse support, but I'll still play with it for a while.
- sway (wayland compositor). I haven't really had a chance to try this yet but I want to see how well wayland works, and I may switch to it from X.
- ConnMan, to replace NetworkManager. it's definitely light and apparently it supports USB tethering and bluetooth PAN, so I'll give it a shot.
I also downloaded some ISOs to play with in qemu:
- Void Linux -- haven't run it yet
- TempleOS -- tried it, it works but the 100% sound volume scared me
- ReactOS -- it won't boot properly; I'll have to look at the error again
- Gentoo -- I used this briefly years ago but haven't accustomed myself to it at all. I want to install it with musl and busybox, possibly also a hardened profile.
- Plan 9 -- haven't run it yet
aside from that, I had a very spiritual dream last night so I have decided to keep a dream/meditation log now. I used to keep a dream log years ago but stopped due to lack of interest. hopefully I keep my interest this time, because I feel I may be able to learn some things from my experiences. if I make any notable discoveries I may write about them here.
"Learning how to learn"
here's a verbatim essay I wrote in response to the common misconceptions held by many Tor users:
You probably see advice published everywhere – guides and tutorials and lessons. People who claim to have your best interests at heart. Many people do, but at the same time many people don't. And even the people who do can make mistakes. If you don't do so already, you need to learn how to think like a scientist: always sceptical, but never driven by fear. Being able to think for yourself, weighing all information you come across for validity, is a necessary asset that people seem to overlook in their quest toward activism.
Know what you're using
You installed Tor because it's nice and secure. Do you know exactly how it works though? Do you know what happens if you use it wrong?
I see these technologies get thrown around all the time in privacy-related conversation: Tor, VPN, PGP, Tails. And for the aspiring hackers, Kali comes up quite often. All these things are fine, but people discover them more out of haphazard curiosity than anything. They know what these things are, they know that others tell them to use these things, but they don't often know why people talk about them so much.
Read up about these subjects. You don't have to do a full research project on them, but you should be able to summarise to yourself what everything does and why it works. Wikipedia is a great resource; it's concise and you can always branch out to learn more if you're interested. Once you know exactly what these technologies were made for, you will be able to utilise them intelligently.
I can summarise a few common misconceptions: Tor's primary purpose is to provide a secure proxy to the Web, while I2P's is to provide an anonymous network that replaces the Web. A commercial VPN is for privacy, while Tor is for anonymity (this article explains their differences nicely).
Tails and Kali are simply customised Linux distributions (these two happen to be Debian-based), meaning that I could take Arch Linux (or your favourite distro) and replicate the functionality of either, after I take the time to configure it to my liking. The reason people use Tails, Whonix, or Kali is because they trust the developers to make a system that meets their needs, and they are incapable or unwilling to configure their own system. Ultimately, the choice of operating system is up to you; there is no "best" operating system, so try various systems out until you find your match.
Don't believe everything you see. Professionals make mistakes, amateurs make mistakes, you and I make mistakes. Even with these guides, you should use your own judgment and filter out what seems logical. I wrote this in hopes that I was making sense, in hopes that my logic was sound and worth reading. But, I can always miss important things, and I'm here to learn just as everyone else is. After reading anything, you should cross-reference with other information if you're unsure about certain points, and ultimately you should test the information against your own knowledge to see if it fits in with what you believe.
Knowledge evolves; people go to sleep believing in one cause, only to wake up believing in something else. The best any of us can do is follow what our heart says, keep our wits about us, and hope that our current beliefs will lead us on a better path.
A good leader shows power by being motivated and experienced, not by being deceptive and forceful. You gain followers by relating with them, by sharing common core values, and by educating them. People should follow you because it is their decision to do so, because they actually wish to listen to you. If someone leaves you, do not try to pull them back; it only means that they felt your group was not the best fit in terms of ideals, goals, or methods. If everyone leaves you, you may want to ask why and adjust your actions based on the response. Leaders are people too, and they're bound to make mistakes, but a good leader (and a well-formed group) can recover from these mistakes quickly and easily.
With that said, leadership is bound to change. It's natural, it's seamless (in a mature group, people just know who's "in charge" simply by the way they present themselves in the group), and it fosters new ideas and a different way of approaching issues. When starting a group, don't worry about who's head; that will come naturally and by consensus. Just focus on what you, as a group, need to do, and take everyone's opinions and suggestions into account. There should be an equal level of trust placed on all group members, and if the group simply cannot trust someone then it should make a decision on whether removing the person from the group is the best move. Feelings may be hurt, but a good group is resilient to this sort of friction. The group will carry on its business and wait for the conflict to pass.
Most importantly, never trust someone solely because they are a figurehead. There is a strong difference between a figurehead and a true leader, and more often than not, people will grow to oppose a figurehead once they begin learning the truth about him. A figurehead is usually defaulted into power – either by status or by money or heritage. In contrast, a leader starts out as an equal and is brought into high esteem by his peers. Both leaders and figureheads are influential, but figureheads will hardly have your best interests at heart. Figureheads will do what they need to retain power, and they will trick others into believing whatever they have to say. They rely on the power of emotion in order to convince others that certain views are correct. And once they have a following, they can dispatch whatever lies they wish, knowing that their followers will eagerly eat it up.
If you think this part sounds like a bit of an overreaction, I apologise, but I have seen this cult-like pattern in quite a few groups, namely the social justice movement. Everyone in the movement is bound together by a common emotional appeal: they are all minorities (real or imagined) and they seek safety in their circle by rejecting outsiders and playing the role of a victim. This is a toxic, spiralling attitude that only strengthens the power of the group, and the worst part is, people who seek acceptance see this movement and think they are doing the "right thing" by promoting minorities. So, they join in, finally feeling a sense of acceptance, and they learn from others in the movement that the patriarchy is the cause of all suffering in the world. A logical person would dismiss this claim and assign the blame to real issues (sexism and racism are issues, but not in the ways that the social justice movement claims), but once you have given someone hope and reassurance, you can make them believe whatever you wish.
are passwords the right solution?
[I have lacked motivation to write anything lately, but this week marks the beginning of my spring college semester, so I figured I'd force myself back into a schedule.]
a month or two ago I read an article by Alec Muffett, where he attempted to defend password authentication as possibly the only viable online security solution. I even sent him an E-mail asking him to reconsider some of his thoughts toward passwords:
[...] I came across your opinion on password use and I have to disagree with you. My issue with passwords (as they are currently widely implemented) is that the password has to be sent to the server verbatim, and it is up to the server to safely handle this password (hashing it and making sure memory where passwords are handled is promptly cleared, in case of vulnerabilities in the server that allow reading memory), and it is up to both the user and the server to initiate a secure connection so that password eavesdropping is infeasible. I favour PKI, challenge authentication, and other mechanisms which do not require any transmission of a private key or passphrase over plaintext to the server. This places the burden of security on the user and on the PKI/challenge protocol itself, which I believe to be much safer than having to place the burden on all of the user, server, and transmission medium. Please consider these points and perhaps revise your decision on claiming that passwords as they are used today are a sound security mechanism.
so already, that explains half my stance on passwords off the bat. I do favour PKI -- I use elliptic curve keys for all SSH connections and disable password authentication, and I would use similar authentication for websites if it was an option. to me, it makes more sense to have a file or files tied to each device I own, and should that device be compromised, I can simply log in from another device and revoke the now-insecure key. this allows for finer-grained access control than I would have with passwords: for example, right now I would have to reset my password if I logged in even once on an untrusted public computer. granted, I would have to use a temporary key for a public computer, stored on something such as a USB drive, but at least I wouldn't have to change the key on each device I own. it's a different story if you have a remarkable memory and can memorise random passwords with ease, but a lot of people including myself cannot or will not trust our memory to this task.
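the idea behind challenge authentication can be sketched in a few lines. this toy example (my own illustration, not any real protocol) MACs a random nonce with a shared secret; a real PKI login would instead sign the challenge with a private key and verify against the stored public key, but the important property is the same: the secret itself is never sent over the wire.

```python
import hashlib
import hmac
import secrets

# Toy challenge-response sketch (illustrative only, not a real protocol).
# In a PKI scheme the client would sign the challenge with a private key;
# HMAC with a shared secret is used here only because it fits in the
# standard library, but either way the secret never crosses the wire.

def make_challenge() -> bytes:
    """Server side: generate a fresh random nonce per login attempt."""
    return secrets.token_bytes(32)

def respond(secret_key: bytes, challenge: bytes) -> bytes:
    """Client side: MAC the challenge; the key itself never leaves the device."""
    return hmac.new(secret_key, challenge, hashlib.sha256).digest()

def verify(secret_key: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: recompute the MAC and compare in constant time."""
    expected = hmac.new(secret_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

key = secrets.token_bytes(32)
nonce = make_challenge()
assert verify(key, nonce, respond(key, nonce))
# a captured response is useless against a fresh challenge:
assert not verify(key, make_challenge(), respond(key, nonce))
```

note that an eavesdropper who records the whole exchange learns only a nonce and a MAC, neither of which can be replayed for the next login.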
I have more faith in TFA than I do in plain passwords. to save myself from reiterating ideas I have already typed (again), I will cite my response to the Hidden Answers question "What and how much credentials do you save in KeePassX?" (please note this link is only accessible over Tor or this I2P link):
[W]e should look more into TFA. KeePass supports it to some extent (it combines a password, something you know, with a keyfile, something you have, such as one stored on a USB drive). It allows you a little more time to react to breached security, because even if the attacker has one piece of the TFA, it will take him some time to get the other piece and actually be able to utilise that information.
Weigh the differences:
- Store the passwords in your head. Pros: you can't hack a brain (as far as I know). Cons: unless you have impeccable memory, you will likely formulate smaller, weaker passwords because that's all you can remember. Also, you may choose to reuse passwords more often, which is also unsafe.
- Store the passwords on paper. Pros: you can't hack a piece of paper. Also, if you don't label the passwords (you use something like PasswordCard.org) you can be a little safer in case you lose it / someone sees it. Cons: someone can easily steal that piece of paper, and even if you use the PasswordCard, you have significantly narrowed the possible passwords for the attacker. So, if you lose that card, you're going to want to rush to change all your passwords.
- Store the passwords in a password manager. Pros: Password managers organise your passwords and they require you to only know the master password, leaving you with less to remember. Good managers can also generate strong random passwords for you. Cons: once someone gets the master password, your passwords are all in the open and you're in big trouble unless you set up TFA for all your accounts.
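to illustrate the keyfile idea from above, here is a rough sketch (my own, not KeePass's actual key-derivation algorithm) of deriving a database key from both a master password and a keyfile, so that neither factor alone is enough:

```python
import hashlib
import secrets

# Hypothetical sketch of composite credentials: the database key depends
# on BOTH the master password (something you know) and a random keyfile
# (something you have), then gets stretched with PBKDF2 to slow down
# brute force. Real managers use their own (different) derivation.

def make_keyfile() -> bytes:
    return secrets.token_bytes(32)            # kept on e.g. a USB drive

def derive_database_key(password: str, keyfile: bytes) -> bytes:
    combined = hashlib.sha256(password.encode() + keyfile).digest()
    salt = b"example-fixed-salt"              # a real tool stores a random salt
    return hashlib.pbkdf2_hmac("sha256", combined, salt, 200_000)

keyfile = make_keyfile()
k1 = derive_database_key("correct horse battery staple", keyfile)
assert k1 == derive_database_key("correct horse battery staple", keyfile)
assert k1 != derive_database_key("wrong password", keyfile)
assert k1 != derive_database_key("correct horse battery staple", make_keyfile())
```

an attacker who steals only the password, or only the keyfile, still has a full key search ahead of them.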
TFA/multi-factor authentication is a definite improvement over single-factor authentication, and I have recently decided to enable TFA on every account that offers it. I also store frequently-used passwords in my head and keep the rest in a password manager, whose encrypted database is synchronised to Google Drive and, in effect, to my phone. that way, I have a copy wherever I go and I am as secure as possible within the confines of password management.
I still believe authentication should be given more thought; there are still plenty of organisations that hold very poor regard for security and, out of cost or laziness, resort to practices such as:
- limits on password length or character composition,
- storing passwords in the remote database as plaintext,
- sending back a password over an insecure channel as confirmation of a password reset, and
- requiring a user to add "security questions" to one's account (which is a huge fucking oxymoron; there's nothing secure about security questions).
if all websites agreed that these are poor practices, that would eliminate many of the issues with passwords right away. combine that with mandatory use of a secure channel such as TLS (which many sites thankfully do now), use of server-side password hashing such as bcrypt or Argon2, and user education on proper password formulation (no password reuse, no dictionary words, et cetera) and sites would be sitting pretty while not compromising compatibility with the current security ecosystem. users should know that password managers are as necessary as an Internet browser at this point, and that there are many user-friendly solutions to this already: many Web browsers even have built-in password saving and synchronisation across devices, but of course there are also solutions such as KeePass and LastPass. in fact, these points I just made are in line with Alec's article I linked at the beginning, so we're in agreement there.
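as a sketch of the server-side hashing point: I mentioned bcrypt and Argon2 above, but both need third-party libraries, so the example below stands in with hashlib.scrypt, which ships with Python's standard library and is likewise a deliberately expensive password hash.

```python
import hashlib
import hmac
import secrets

# Sketch of sane server-side password storage: salt per user, a slow
# memory-hard hash, and constant-time comparison. scrypt stands in for
# bcrypt/Argon2 because it is in the standard library.

def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = secrets.token_bytes(16)            # unique per user
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1)
    return salt, digest                       # store both; never the password

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt,
                               n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
assert check_password("hunter2", salt, digest)
assert not check_password("hunter3", salt, digest)
```

a leaked database of (salt, digest) pairs forces the attacker to grind through the slow hash for every guess, instead of reading passwords off in plaintext.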
but what if we want to take a step further and opt for a more secure (but less orthodox) solution? let's look at the list of advantages Alec gave favouring passwords, and compare this to something like PKI:
- passwords are easy to deploy [and so is a PKI solution, at the cost of a temporary stage of switching from passwords to PKI. if done correctly, PKI can be abstracted away from the end-user so that it is actually easier to use than passwords, and users can just click "generate login" to create a random file and save it to an internal (optionally synced) database on-the-fly.]
- passwords are easy to manage [... see above for why PKI would be easy to manage without the user being concerned with implementation.]
- passwords don’t require identity linkage between silos – so your Google username can be different from your Skype username, can be different from your SecretFetishPornSite.com username [... PKI doesn't require this either; simply generate new keys for each site you use.]
- passwords are scalable – you can use as many different ones as you like [... same for PKI.]
- passwords can be varied between silos so that loss of one does not impact the others [... see above.]
- passwords don’t (necessarily) expire [... still same for PKI. advanced users could optionally be allowed to set expiries for keys (just like X.509 allows), and users could at any time revoke a key from a website if it's compromised.]
- passwords are the purest form of authentication via ‘something you know’, and thus ideal for the network or “cyber” environment. [now, this is an actual argument for passwords. PKI is along the lines of "something you have", but for the majority of security-conscious users, so are passwords: they are stored in a database or on a piece of paper (something we have) unless we have remarkable memory (more power to you) or we reuse passwords (which is wrong).]
- you don’t need to pay an intermediary or third-party a surcharge just to get a new password, nor to maintain an old one [... same for PKI.]
aside from the fact that PKI is "something you have" rather than "something you know", it maintains many of the properties of passwords and has the added benefit of being secure by default: secret keys are not transmitted over the wire, and server database compromises would be fruitless since all keys stored are already public. the end result is that both users and server administrators have less to think and worry about. there are still perfectly valid uses for passwords, but I would like for people not to fool themselves into thinking passwords are the universal solution. passwords should strictly be something you know rather than something you stick in a database, and you should only have to memorise a handful of passwords, instead of having to remember one password per mail account, social network account, bank account, forum account, game account, and whatever other accounts you have. passwords should be used in a local context: useful to decrypt your PKI database locally or to unlock your computer/phone quickly.
testing patches made to bashblog script
since I'm using a linux desktop now in place of my windows 8.1 laptop, I can now sanely use linux commands such as gpg and rsync (and the bashblog script itself) to locally sign my posts and transmit them to my server with minimum effort. the original bashblog script relies heavily on GNUisms, especially in the "date" command, and therefore requires a little effort to adapt to alpine linux (which uses busybox rather than coreutils, and so cannot use some of these odd GNU extensions). I wish people would pay attention to compatibility; everyone seems to focus only on GNU and BSD and completely forgets about POSIX standards and requirements.
anyway, this post should be signed (click "PGP signature" near the top of this post to get a markdown version of the article along with the appended PGP signature). I will manually sign previous blog posts as well, for completeness.
busy this week, but I'm using Alpine Linux as my main operating system now, because I've grown tired of using Windows 8.1 on my laptop. I will get that set up with PGP and use it to sign my future (and possibly my past) blog posts.
[two blog posts in one week, but this was on my mind today so I figured I'd publish it now]
another thing I wrote back in 2016:
I use self-signed certificates for all TLS-enabled servers I set up. I know there are freely-available certificates being handed out by authorities, and I know that Let's Encrypt exists. I do not refrain from requesting a CA-signed certificate because I am too lazy to do so; if I wanted, I would ready a CSR right away and get that taken care of. There is only one key reason I abstain from getting my certificate signed by a root signing authority: they are too centralised.
Let's take it from the top: when you install a new operating system, be it Windows or Linux or Mac or a mobile device firmware, the system will most likely come preloaded with root certificate trusts. Who decides to trust these? You haven't hand-picked them, and most people don't even bother looking at the list of authorities their system trusts. Not to mention, most people don't even recognise half the companies and organisations that provide certificates. So, you're at the whim of others – all you can do is hope they have your best interests at heart and that they aren't screwing you over, accidentally or on purpose. In reality, quite a few authorities are lazy with checking if a CSR is valid or just a spoof attempt. As long as they profit, they don't really have your security at heart.
And of course, high-profile companies can and do get hacked. This holds true for certificate authorities as well, and the more widely used a CA is, the more attractive a target it becomes. Also, while this does not apply to everyone, there are workplace, school, and governmental filters that are put in place to censor certain sites deemed unacceptable for access, as well as to detect obscene or unsafe (malware) content. In order for a filter to be effective for these purposes, it must intercept all your traffic; HTTP is straightforward, but HTTPS requires the firewall to strip, analyse, and re-encrypt the content that passes between you and the webserver. In doing so, it relies on each computer in the network having the filter's root certificate installed and accepted. End users may not be fully aware that this is going on, and they may believe they are completely safe because their browser is not warning them at all. This may be acceptable if everyone understands what the filter does and if the filter is configured correctly, but that isn't always the case. My high school's filter made no distinction between valid and invalid certificates, so I could unknowingly be put at risk by accessing an unsafe site that appears to my browser to be safe. Plus, if people wanted to use their own devices on the school network, they were required to install the root certificate even though they were not made aware of the consequences this would have.
So that's one issue: most people don't even explicitly trust the certificates installed on their system. Even if you are aware of the root certificates you trust, you cannot fully trust that they will remain uncompromised and trustworthy in the future. Let's Encrypt was proposed as a solution to the skimpy checks most authorities put their customers through; Let's Encrypt uses a cryptographic proof to verify that you are the owner of your server. While this is a great step in the right direction, it does not change the fact that Let's Encrypt is still a centralised certificate authority, and thus is fallible to the same issues as every other authority.
In contrast to the common SSL/TLS methods of verification, SSH actively encourages you to accept key fingerprints on a per-server basis. There is no metadata attached to this (e.g. "this key is valid on domain.xyz and www.domain.xyz") that can be changed to fool you into accepting the certificate – there's just an arbitrary fingerprint that tells the complete truth about who the server is, straight to the point and no bullshit. PGP operates in much the same way, except that there is certain metadata on keys that describes the owner and their E-mail address. With both of these, the key owner can disclose the fingerprint via whatever channels he deems secure, meaning that an adversary has a harder time compromising this information. TLS certificates have fingerprints as well, but most client software obscures this in favour of a less helpful "this may be insecure" warning.
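the SSH model can be sketched as a trust-on-first-use check: pin the fingerprint of a server's public key the first time you see it, and refuse to connect if it ever changes. the host names and key bytes below are made up for illustration.

```python
import hashlib

# Sketch of SSH-style trust-on-first-use (TOFU) pinning. No CA is
# consulted; trust is established on first contact and any later
# fingerprint change is treated as a possible man-in-the-middle.

known_hosts: dict[str, str] = {}              # host -> pinned fingerprint

def fingerprint(pubkey: bytes) -> str:
    return hashlib.sha256(pubkey).hexdigest()

def check_host(host: str, pubkey: bytes) -> str:
    fp = fingerprint(pubkey)
    pinned = known_hosts.get(host)
    if pinned is None:
        known_hosts[host] = fp                # first contact: pin it
        return "pinned"
    if pinned == fp:
        return "ok"
    return "MISMATCH"                         # key changed: abort

assert check_host("example.org", b"server-key-1") == "pinned"
assert check_host("example.org", b"server-key-1") == "ok"
assert check_host("example.org", b"evil-key") == "MISMATCH"
```

the weakness, of course, is the first contact itself, which is why disclosing the fingerprint over a separate secure channel matters.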
A compromise would be to provide a simple-to-use and simple-to-understand web of trust system that combines the benefit of "lazy" and indirect trusting (I trust my friend's judgment, therefore I would like to accept all the certificates that he does) with the benefit of decentralisation (points of weakness are smaller and more dispersed, making damage caused by compromise much more tolerable than if a large CA was compromised) and direct trusting (I know that this site is what it says it is, similar to the "add exception" function already available in most client software). Right now we do not have the luxury of a well-thought-out trust system that is straightforward, unobtrusive, and secure. So for now, I will opt only for the "secure" option, since it is the most important in this scenario. After all, if you want "straightforward and unobtrusive" without "secure", why don't you just use HTTP?
to put my money where my mouth was, I did create my own CA to use with all my websites, and that worked relatively well for about a year. but lately I have found that some XMPP servers will not federate with servers that have untrusted certificates, so it was time for me to make a decision: compatibility or security. I bit the bullet and now I'm using Let's Encrypt certificates on all my websites and servers, forced TLS on websites/E-mail/XMPP (and soon IRC), and overall I guess it's a good move even though I'm upset with the CA system. hopefully one day we have a better system in place to address the flaws of the CA system.
decentralisation - let's start with messaging
in light of the recent net neutrality talk, a mesh networking project started out of 4chan /g/ in hopes of building a better, open internet. I have always been interested in decentralisation, distribution, and peer-to-peer mesh networking, so I set up a wiki to aid in collaboration, to showcase helpful documentation for new users, and to connect ourselves with other meshnet projects around the globe such as sudomesh and /r/darknetplan. best of luck to /g/ and to the other groups in achieving this goal of an improved internet.
I currently cannot contribute to the physical network itself, as I do not have the appropriate hardware, nor am I in any close proximity to anyone who would be interested in forming a local mesh with me. but I can still think about what could be the best software and protocols to use as replacements for current proprietary, trust-based systems. one of those things is messaging.
currently we have instant messaging such as Facebook Messenger, XMPP, and SMS; group messaging such as IRC, Slack, and Discord; voice chat such as telephony, Mumble, TeamSpeak, Skype; forums, Reddit, imageboards, NNTP; and E-mail. and of those, I personally want people to contact me over XMPP, E-mail, IRC, or SMS. that means I have four addresses I can offer someone and I have to rely on the assumption that the other person uses at least one of those protocols. already, this is a hassle, right? all these protocols and products offer certain solutions to issues in other protocols, so we're spread thin, and either we have to use them all or we have to cut ties with the people we want to contact. on top of that, we have several contact lists to keep up with, and overall it's just one big complication.
first off, why do we have so many protocols? what does each do differently? why do people stick with them? I'll try to explain what sets each platform apart from the rest so we can gather an idea of what people want.
for messaging: SMS, XMPP, social networks (Facebook/Hangouts/AIM/Yahoo!), Discord, matrix.org, IRC, and other services such as WhatsApp are prevalent. SMS originated on the phone network as a result of the rise of mobile phones, and now that many people have a phone number, it's no surprise why SMS is also in common use: everyone has it and it's practically free. social network and proprietary IM services have been around for ages, and people might stick with them because they've had accounts since the initial craze, and because with some of them they can maintain a relative level of anonymity, since they don't have to use their phone number unlike with SMS. or, they might stick with them because almost everyone today has a Facebook or Google account for social networking or mail, so as with SMS, everyone has it and it's practically free.
with IRC we see the introduction of easy-to-join group chats, as well as the idea of ephemeral identities. IRC is a simple yet robust protocol that allowed anyone to join with very little barrier to entry, choose whatever nick they want, and either join channels or talk in private queries with friends. because of its simplicity, it has achieved widespread adoption by the FOSS community, by governments and businesses, and by hobby groups. it has also been used as the backbone for several popular services such as Twitch and Pesterchum.
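to back up the claim about IRC's simplicity, here is a minimal parser for one common message type. a real client would also handle PING, numerics, and so on, but the whole wire format is just ":prefix COMMAND params :trailing".

```python
# Minimal parser for an IRC PRIVMSG line as delivered by a server,
# to show how little machinery the protocol actually needs.

def parse_privmsg(line: str) -> tuple[str, str, str]:
    """Return (sender_nick, target, message) from a raw PRIVMSG line."""
    prefix, _, rest = line[1:].partition(" ")     # strip the leading ':'
    nick = prefix.split("!", 1)[0]                # nick!user@host -> nick
    command, _, rest = rest.partition(" ")
    assert command == "PRIVMSG"
    target, _, message = rest.partition(" :")     # trailing param is the text
    return nick, target, message

raw = ":alice!alice@example.org PRIVMSG #meshnet :anyone up for testing?"
assert parse_privmsg(raw) == ("alice", "#meshnet", "anyone up for testing?")
```

add a socket, NICK/USER registration, and a PONG reply, and you have a working client in well under a hundred lines.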
XMPP, Discord, and matrix.org all came about at different times, but they share a common theme: they allow easier use on multiple devices. XMPP and Matrix also have protocols for end-to-end encryption, and Discord has the added benefit of voice (and now video) chat. they are essentially the "next level up" of IRC, although they all cause as many issues as they try to fix. XMPP is poorly implemented, and because it uses (nonconformant) XML for its stream protocol, we see few people willing to actually create their own clients (as opposed to something like IRC, where I could make a rudimentary client in my sleep if I wanted). there are so many XMPP extensions (see the XEP list), and because of this, each client is more than likely only going to support its own portion of the available XEPs, creating a gradual divide between clients and essentially devolving the entire protocol to its bare basics, removing any perceived benefit it has over simpler protocols such as IRC. Discord is proprietary, there are no plans to allow for privately-hosted infrastructure, there are no plans to open the Discord protocol, and apparently there are no plans even to fix most of the usability bugs (at least, they haven't replied to my E-mails with anything hopeful yet). and then there's the new player, Matrix, which aims to become a federated protocol (like XMPP) with group chat (like IRC), message history (like Slack), and encryption (with double-ratcheting). too bad Matrix's implementations are slow and buggy, the specification relies on HTTPS and JSON (arguably not the best choice if you're signing and encrypting messages, or if you're sending any binary data, because it all gets base64-encoded), and the whole air of it has a hint of unprofessionalism to me. they seem to value UX features, such as in their official client Riot, over security and performance fixes.
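the base64 complaint is easy to quantify: wrapping binary data in a JSON string inflates it by roughly a third before any HTTP framing is even added.

```python
import base64
import json

# Demonstrate the size cost of shipping binary payloads inside JSON:
# raw bytes must be base64-encoded to fit in a JSON string, growing
# every 3 bytes of input into 4 bytes of output.

payload = bytes(range(256)) * 40              # 10240 bytes of binary data
encoded = base64.b64encode(payload).decode("ascii")
wrapped = json.dumps({"body": encoded})

assert len(encoded) == 4 * ((len(payload) + 2) // 3)   # ~4/3 growth
overhead = len(wrapped) / len(payload)
assert overhead > 4 / 3                       # JSON quoting adds a bit more
```

for a chat message this is negligible; for file transfers and media it adds up quickly on both the wire and the server's disk.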
why program efficiency [and usability] matters
in 2016 I wrote a small rant about the current downward trend of software and web development, entitled "Why program efficiency matters":
Computer hardware has become faster, more efficient, and more powerful in recent years, which means programmers are not constrained as much by memory and CPU cycles. But does that mean programmers should just give up trying to make their code more efficient?
It doesn't matter if our programs are bigger! : I don't know about you, but I enjoy extra disk space for movies and music. Just because disk space is affordable doesn't mean programmers can excuse themselves for adding unnecessary fluff to their projects.
It doesn't matter if our code takes up more memory! : Multitasking computers have been a thing for a while now. With that said, I would like my computer to actually multitask. I shouldn't have to constantly worry about how many programs I have running in the background and how much memory they consume. Also, there are plenty of older systems running in corporate and educational environments that simply cannot handle modern (and memory-hungry) software without constantly locking up.
It doesn't matter if our code is slower! : Speed is always a value to strive for. Any sensible person would choose "faster" if presented with two programs that perform the exact same tasks but at different speeds.
That said, if you have to sacrifice any of the above for security, please do so. Otherwise, if there is any way to make a program smaller or faster or more efficient, without changing the core functionality of the program, then take the time to improve in those aspects. Laziness is no excuse for a slow, fat program. At the same time, don't let yourself be consumed by trying to make your code perform better before you have even finished writing the program.
this applies to desktop, server, mobile, and Web software all alike:
- desktop operating systems are gradually becoming more bloated and new features are half-baked (see later versions of Windows), and people are using Electron to develop fucking everything now (I'll talk about Electron on its own in a bit).
- server software is increasingly dependent on weightier runtimes such as Node and Python (matrix.org, for instance). this is very problematic because servers face much more stressful demands, and none of us wish to spend resources we could better use to serve end-users. every bit of RAM and every CPU cycle counts under high load.
- mobile phones are mad useful for on-the-go matters (I'll have a blog post later, describing a smartphone's exact use compared to laptops and desktops) but they're becoming more powerful than most laptops now. many apps are Web-centric and it's quite possible that a lot of the mobile ecosystem is unoptimised: not just the apps but also the operating system and the virtual environments under which apps are designed to run.
I'm making this post today because someone sent me a link to a post Casper Beyer made regarding Electron, entitled "Electron is Cancer". I'll quote some notable passages from the post:
"Well, it works fine on my machine, and I only have 32 gigabytes of ram." - Silicon Valley Developer, 2017
If that’s you, well then that’s good for you, but just because something performs "well enough" on your machine doesn’t mean there are not any performance problems. You are not your end-users, and if you are a developer you most likely do not run average hardware.
^ I made this point in my 2016 rant -- people have different hardware, and developers need to keep this in mind unless they want their programs to run on only a small set of machines in the world.
"Electron is so great, we did not have to hire new people; we can just use the web designers that we already have in-house, and it is so easy!" - Someone Actually Said That
Okay, sure having a plumber cut out a square wheel from a plank is also a lot easier to do than having a woodworker carve a perfectly round wooden wheel, but it is gonna be one hell of a bumpy ride, and square wheels are actually fine, right?
^ I've seen this a lot too; people have strayed from the "do one thing and do it right" philosophy, both in software and in expertise (although on the expertise side of things, it helps to be well-versed in several areas so you're more valuable in a job, but usually those areas are close enough together that they complement each other. you wouldn't want that plumber performing heart surgery on you, would you?)
if you have time, read Beyer's full post because it covers a lot of good points about Electron and about modern software developers as a whole. it's a rarity to find a decent dev nowadays who cares about efficiency, usability, and accessibility; and that certainly affects where technology is going as a whole. as we depend more on technology in our everyday lives (mobile, IoT, business) there is really no room for sloppy code to run in banks, hospitals, vehicles, and other mission-critical devices.
Tags: software, programming