2020-06-26 19:22:04, In: Other, Rants
I will start this post with a "newspaper" cartoon from the 1890s, criticizing the electrification of the country in the early days of electricity, when it was known and more or less used, but most safety measures were non-existent. Unfortunately, I'm not exactly sure about the source of the cartoon; some sources label it "An unrestrained demon", and contrary to what those sources claim, it is probably earlier than 1900, judging from the shape of the pole, bulb and lantern. The picture, unfortunately, shows the true, sad state of electricity at the end of the 19th century, when it was handled carelessly, without protections and only to maximize profits. So don't take it entirely as false. Let's see what electricity looked like in its early stages and why it was quite a dangerous form of energy to use.
One more thing: I will try to write this in simple words, sometimes neglecting engineering details, but I think it should be understandable to anyone familiar with technology, not only those who like messing with it.
So what was it like in the 1890s? Let's see...
The electric fuse is a simple but useful part of a circuit. A piece of carefully chosen wire, shielded by arc-extinguishing sand or stretched between end caps, efficiently keeps the worst short-circuit scenarios away. If a device shorts, causing excessive current flow, the fuse sacrifices its wire to protect the installation from overload, overheating and further destruction of the power supply or transformer. Excessive current heats the damaged part severely, so the fuse also protects against fire; that's why many fuse sockets carry a note to replace the fuse only with one of the same rating and type. Although fuses became known in the mid-1800s for protecting telegraph lines against thunderstorms, they were adopted in electricity networks quite lazily. Why? To protect against the gigavolts and kiloamperes of a lightning strike you need a piece of wire... in fact most wires will burn successfully. However, selecting a fuse for a specific power consumption was a rare skill, especially considering the start-up peak current, today addressed with slow-blow fuses. Most electrical devices need a significantly larger current for a fraction of a second to start. Fuses of that era were not properly chosen for the application; mostly they were just a piece of commonly found flat wire which blew under a lightning strike's energy, but their exact characteristics were not known. So in some cases the voltage drop was too high and these fuses prevented devices from operating promptly, or they failed when the initial higher current flowed for a fraction of a second. It is known that Mr Nikola Tesla miscalculated one of his powerful autotransformers and the installation was not properly fused. The damage reached the generator, entirely burning its coils (described in Słowiński P. - "Władca Piorunów - Nikola Tesla i jego genialne wynalazki").
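The start-up problem described above can be sketched with a toy calculation. Real fuses are specified by melting-energy (I²·t) curves; all the numbers below are made up for illustration, not taken from any real fuse datasheet:

```python
# Toy model of why fuse selection matters: a motor draws a large inrush
# current for a fraction of a second, so a fuse sized purely for the
# steady current would blow on every start. The I^2*t ratings here are
# invented for illustration only.

def fuse_survives(current_a, duration_s, i2t_rating):
    """True if the current pulse stays under the fuse's melting I^2*t."""
    return current_a**2 * duration_s < i2t_rating

steady_current = 2.0   # A, normal operation
inrush_current = 12.0  # A, hypothetical motor start-up surge
inrush_time = 0.1      # s, how long the surge lasts

fast_fuse_i2t = 1.0    # A^2*s, a "fast" fuse sized near the steady current
slow_fuse_i2t = 20.0   # A^2*s, a "slow-blow" fuse for the same steady rating

print(fuse_survives(inrush_current, inrush_time, fast_fuse_i2t))  # False: fast fuse blows at start-up
print(fuse_survives(inrush_current, inrush_time, slow_fuse_i2t))  # True: slow-blow rides the surge out
```

The early electricians had neither the characterized fuse wire nor this kind of data, so their "fuses" either dropped too much voltage or blew on the inrush.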
A truly working fuse, tuned for its application, came into use in the early 1900s and finally took the form we all know, used until the 1990s - the "Zed" fuse - a ceramic "shell" mounted in a threaded casing.
Modern electric sockets are safe and handy. Put the plug in, use the device, pull the plug out; all "live" parts are kept away from the user's fingers. Some sockets are better, some are worse. Really well engineered, in my opinion, are British sockets and plugs; the only thing they forgot is what happens if someone steps on a disconnected plug lying on the floor. European sockets are the result of common denominators between standards, which can be seen in Italy, where older houses still have separate "light" and "force" sockets. Sometimes additional standards were introduced, but they lived very shortly - in Poland, for example, the quality of sockets was so poor that Greek-type high-current sockets were introduced for high-current appliances in the 1970s/80s, but that failed, like the Russian "48V" DC "scientific" electricity network. However, when electricity first appeared, devices were supplied through screw terminals. The electricity itself was also delivered to screw terminals, with a switch to make operating them safer. The user had to remember to turn the switch off before touching the terminals. The switch, being a hinged conductor with a piece of insulating wood, was not very safe either. With progress, a very popular socket standard emerged and came into use, and in fact it was even worse than screw terminals, as switches were usually not installed with it. I am writing about the Edison screw (picture courtesy of Wikimedia Commons, this one is from 1893).
What is an Edison screw? Edison is known for inventing the light bulb. The Edison screw is simply a light bulb thread fitted as a socket. If you touch the center part, you get shocked. If someone mixes up the wires and connects live to the outer ring, it's even easier to get shocked. Additionally, twisting the plug while screwing it in causes excessive arcing and wear on the contacts, and the wire breaks easily too. This was definitely not safe. The result was the installation of additional switches, but... the switches were not well insulated either, nullifying the safety of the socket.
The idea of putting pins into holes emerged quickly, but the problem was contact: the idea of small brass contacts touching the ends of the pins was dropped because the contact area was too small and the plug popped out. So, to maintain good contact, the pins were all-metal and the holes were metalized all the way from the front plate, making it in fact less safe than the Edison screw.
Electricity is conducted by wires, and to prevent shock or short-circuit, wires should be insulated. Well, good insulation was a problem since the early days of electricity. Polymers were not known, and latex and rubber were hard to form into an insulating jacket, so there were two solutions. The first was to spin some cotton or fabric around the wire; this survived until the 1950s in inductor wires for radios. It works surprisingly well until it absorbs humidity from the air, or gets overheated and catches fire. The second solution was a diluted shellac coating, a very expensive secretion of certain bugs, also used in furniture conditioning. Yes, a bug's poo was a life-saving factor of the electrical installation. Both of these solutions were ineffective and expensive, so only user-handled cables were insulated, and not even all of them - if a cable was stiff enough not to short with anything and was inside a device, it was mostly not insulated at all, just routed so it would not touch other parts. Overhead wires? The insulation was a few feet of air around them.
In this picture (courtesy of Wikimedia Commons) we can see the insulation of underground cables from the early 1890s. Wires were installed in separate pipes or channels, and then the channel was filled with liquid tar which solidified. Although this was the best insulation among those discussed here, withstanding the kilovolts of arc-lighting energy, it was prone to shorts and leakage, as these channels developed inconsistencies at the junctions and water penetrated in. The latest invention of the early 1900s was a piece of cloth soaked in tar, which survived into the 1970s as black "electrician's tape".
As for electric motors, the enamel used for coil wire was so poor that much more impregnated paper had to be put between the wire turns. Details like exposed live wires or heating elements in devices are omitted from this description, as the user was usually informed about them.
But these non-insulated overhead cables couldn't just fall down, right? Well, in those times these were not the wires we know today. In many appliances electricity was carried not by a stranded cable, which consists of lots of small wires, but by one thick, stiff wire, or by a "strand" of a few copper wires and one springy steel wire (when this steel wire broke, it could cut through any type of insulation). When bent and twisted, a cable made of knitted wires withstands lots of cycles, but a single thick rod breaks easily, especially when it has been strain-hardened by installation in screw terminals. Cables falling out of screw terminals are a common problem in electrical installations even today wherever vibrations are present, and require additional measures to prevent. In the early days of electricity, a live, uninsulated wire falling from above was not uncommon. Additionally, it was rare to hang a cable on a support rope, which is common practice today.
Is the light bulb a simple device? A glass enclosure, an Edison screw, a filament, a few metal parts... not so fast. Although it is easy to make a light bulb which works, it is not easy to make a light bulb which "fails gracefully", meaning that at the end of its life it will not explode, cause a fire or burn out the generator. Modern incandescent light bulbs have low pressure inside, with a slight addition of an inert gas which will not cause fire or significant arcing. Additionally, the wires which hold the filament are made in such a way that if the filament breaks, they pull it away from the connection to break any potential electric arc. As the last line of protection, the wire leading from the Edison screw to the filament is thin enough to act as a fuse: if the filament shorts, the wire burns off there. All these protections were non-existent until the mid-1900s, and yes, light bulbs were exploding and shorting circuits. A frequently used method in those times, when an exploded light bulb became a short, was to supply enormous electric power and wait for the shorted bulb to burn itself out like a big fuse. Or burn the wiring. Or burn the generator.
Our electrical sockets have an additional, third contact. Along with the live and neutral wires there is also a protective wire, sometimes shorted to neutral. Inside many household devices this wire is connected directly to the metal chassis, so that if there is a short to the chassis, the chassis will not become "live"; it will only blow fuses instead of shocking the user. Not only was there no such protection in early electric equipment, but some devices intentionally had one randomly chosen wire tied to the chassis, for ease of construction or because there was no other way. Walking around any electrical machinery, the operator had to assume that its chassis was live. Knowing what we know now, we can see that in those times electricity was not a safe thing.
Additionally, exposed parts were just there. The user was expected to know that they should not be touched.
The picture of a toaster shown below is also from Wikimedia Commons (photo by E. Norcross) - be careful when flipping the toast with metal tools.
Imagine that you have a new house and want to get a new invention into it - electricity. What to choose?
(Image: courtesy of ELH) You want bright electric light, so... maybe arc lighting? There is a company for it, which sells you a few tens of kilovolts over tar-impregnated pipe-wires which you can use... but only for lighting. Heating water? Using motor-powered tools? Nope. OK, let's stick to filament light sources which, although more yellow, need a smaller voltage. Which company to choose? There is one which is cheap, but does not endorse all household devices. Another, more expensive one supplies heaters and some electric household appliances, but only a few tools, while yet another offers tools or even, e.g., a universal motor to power any hand-cranked machine. Voltages, AC frequencies or DC, maximum currents and connectors are of course not compatible, and we are living in a pre-Tesla world, with no cheap autotransformers to convert between them - so buy an expensive and inefficient transformer (transformer core sheets only became available in the late 1890s, so we get a solid-core one which is more a heater than a transformer!) or an even more expensive "power-plant-in-your-house" motor-inductor set; but the agreement forbids you to use them, and you are forced to buy electric appliances only from the electricity supplier.
Yes, in those times, in many countries, household device manufacturers were totally under the control of power companies, which could deny, allow or simply demand more money for making these devices available for their installations. If you tried to fit a device to a non-agreed standard or make your own plugs, it was treated as a terms violation, like today's crimes against "intellecshual propyerty", as lawyers call it.
And, unfortunately, it's exactly like today's "Internet providers" who deny the use of any services except those "blessed" by them.
First things first: currently there is no conclusive scientific proof that high-frequency noise of the amplitude produced by 5G devices causes health problems. There was internal military research, at much higher power, related to gigahertz-band radars, and it excluded health effects unless someone stands directly in front of the antenna during the start-up phase. The energy decreases with the square of the distance, so - this is an example from the Soviet Union - the operators of a high-power microwave tracking system who sat 20 m from the antenna for 30 years got no illness, but a rat which was right on the antenna burst into flames when the device started. And the accusations that 5G "makes the virus" are totally unproven, as there is no physical phenomenon which allows a high frequency to synthesize organic matter of such complexity. And you know what? I'm not a microbiologist, but from what I see in the papers, I wish there were such a phenomenon! It would solve the origin-of-life problem for good, not only ending scientific discussions, but also allowing easy mass production of organic substances, even organisms, straight from a computer program, and sending the CRISPR manipulation method into obsolescence. Maybe there would be a Thingiverse-like page for substances? Want a cure for a hangover? Write the acetaldehyde-processing microbe and press "Generate". Want a vaccine for any microbe? Just mess with the existing code, modifying the multiplication and malicious parts, and press "Generate". This would be a wonder for microbiology!
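The inverse-square claim is easy to put in numbers. A minimal sketch, assuming a simple isotropic (equal-in-all-directions) source in free space, with a made-up transmitter power:

```python
import math

def power_density(p_watts, r_meters):
    """Free-space power density (W/m^2) of an isotropic source at distance r."""
    return p_watts / (4 * math.pi * r_meters**2)

# Hypothetical 100 W transmitter: compare 1 m away vs 20 m away
near = power_density(100, 1)
far = power_density(100, 20)
print(round(near / far))  # 400: at 20x the distance, 400x less power per square metre
```

This is why a rat sitting on the antenna and an operator sitting 20 m away live in completely different worlds, radiation-wise.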
So why are the media talking about this totally illogical accusation? The answer: to cover other problems, which are not such nonsense. Among them, the following come to the top:
The higher the frequency you transmit on, the more power you need for the transmitter. Not only does operating on the highest frequencies decrease the efficiency of a typical radio link, as the signal does not spread so easily, but more energy must also be put into bending the carrier properly (modulation). While a single transmitter is not very energy-consuming, these transmitters have to be installed in large groups. A larger number of 5G base stations will certainly increase the power consumption of a city, and many cities want to trim their power expenses even now. Although 5G base stations have various energy-saving modes, the problem is that there has to be a reason to enter them. Network equipment in workplaces can usually enter these modes at night, when most machines are turned off or offline. However, a system with high usage diversity, like a street, is not an office. Downloading large files or a night marathon of streaming will not trigger the energy-saving mode as readily. And while manufacturers emphasize these modes, power companies take a more realistic approach.
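The "signal does not spread so easily" part can be quantified with the standard free-space path-loss formula. A sketch comparing a classic cellular band with a millimetre-wave 5G band over the same hypothetical 100 m link:

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Same 100 m link, two frequencies
print(round(fspl_db(100, 900e6)))  # ~72 dB at 900 MHz (classic cellular band)
print(round(fspl_db(100, 28e9)))   # ~101 dB at 28 GHz (millimetre-wave 5G band)
```

Roughly 30 dB more loss at the higher frequency - a factor of about a thousand in power - which is exactly why millimetre-wave cells must be small and numerous, and why the aggregate energy bill is a legitimate concern.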
Digital devices are RF-noisy, meaning that they emit noise on analog radio bands. This is also true of 5G devices, which, if poorly made, may introduce not only higher harmonics but also some intermediate frequencies on lower bands. Even today it is hard to use analog radio around 14-25 kHz, because in cities the only thing you hear is the collective concert of all the switching-mode power supplies around. Pushing digital high-frequency traffic at the expense of lower frequency bands sets dangerous precedents for independent communications and for ham radio, which becomes less and less usable.
5G is the way "Internet of Things" devices will communicate, but it makes these devices visible from the Internet - to be exact, from outside your local network. Because modern "Internet of Things" devices are not secure and usually not updated (in rare cases updated a few months after release), this exposes lots of insecure devices to the publicly accessible Internet. Botnets like Mirai or Persirai have been in the wild for years only because IoT devices are not secure, and without the source code the software cannot be fixed. While in a local network you can simply block the device on your router, making it invisible from the outside, and manage it remotely through a VPN, you are not in control of a 5G network. The device will connect regardless.
So, if manufacturers cease support and ditch the servers, a device may happily send its sensitive telemetry to some malicious third party - probably whoever registers the abandoned domain first. The device will not work as expected either, in the worst case going totally belly-up like Samsung Blu-ray players recently, or Logitech devices a few years ago. If there is a vulnerability, it will be possible to mass-exploit it by script, and there will be no firewall to protect against it. Right to repair? Forget it. Devices will have "best before" dates, and after that, perfectly working hardware will just be thrown away because there is no code to update.
And what do Internet standards experts propose? THIS. The Internet for this, the Internet for that, but not for the user... It is not legally enforced yet, but like all such standards, it will not work without buying an expensive concession, as in the RF world. You have a home Internet connection for browsing memes and writing e-mails, but you want to self-host a cloud? No way! Buy a concession for it! Or use government-controlled servers.
Unfortunately, the steps taken by operators in this field would force more corporate control over users' behaviour, ending with entire services being censored "because security". This was already happening in 2003-2006 and around 2009, when the Polish telephone and Internet provider TPSA censored IRC servers because some trojan used IRC for command and control. Now they will censor whatever they want, "because security". Similarly, TOR is censored in some countries, and people have to make it look like an hours-long Skype call to use the Net securely. Recently I found that someone has probably paid GMail to censor shell script extensions, while for a shell script to be executed the extension is meaningless - the #! magic bytes and file attributes are what matter. But "that's for your security! Go buy what we force you to buy and repeat: I am free! I am free!".
The frequencies around 20-40 GHz and the lower 3-5 GHz have been actively used in military communications and location-tracking technologies since at least the 1990s. It would be hard for some countries to switch to newer technologies, and switching some others, like satellite links, seems impossible. You cannot just send a mechanic with a toolbox and a replacement module into orbit to catch a satellite. The problem will not be limited to interference from military equipment to consumer devices; it will also go the other way. This way, a civilian network becomes weaponized to sabotage foreign systems.
There was a similar small "frequency war" in the late 1990s, when a North Korean TV transmitter relayed their programme on the same frequency as a South Korean one. North Korea explained that the South had entered their frequency. The South Koreans explained that they had announced the change months before. I don't remember how it ended.
The idea of a microcell network is not new and has been used by many systems, sometimes in a fully decentralized way. Decentralization in such cases seems a nice solution, fixing the single-point-of-failure problem and, when strong asymmetric encryption is used, increasing information retention and security. In the 1990s, ham radio operators in one of the Eastern European countries experimented with sending messages using RTTY (Radio Teletype) through lots of intermediate stations connected with each other to reach a specific one - it was shown that this is possible until some "critical mass" is reached, after which adding any new user must lead to drastically increased bandwidth use, because transmissions just collide or wait too long. There were corporate attempts too, like small SMS-like communicators in some Asian countries ("later refined as a feature of the Cybiko console" - not sure about that) and some tests in the USA, also by amateur radio. A few years ago I read about such tests in an older "Science et Vie", so there were attempts in the French-speaking part of Europe too. In all these cases the tests were shut down and considered illegal, and the answer to all questions was: citizens must not do it, because such usage is prohibited. Why exactly? The main argument was the unproven "because unhealthy" nonsense, supported by the "so many cells" argument!
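The "critical mass" effect those RTTY experimenters ran into resembles the classic shared-channel saturation result. As a sketch (not the actual 1990s experiment), the slotted-ALOHA model says that if stations transmit at random, throughput grows with offered load only up to a point, then collapses under collisions:

```python
import math

def aloha_throughput(offered_load):
    """Slotted-ALOHA success rate: S = G * e^(-G), G = offered load per slot."""
    return offered_load * math.exp(-offered_load)

for g in [0.5, 1.0, 2.0, 4.0]:
    print(g, round(aloha_throughput(g), 3))
# Throughput peaks at G = 1 (about 0.368, i.e. 1/e) and then *falls*
# as more stations transmit: past that point, every new user mostly
# adds collisions instead of useful traffic.
```

So the saturation the hams observed is not a fluke of their setup; any uncoordinated shared channel behaves this way, which is precisely why cellular systems coordinate access centrally.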
So, when it was not under corporate control, it was unhealthy, and now it's OK?
Now, after Snowden's leaks, we know that the government's real answer was: "Because we will not be able to effectively intercept all communications once encryption comes". But the average user does not know this. So one day the governing Party says that 2+2=5, and the next day that 2+2=3.
So let's get to the end of this rant. The media operation of making an insecure-by-design solution acceptable in everyday life has been well planned and is a dangerous precedent for introducing risky technologies. In the media, none of the earlier arguments are present. What do the media show? Burning 5G transmitters, because it "spreads the disease". This puts all reasonable criticism in the same bag as the illogical "spreading-the-disease" claims, and allows any dangerous technology to be put into common use while critical voices are dismissed by manipulated people.
It should also be noted that most people who propagate the equivalence between any criticism of 5G and its manipulative variant cannot be reasonably discussed with, as they are most likely trolls hired by companies who want the other criticism extinguished. On the modern Internet there is no truth, as "truth" can be bought from bot farms. Almost nobody does independent publishing, because modern content systems need heavy servers which cost a lot per month. This must pay for itself, and the best way to make a profit is sensational writing.
This manipulation technique is already visible in the "get-out-of-jail card" given to many businesses, especially medical and transport ones. If an engineer designs a bridge, that bridge should withstand lots of usage cycles. Base rock, vibration, temperature cycles, winds, seismic conditions, hydrological conditions - all of these are evaluated and tested, also independently by other labs. However, bridges still collapse. There may be some kind of miscalculation. Some factors may not have been included. Some factors were simply unknown, there was an error in the calculations, or there were problems with maintenance. But in all of these cases, the first ones held responsible are the engineers.
The argument "but look how many people successfully crossed the bridge - they would have drowned in the river if they had tried to swim it instead!", which works perfectly for explaining away the responsibilities of medical or transport companies, suddenly stops working for an ordinary bridge!
OK, I was joking about this end :-).
Using the Internet through 5G, with its "everything is an app", always-connected corporate approach, better described as "user as a commodity", is another step towards making the Internet not for users but for companies. No place for people's data, all the space for advertisements. No time for creating and exchanging ideas, full time for tribalization, meaningless flamewars between fans of different brands of the same company, and clickbait news sites. No security for the user, full security for brands. Instead of a variety of services under different conditions, a permanent risk of "deplatforming", as the common taboo is maximizing the company's profit at all costs. One more time: currently, security is not for the user, it is for the brand. The Internet was created as a medium for exchanging knowledge and for communication. That way, even where the entry level was high for a moment, users could learn from the available materials to get in. Spam, tracking and marketing were always secondary. The problem with marketing is that it has to be for everyone, and it has to be right now. It must not be too clever, because less-educated people will not get the advertisement, and educating them with clear documentation is not only outside marketing's scope, but actively harmful to the advertisement itself. Because of that, the modern Internet loses more and more knowledge sources, exchanged for the expansion of web stores and advertising platforms. Recently a lot of "old grade" websites were shut down: FAQs, manuals, blogs, repair instructions. The motivations differed - sometimes hosting got more and more expensive even while disk space costs were falling (like M.H.'s page about SWTPC computers), sometimes there was a need to evolve into a shop, so everything that could not be sold was deleted (like some well-known Spectrum site), and sometimes (like the "Slate Star Codex" blog) it was direct threats to the author from a bigger newspaper.
Hello, a mob incited by the government is still government intervention, dear Americans, you know?
As people become more and more the property of their employers, they cannot use the publishing part of the Internet even in their free time, as suddenly they are representatives of the Employer at all times. Guess whether they get paid for full-time representation of their masters? :-).
The fight against meritocracy, recently seen in Open Source communities striving to be more "welcoming", also totally misses the point. If a meritocracy does not allow people to gain the knowledge needed to become part of it, it's no different from a typical "company-customer" relation and is not a meritocracy in the modern sense at all. However, the point of the Open Source movement was gaining knowledge from published documentation, code and FAQs, so that even a beginner could develop. Good entry-level documentation for beginners is the best way to make a community welcoming, and it makes people who just want to give something back act more, and better. The problem is: now this too is marketing. You must not educate, because the customer must not be too clever. Additionally, more time spent on communication equals more profiling and more advertisements, so storing documentation is less profitable than a "discussion" platform which frustrates experts who have to explain the same thing again and again, leading to further separation into castes.
Currently, all Internet publishing is controlled by advertisers. No brand wants to see its ads next to ultra-right or ultra-left political discourse, epidemic death reports, or mobbing against those who might buy the product (in the Open Source world, the analogous taboo is the wall of text and charts in the manuals). However, by polarizing, social media did exactly this. So now companies put on a show and officially resign from buying ads which would harm them anyway, while still keeping total control over what is published. A nice PR operation is always better, as it allows them to excuse the imposition of self-censorship.
So the bigger power on the modern Internet comes with the bigger potential for censoring its past: censoring things written before companies introduced their "pride" and "shame" campaigns, and things which do not fit the advertising doctrine. Things which can now be sold back to people, having been robbed from them and excised from the libraries. Let's look at the list of sites removed from the Internet Archive, maintained by Archive Team. While it used to be mostly everyday shoutboxes, shady businesses, poorly-protected publishers, torrent bases or small pages with weak transfer or connections, now it contains more and more sites of legitimate companies! They are simply trying to censor their past promises. And whoever has bigger capabilities in rescinding the "old" Internet has bigger power over the "current" one. This causes an even bigger collapse of the initial role of the Internet, and in fact introduces a reign of censorship: a "censorocracy", let's call it.
The question is: is it really still viable to do anything there? Or should it just be abandoned, like ham radio or amateur TV stations in the 1980s?