@natecull The real issue is technological change outpacing society's speed of adaptation.
As environments change, people develop etiquette, laws, religious traditions & stories to pass on what behaviours work & which values we need to remember.
Companies have just been responding to what's favoured by the social and legal context. So there's a feedback loop with both positive and negative consequences.
Maybe we need to increase the rate of social response.
I don't really buy this. Technological change has slowed down substantially since its peak in the 70s (to the point that most of what we, as individuals and even as early adopters, run into as 'new technology' is really 70s tech that finally became profitable), & smaller groups have weathered bigger shifts in tech for decades.
We *are* seeing the effects of certain tech at a larger scale than before, but mostly, we're seeing the effects of capital-amplifiers.
There's no qualitative change happening in, say, ad targeting. Ad targeting works exactly the same way it did in 1995 (and exactly the same way folks were expecting it to eventually start working in the 70s, when computers & statistics were first being applied to the problem).
There's a quantitative change happening, which is that we reached the physics-theoretic peak of speed for integrated circuits 15 years ago, & we're working on getting everybody on the grid.
And until that bubble collapses (which nobody really wants, because it'll take the global economy with it, because most of the economy is just gambling on futures of futures of futures of ad valuations for novelty t-shirts and other trash), the reaction is that everybody in that industry doubles down and makes promises about how they'll get two tenths of one percent more likelihood of a sale from three gigs more targeting data per person.
This isn't a 'new' phenomenon at all. It's the inevitable result of following the original 1970s script. And you'll find people -- not even necessarily terribly technical people, but essayists and science fiction authors -- writing in the 60s, 70s, 80s, 90s about this script and talking about its end-game (which we are living through), because all it takes to predict it is an unwillingness to buy into the hype.
@enkiv2 @natecull Some parts of the dystopian imagination of the 70s and so on were well conceived, like John Brunner's idea that the successful modern humans would be the ones best at adapting to hyper rapid change, or his prescient ideas about crowdsourcing knowledge.
But those visions are always lopsided, seeing one set of forces without anticipating the counterforces equally well.
The 1920s reference is something I've been thinking about too. It feels like we're in a very similar spot.
In some ways, the smartphone is just the final (or interim) delivery on what 'radio' promised in the 1920s. It took a while to get batteries, transmitters, and aerials small enough, and layering computers over the top was something nobody quite imagined then. But the definite feeling was in the air: "what if... telecommunication?"
@natecull @byron @enkiv2
In my humble opinion, there's a lot of exploration to do, but we are stuck in this post-PC phase where the convenience of having our systems online & available outweighs the apparent advantages of having computing be personal, be a thing we can change & control & orchestrate & customize.
We are all trapped, bound to online cloud mainframes that offer us only a small slice of the world they contain & compute, where all powers we receive must be baked into the application layer & there is no cloud OS we can expect.
The challenge seems obvious: we are all online, but as supplicants. We could have a million responses to disinformation, to bad actors, could try & discover what works to grow healthily together, but we are locked onto these giant properties, reliant on them to give us all the tools & systems for socialization amid this hostile environment. We must begin to be of our own minds, bring our own minds online. #noospherics
> John Brunner's idea that the successful modern humans would be the ones best at adapting to hyper rapid change, or his prescient ideas about crowdsourcing knowledge.
> But those visions are always lopsided, seeing one set of forces without anticipating the counterforces equally well.
But right now there are fewer than a dozen corporations who have their own cloud, who have the basis to begin to adapt & explore & adventure. The rest of us monkeys scratching about in the dirt can use these tools to advance ourselves, but at great expense, & with limited control & greatly restricted understanding. For corporations, these restraints are not so bad, but the de-personalization places a hard limit on the individual & their expression & adaptation.
Still, it *must* be a short-term solution. I'd love to see a growth of mesh computing, both physical networks (wifi) and P2P cloud computing.
The value of CPU time keeps dropping, right? If you could sell spare CPU cycles to a P2P cloud network and it ran your computer down a little faster, you'd come out ahead.
And we'd get cloud computing power without the monopolies.
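To make the economics concrete, here's a purely illustrative back-of-envelope in Python. Every number (power draw, electricity price, core-hour rate) is an invented placeholder, not market data:

```python
# Back-of-envelope: is selling spare CPU cycles to a P2P cloud worth it?
# All numbers are hypothetical placeholders, not real market figures.

WATTS_EXTRA = 40            # assumed extra power draw under load (W)
ELECTRICITY_PER_KWH = 0.15  # assumed local electricity price ($/kWh)
PRICE_PER_CORE_HOUR = 0.01  # assumed P2P market rate ($/core-hour)
CORES_SOLD = 4              # spare cores offered to the network

cost_per_hour = (WATTS_EXTRA / 1000) * ELECTRICITY_PER_KWH   # $0.006/h
revenue_per_hour = CORES_SOLD * PRICE_PER_CORE_HOUR          # $0.040/h

print(f"cost:    ${cost_per_hour:.4f}/h")
print(f"revenue: ${revenue_per_hour:.4f}/h")
print(f"margin:  ${revenue_per_hour - cost_per_hour:.4f}/h")
# Ignores hardware wear -- the "runs your computer down a little faster"
# cost from the post above -- which would eat into the margin.
```

Under these made-up numbers the margin is positive, but the depreciation the post mentions is the real unknown.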
i don't demand we go p2p right now, or even have accounting. as much as innovating, i'd like some catch-up for the rest of the world.
i would like to be able to run some servers & cloud services that others can use & share. we're starting to get to the point where i could run a #k8s cluster in some kind of multi-tenant way (see the sketch below), but it'd be a ton of assembly & we're not really there yet.
figuring out the peering & market systems sounds good, but i see those as extras atop an un-opinionated multi-tenant cloud infra, infra we are only just beginning to be able to stand up for ourselves.
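For flavour, a minimal sketch of what "some kind of multi-tenant way" might look like using the official kubernetes Python client, assuming a namespace-per-tenant model with resource quotas. The tenant name and quota figures are hypothetical, and real multi-tenancy would also need RBAC & network policies (part of the "ton of assembly"):

```python
# Minimal sketch of namespace-per-tenant isolation on a shared k8s cluster.
# Assumes the official `kubernetes` Python client and a working kubeconfig.
from kubernetes import client, config

def provision_tenant(name: str) -> None:
    config.load_kube_config()  # or load_incluster_config() inside the cluster
    core = client.CoreV1Api()

    # One namespace per tenant keeps their workloads logically separated.
    core.create_namespace(
        client.V1Namespace(metadata=client.V1ObjectMeta(name=name))
    )

    # A ResourceQuota stops any one tenant from starving the others.
    core.create_namespaced_resource_quota(
        namespace=name,
        body=client.V1ResourceQuota(
            metadata=client.V1ObjectMeta(name=f"{name}-quota"),
            spec=client.V1ResourceQuotaSpec(
                hard={"requests.cpu": "2", "requests.memory": "4Gi", "pods": "20"}
            ),
        ),
    )

provision_tenant("tenant-alice")  # hypothetical tenant name
```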
Centralization creates economies of scale that are only useful if you're centralizing in the first place. A little bit of duplication, properly distributed, is not noticeable to those to whom it is distributed, while duplication of the whole of a network's data by some centralized owner can easily bankrupt that owner. I don't see why we should bother with this centralization at all. We have the tech to avoid it.
If you are already centralized, there are structural incentives to double down and become even more centralized, and economies of scale are part of that.
If you are not centralized (not even federated), then none of that stuff applies. Much easier to run everything off a raspberry pi hooked up to the wifi of the coffee shop down the street than pay amazon to let you access their nightmare of overlapping security groups.
As soon as you admit any centralization (even so much as a client-server model), you're trapped by an inevitable logic that leads you to exactly the things we are complaining about in "big tech", & you either go all the way and become the bad guy or you fail earlier.
If you avoid that centralization, however, you've got a lot of flexibility in creating and responding to incentives. You don't need to get subsumed by capital.
I really hope this is true! I've always *felt* it to be true, even way back in the 80s era of cassette tapes and modem BBSes. It always felt like we were the pioneers of a new underground and there was this vast potential for radical decentralisation.
but, lol, I spent all my time online downloading games, and most of my programming time making games, and not even great games. And now I don't even do much programming and what little I do seems to be harder
It's pretty dizzying when you think about the sheer computing power in just an ordinary laptop today.
I have access to a vast collection of books, photos, videos. But organizing that collection is hard, and doing meaningful and useful things with it is even harder.
Websites are pretty terrible infrastructure for spinning up new community groups. It would be great if we could get to 'Facebook group' level of 'just make something happen'. Without the spam.
My personal attention span is shot this year. I really need to at least finish Tex/Tix 0.2. But just sitting down to code up a tiny parser drains my cognitive resources.
I wish we had computing tools that could make thinking itself easier. That's what I've always wanted. Not running all my data through some snooping cloud AI to look for patterns, but just... doing something small and local, letting me create my own patterns in my workflow.
So much of what we do on a modern desktop seems really resistant to automation - because it's all about stitching together tasks performed in multiple app silos that don't share a common language or data model - that it just seems odd when you think about it. Why did we build the desktop this way, so non-user-programmable? Or... a scarier idea... would a really user-programmable desktop actually be a nightmare because everyone's would be subtly different?
@natecull @enkiv2 @jauntywunderkind420 Desktops are hard to program for many reasons. The Alto apparently was really object-oriented under the hood. But I think it's actually a hard problem on the one hand, and not enough users cared on the other. Apple did a decent job at user-oriented GUI scripting with AppleScript tbh, but it never caught on massively even within Apple fandom.
You see so many attempts at tying app silos together, including IFTTT, Yahoo Pipes, and CORBA.
That is, the job isn't just "do X" but also "stay excited about X." On the bigger level X might not just be that project, but coding in general. You might need to take a break from that project and just *play* with code a bit, whatever that means. When it becomes an obligation you're not excited about, it can be draining.
They're linked, though.
Back in the day, the go-to explanations for why users didn't have control over application behavior were:
* "our source code is a valuable business secret, and keeping it secret makes us better than competitors"
* "interpreted languages are too slow"
* "end users can't be trusted with compilers"
All of that is clearly bullshit or irrelevant now.
But a corporation can still take their local modifications of open source code, stick it on a machine they own or rent, and write some interpreted code to interface with it -- and boom! web app, which they can modify whenever they want and which end users can't even see the whole of.
Mandatory network connection is part of keeping users away from meaningful control of these systems.
You get different forces when there's huge value to be unlocked in a "wild west" frontier: decentralizing forces, which have been powerful at times (IRC, email, the P2P sharing revolution, hopefully the fediverse), and centralizing forces.
My worry is the "wild west" will be "won" by big gov and big corps.
I mean, imagine if every Linux user just agreed on ONE Linux distro to support in evangelizing to newbies? Unity has power.
Tech frontiers tend to start chaotic and create a few big winners.
But *networking* tech has the *potential* to disrupt that pattern and make decentralization cheaper, more efficient and more powerful.
There's a lot to be said for systems that don't have to be online to work (net-work). Breaking that coupling, presently so often necessary, is, I agree, pretty important, if only because not everyone can keep a pi at a coffee shop, & they still deserve to be able to serve & help the net if they fall off it.
But past that definition, I feel like most notions of distributed/decentralized are alluring sugar-pop dreams that mask how rudimentary even centralized tools, with full super-ops powers, are at enabling user creativity & imagination. Distributed is not entirely but largely orthogonal to a vaster, more alarming impotency.
@jauntywunderkind420 @enkiv2 @natecull Git's a good example of the way systems can be built to function both online and offline, and it's good that even web apps like GMail try to handle temporary loss of connection.
I agree, good decentralized systems should be able to handle a variety of network levels, from "fuck it, I'm airgapped" to "I'm travelling a lot" to reliable high speed connections.
But the networks are where the most power and potential lie, especially when we think of P2P clouds.
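A tiny sketch of the offline-first pattern described above: buffer writes to a local outbox & flush them when connectivity returns. The filenames, the connectivity probe, and the send() transport are all hypothetical placeholders; a real system would add retries and deduplication:

```python
# Offline-first write queue: works airgapped, syncs when the net comes back.
import json, os, socket

QUEUE_FILE = "outbox.jsonl"  # append-only local log, survives restarts

def online(host: str = "example.org", port: int = 443) -> bool:
    # Crude reachability check; swap in whatever probe suits the transport.
    try:
        socket.create_connection((host, port), timeout=2).close()
        return True
    except OSError:
        return False

def enqueue(op: dict) -> None:
    # Always write locally first, whether or not we're connected.
    with open(QUEUE_FILE, "a") as f:
        f.write(json.dumps(op) + "\n")

def flush(send) -> None:
    # Drain the outbox once connectivity returns.
    if not online() or not os.path.exists(QUEUE_FILE):
        return
    with open(QUEUE_FILE) as f:
        for line in f:
            send(json.loads(line))  # send() stands in for the real sync call
    os.remove(QUEUE_FILE)

enqueue({"type": "post", "body": "written while airgapped"})
flush(send=print)  # stand-in transport: just prints the queued ops
```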
@jauntywunderkind420 @natecull @enkiv2 You're talking about it in negative terms, but as much as I prefer decentralized and private computing power, it's actually *amazing* the power that big cloud computing puts cheaply at the disposal of even individuals and small companies.
One simple example is the way that if you only occasionally need massive spikes of CPU and RAM resources, clouds can give you that without buying the hardware or waiting hours or days. E.g. for Blender renders, compiling...
i speak as much of the applications running & their massive, at-scale, inflexible, corporate-run nature as i do of the physical infrastructure of the cloud.
i acknowledge the technics you speak of; you're not wrong. yet this current balance of power is grossly, grossly inhumane & deeply restrictive of the human consciousness. these social media applications we interact with are far-off, unintelligible aliens to us, deeply impersonal entities, with spectacular haunting capabilities that give enormous hard-to-recognize power to strangers.
you talk about us having technical capabilities, but i don't see them being marshaled to help create an independent pervasive-onlineness that competes with the heavily saturated smattering of social networks much of humanity has integrated itself into. agreed, access to cloud infra is good, yet it's not being used to compete. we here are some of the few counter-examples.
< ( Did you know that in 2020, Dick Tracy really will have a radio watch? )
( Golly gee whillickers, far out, man! ) >
< ( It also tracks his every move, sending it to shady organizations he’s never heard of without his permission or knowledge, and neither he nor anyone he knows is allowed to know how his phone works or what it’s doing, with a jailable penalty for knowing too much. )
( Uhhh… ) >
< ( Also it’s too big for his wrist. We just force each other to all carry around a purse now. )
@enkiv2 @natecull It's a basic arms race like these things always have been. What's interesting in marketing isn't really ads anymore, no matter how much counterfeit AI you apply. It's social media now, blurring gaps between conversation, relationships and sales, like salespeople have always done, but on jet fuel.
Sure, and then tie that into ads to seal the deal. But ads aren't the focal point.
@enkiv2 @natecull I do agree that actual technological change has slowed in absolute terms, but social impacts haven't. The world after mainframes existed wasn't hugely different from before them, but always-on Internet in your pocket? That's huge.
The other part is that even 1920s tech was crazy fast in evolutionary terms. We're still reeling from radio and navigating dating in a post-pill world. We could use 50 more years just for that.
Social media? Much longer.
I believe real technological change and progress has decreased greatly since the mid-20th century, and arguably from an 1875--1925 peak.
A sequence of advances, in engines (steam/coal, ICE/petroleum, turbines, jet propulsion), in electricity, in coal and petroleum chemistry (each lagging fuel applications by ~50+ years), in telecoms, switch-based logic, and data storage, has continued to play out, with consequential but still ultimately derivative advances.
At the same time the societal effects, through economics, industry, business, politics, and culture, have been profound, with periods of revolutionary change. Skills obsolescence in particular has hit hard, especially for workers. Precariousness has increased for many. Old power structures and allegiances are threatened or crumbling.
Oh, and most technology is an amplifier, generally of power.