Pinned toot

meta, several subtoots, no specific persons, negative 

if I don't know you and you're having a discussion in the federated timeline and I (or some other stranger) join in, it's alright if you can't read anything I say, but if you tell me to shut up just to be edgy and rude (and because I exist outside of your pecking order), you are an asshole, and that wouldn't fly anywhere other than fedi.

Pinned toot


Learn how in 3 easy steps:

1. Check out telekommunist

2. Check out why ACSL people didn't bother to do step 1

3. Learn to use an interactive programming environment or find another one suited to your needs

@Gnuxie stfu techie, computers were a mistake and also I'm reducing your abstraction quota immediately, enjoy thinking long and hard about your problems in raw C you indulgent FP person


"Today we have machines that are a thousand times faster than in the 1980s, and the ability to couple thousands of machines together. Not many conclusions are valid across six orders of magnitude; whatever difficulties the field encountered in decades past are almost certainly irrelevant.

What is important, however, is that we remain cognizant of the history of programming languages. In particular, education is crucial; students need to be aware of many different ways of programming, lest they reinvent the wheel and reinvent it badly. We must not let the languages of the present obscure our view of the past, because it is the great languages of the past that can lead us to the languages of the future." -- Gilad Bracha

The Signal developers assume a central party must maintain a protocol, and they must control all implementations to do so.

The Matrix developers assume a false dichotomy, where protocols provide coherence of user experience or extensibility, but not both.

I assume a decentralised protocol with no meta-protocol for extending its behaviour is no good for deploying extensions.

It's not as symmetrical as *that* toot, but this is fundamentally why I can't recommend either.
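To make the "meta-protocol" point concrete, here is a minimal sketch (all names are hypothetical, not from Signal, Matrix, or any real protocol) of one way a decentralised protocol can stay extensible: nodes relay messages they don't understand instead of rejecting them, so an extension can deploy without every implementation upgrading in lockstep.

```python
# Hypothetical sketch: a node that dispatches known message types
# and forwards unknown ones verbatim (the "meta-protocol" rule).
import json

HANDLERS = {}   # extension type -> handler function


def handle(node, raw):
    msg = json.loads(raw)
    handler = HANDLERS.get(msg["type"])
    if handler is not None:
        handler(node, msg)
    else:
        # Unknown extensions are preserved and forwarded, so new
        # behaviour can spread before this node learns to speak it.
        node.forward(raw)


class Node:
    def __init__(self):
        self.forwarded = []

    def forward(self, raw):
        self.forwarded.append(raw)


n = Node()
handle(n, json.dumps({"type": "poll", "options": ["a", "b"]}))
assert n.forwarded  # no "poll" handler yet, but the message survives
```

A protocol with no such rule has to freeze its message set, which is the situation the toot above is complaining about.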

re: silly, unironic it really happened 

I tried to stop them splitting by memoizing the object map and slot descriptions in the add-slot primitive I made, but I figured this only works if you add slots in the same order for all objects, which should still work out alright, but I found it a bit bodgey

so of course I took it a step further by creating a 'special object' where the object map is shared between instances and gets mutated, and uhh yeah
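The memoization trick above can be sketched like this (a minimal illustration in the style of Self maps / V8 hidden classes, with all names mine): each map memoizes its add-slot transitions, so objects that add slots in the same order share a map, while a different order still splits.

```python
# Sketch of object maps with memoized add-slot transitions.
class Map:
    def __init__(self, slots=()):
        self.slots = slots       # tuple of slot names, in order
        self.transitions = {}    # slot name -> successor Map (memoized)

    def add_slot(self, name):
        # Memoize: adding the same slot to the same map always yields
        # the same successor, so same-order objects share maps.
        succ = self.transitions.get(name)
        if succ is None:
            succ = Map(self.slots + (name,))
            self.transitions[name] = succ
        return succ


EMPTY = Map()


class Obj:
    def __init__(self):
        self.map = EMPTY
        self.values = []

    def add_slot(self, name, value):
        self.map = self.map.add_slot(name)
        self.values.append(value)


a, b = Obj(), Obj()
for o in (a, b):
    o.add_slot("x", 1)
    o.add_slot("y", 2)
assert a.map is b.map        # same order => shared map, no split

c = Obj()
c.add_slot("y", 2)           # different order => a different map
assert c.map is not a.map
```

This is exactly the caveat in the toot: the scheme only avoids splitting when slots are added in a consistent order.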


silly, unironic it really happened 

**spends months designing a prototype based object system** wohoo this is fun haha no more stinky metaclasses, parent slots go brrrrrrrrrrrr BRRRRRRRRRRRRRRRRRR :blobbun_mlem:​

kewl, now let's add annotations to our system :blobbuntonguewink:​

**object maps start splitting**

nuuuuuuuuuuuuuuuuu how do i stop this nuuuuu oh nuuuuuuuuuuuuuu :blobcatsadreach:​

**invents classes**

ahh, that's better, finally some good fucking foo- wait a minute-


re: silly 

*connects to dev database* *dramatic shaky noise* Holy fucking shit- there's personal information in the dev db *sigh*

What's that flashing?

It's a production error

A what? Let me have a look at this...

Holy mackerel-



kitchen nightmares but it's companies making software

aaaaaaaaaaaaaaa noise 

me, avoiding what I'm supposed to be doing, and running out of things to distract myself with.

Rule of thumb: choosing a programming language 


bullshitting maths to look smart 

Let f(x) be the lines of code in a program of x complexity.
We could define f(x) = x * e^(-n), where n is the number of abstractions you use. Clearly as n approaches infinity (well, even a small number like 5 is pretty close), f(x) approaches 0 for any x. So make more abstractions!
Fit the language to your problem and write code that clearly does something tangible! Abstractions exponentially decrease the non-tangible stuff.

subtoot, toxic tbqh 

does anyone know if go runs on the z80?

does anyone know if my 50 year old GC design (that I chose because I don't want to read any FP nerd papers because FP nerds are ruining computers) and greenthreads appropriated from those FP nerds are really close to the hardware or are those indulgent abstractions?

All programming languages are basically the same, so I'm going to do backwards chaining in C, use an interactive debugger and late binding in Haskell, make infinite data structures using laziness in Common Lisp, and write a network driver in Prolog.

"One final comment. Hardware is really just software crystallized early. It is there to make program schemes run as efficiently as possible. But far too often the hardware has been presented as a given and it is up to software designers to make it appear reasonable. This has caused low-level techniques and excessive optimization to hold back progress in program design. As Bob Barton used to say: Systems programmers are high priests of a low cult."

A cumulative distribution function graph of the lines of code of cl-decentralise2 and Netfarm. It's honestly quite interesting to see, but I don't know what to make of it other than I have some very long files.

Today is the third birthday of Nettle, the predecessor to Netfarm that was penned as a "lightweight distributed database". While not entirely interesting in itself, developing it allowed me to put ideas about distributed programming and collaborative filtering into context. It was also one of my first collaborations with @Gnuxie and @deborahcliff; the latter also wrote the Cooperative Software License because I forgot about source distribution requirements with...

when you see someone call 2,000 connections on an expensive server a DDoS


The social network of the future: No ads, no corporate surveillance, ethical design, and decentralization! Own your data with Mastodon!