I’ve gotta be honest, I’ve never actually used vi (or anything similar) for more than basic text editing. But it’s always been the one editor available on any Linux system I’ve used. Just the basics, which are easy enough to learn once:
move the cursor around (ok, occasionally I have to look up what I have to put in the config file to get compatibility mode, but…)
basic editing mode
save a file and quit
quit without saving a file
search for text
Pretty much the only “advanced” thing I occasionally do in vi is use a regex for text replacement. I’ve stumbled into some of the other features by accident, but those were one-offs.
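For reference, the basics described above boil down to a handful of keystrokes; a quick summary in vim syntax (`old` and `new` are placeholder patterns, not anything from the thread):

```vim
" move around: h j k l (or the arrow keys)
" enter insert mode: i      leave it: Esc
" save and quit: :wq        quit without saving: :q!
" search forward: /pattern  jump to the next match: n

" regex replacement across the whole file:
:%s/old/new/g

" same, but confirm each replacement interactively:
:%s/old/new/gc

" the compatibility tweak mentioned above, in ~/.vimrc:
set nocompatible
```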
The few times I’ve had to use nano… well, it’s prettier, but it doesn’t really make things easier.
I mean… the worst I’ve seen is accidentally inserting a letter multiple times because I started typing before getting into insert mode. A quick quit-without-save and reload usually rights things if I do something accidental. But YMMV.
I have used vi (or more likely vim) when I have had no other choice. I’ve learned enough of it to get by, but I’ve never enjoyed using it.
I’ve used nano (and pico before it) because it gets installed by default and is usually available, even on BSD. I find it generally inoffensive, with most of the necessary functionality being clearly spelled out.
If I’m going to spend much time working with text in a terminal, I’m going to install emacs. I have no justification for it other than I got into it at one point in my life and I’m just used to it now. I’ve never been particularly productive in it, and I’ve never really gotten to the point where I can copy and paste without using the menu or just pasting into the terminal, but it still feels better than the other two.
That said, if I’m working with text in a terminal, it’s generally just to modify a config file here or there. If there’s anything that’s larger or actually important, then I should be checking it into source control. If I’m doing that, then there’s probably a better editing experience to be had locally. These days that usually means VS Code. It’s lightweight by today’s standards, as well as free and open source, and it supports just about anything I want to throw at it with very little trouble.
The Vintage Computer Festival Midwest happened recently:
I’ve been watching some of Veronica’s videos recently. She doesn’t cover vintage topics exclusively, but her enthusiasm and personality are extremely infectious. I think she adds a really nice perspective to this event.
LGR has a different perspective, both from being a previous attendee and also being an exhibitor. His points about growing pains are definitely relevant, and I hope the organizers are thinking about how they can address those for future events.
Overall I think this would be a fun event to attend, and seems extremely in line with the mood of this thread.
As Plotkin notes, the interpreter source code doesn’t have a lot of interesting, personal, or other revealing comments or artifacts. It does contain some unintentional commentary on what it was like trying to produce commercial software in the 1980s:
There’s a bunch of internal documentation about creating disks for the various platforms. Remember that in the 1980s, floppy disks were pretty incompatible between platforms. To write a C64 disk, you had to get the game data and interpreter onto a C64, which could then write it to disk. But how did you do that? No Wi-Fi, no Ethernet port… Infocom’s solution was to run a serial cable from their DEC-20 (where all the games were developed) to the C64 (or wherever). The serial transfer program is called “TFTP” in most of these folders. Do strings like com1:9600,n,8 turn you on? You might be a serial port!
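That com1:9600,n,8 string is the classic DOS-style serial spec: 9600 baud, no parity, 8 data bits. A toy decoder makes the shape obvious (a hypothetical helper for illustration, not anything from Infocom’s actual tooling):

```python
def parse_serial_params(spec: str) -> tuple[int, str, int]:
    """Parse a DOS-style serial spec like '9600,n,8' into
    (baud_rate, parity, data_bits)."""
    baud, parity, bits = spec.split(",")
    parity_names = {"n": "none", "e": "even", "o": "odd"}
    return int(baud), parity_names[parity.lower()], int(bits)

# The settings from the Infocom disk-mastering notes:
print(parse_serial_params("9600,n,8"))  # (9600, 'none', 8)
```

Both ends of the cable had to agree on all three values, or the transfer came out as garbage.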
Was kind of a toss-up between this going here and in the “get your game on” thread.
I’ve been surprised at how long-lasting the demand for DVDs is. I have a client who sells his online courses as a DVD and book combo in addition to streaming, and about half the people buy the DVD/Book version. I didn’t even know that many people still owned a DVD player.
I finally gave my external DVD drive (Mac SuperDrive) to my mom. She has the complete set of Perry Mason DVDs, and enjoys re-watching them. Now that she has given up cable, it’s the easiest way for her.
At our house, we went on a hunt to see if we had a DVD player, since we were considering renting Oppenheimer. But as it turns out, we could only “rent” the stream. But there IS a DVD player in the house. It has never been used.
We prefer DVDs to streaming because they have more permanence. I don’t like the streaming business because it requires a subscription, and which one do you choose? Britbox for BBC, or Paramount for Star Trek? Or what??? So we’ll probably still be getting DVDs, at least as long as they’re available.
We also have Perry Mason DVDs but haven’t re-watched them in a while. We have watched “Love Actually” every Christmas for quite a while now. That or “When Harry Met Sally” on New Year’s Eve.
Streaming was a pretty good option when it was just Netflix doing it… you might not find what you wanted at a specific time, but it was a single payment for a pretty broad selection. The main problem was that it was hard to predict what would be accessible, or for how long.
Since everyone else started wanting a piece of the pie, everything’s gone downhill fast. Now there are multiple walled gardens that all want their own “exclusive” content, forcing you to pay ever-climbing subscription fees for less content and even less certainty about how long you can access it. Disney’s recent decisions to drop content that had only recently been put out, for instance… and they’ve made themselves the only (legal) place to stream their shows, too.
Which is why I have my own Plex server, and mostly buy physical copies and rip them for myself. Not as easy to get new content, but at least I know everything on it will remain available as long as I care about it. But if physical copies end up getting phased out entirely, things are going to get downright dystopian.