thewayne: (Default)
The code was written by Joseph Weizenbaum, a German Jew whose family fled Nazi Germany for the USA; he studied at Wayne State University in Detroit. He wrote ELIZA in MAD-SLIP (Michigan Algorithm Decoder plus his own Symmetric List Processor extension) in only about 420 lines of code! It was quickly translated into Lisp, a language well regarded for AI work. His work on SLIP earned him an associate professorship at MIT, where he ultimately wrote ELIZA; the post became a tenured professorship within four years. He also held academic appointments at Harvard, Stanford, the University of Bremen, and elsewhere. He passed away in 2008 and is buried in Germany.

From the article, "Experts thought the original 420-line ELIZA code was lost until 2021, when study co-author Jeff Shrager, a cognitive scientist at Stanford University, and Myles Crowley, an MIT archivist, found it among Weizenbaum's papers.

"I have a particular interest in how early AI pioneers thought," Shrager told Live Science in an email. "Having computer scientists' code is as close to having a record of their thoughts, and as ELIZA was — and remains, for better or for worse — a touchstone of early AI, I want to know what was in his mind." But why the team wanted to get ELIZA working is more complex, he said.


They go on to talk about building an emulator to simulate the computers from the 1960s to run the code properly, and discovering and deciding to keep in place a bug in the code.

Pretty cool stuff. And only 420 lines of code!
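For a sense of why 420 lines was enough: ELIZA's core trick is keyword spotting plus template "reassembly" of the user's own words. Here is a minimal, hypothetical Rust sketch of the idea; the keywords and templates are my own illustrations, not Weizenbaum's actual rules.

```rust
// Toy ELIZA-style responder: find a keyword, then splice the text that
// followed it into a canned reply template ("reassembly").
fn respond(input: &str) -> String {
    let lower = input.to_lowercase();
    // Each rule: (keyword, reply template); "*" is filled with whatever
    // followed the keyword in the user's sentence.
    let rules = [
        ("i am ", "How long have you been *?"),
        ("i feel ", "Why do you feel *?"),
        ("my ", "Tell me more about your *."),
    ];
    for (key, template) in rules {
        if let Some(pos) = lower.find(key) {
            let rest = lower[pos + key.len()..].trim_end_matches('.');
            return template.replace("*", rest);
        }
    }
    // No keyword matched: fall back to a content-free prompt.
    "Please go on.".to_string()
}

fn main() {
    // prints: How long have you been sad about my code?
    println!("{}", respond("I am sad about my code."));
}
```

The real program added ranked keywords, pronoun swapping, and scripted rule files (the famous DOCTOR script), but the skeleton above is the whole illusion.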

https://www.livescience.com/technology/eliza-the-worlds-1st-chatbot-was-just-resurrected-from-60-year-old-computer-code

https://slashdot.org/story/25/01/18/0544212/worlds-first-ai-chatbot-eliza-resurrected-after-60-years


Weizenbaum was an interesting person with some thoughtful philosophies regarding computers and AI, about which he had real apprehensions. Two movies were made about him, and he published several books. His Wikipedia page is worth a read, IMO.

https://en.wikipedia.org/wiki/Joseph_Weizenbaum
thewayne: (Default)
I wrote about this quite a bit when it was happening a year ago.

One year ago almost to the day, Unity, a video game development system, announced that it would start charging a runtime fee: essentially a charge for every installation of a game built with it. Previously, licensing was based on developer seats, and in many cases was free for tiny studios or solo developers.

The blowback was indescribable, with many studios threatening to stop development in Unity or to switch to other engines. There were resignations at the top levels of Unity, and the company's stock, public since its IPO, took a pummeling.

Now they've decided to revert to their per-seat licensing model and, as of the start of the new year, will increase the cost of their Pro and Enterprise tiers. Those levels come with increased support and access, making them somewhat worth the money.

Of course, the big question is: how much did their subscriber base shrink over this incredibly stupid move? If I were a developer who had been burned by them and had already switched to another platform, I wouldn't go back. Of course, if the other platform wasn't meeting needs and expectations, that's a different issue. Lots to consider. Fortunately, that is not the type of programming I do, and I am ever so glad of it! Commercial video game production is a horrible industry!

https://www.gamedeveloper.com/business/unity-is-killing-its-controversial-runtime-fee

https://tech.slashdot.org/story/24/09/12/1615225/unity-is-killing-its-controversial-runtime-fee
thewayne: (Default)
First off, it has to be pointed out that this is a specialized AI model designed for programmers, not a generalized model like ChatGPT et al.

IBM trained it specifically on open source code it explicitly had permission to use, basically bending over backwards to avoid any possible legal issues. And they now have working models that they've released to the public! Granite was trained on 116 different programming languages, and the models range from 3 to 34 billion parameters. I wonder if you can ask it to list all the languages it's trained in; I'll bet there are some pretty esoteric ones in there! I'd love it if it had MUMPS! (I once found a book on MUMPS programming at the Phoenix Public Library; I imagine it's been weeded by now.)

Anyway, interesting article. It describes how it was trained, etc., but one of the more interesting bits was saying that in the rather short time since ChatGPT et al have appeared and everyone started creating their own LLMs, the cost for training up an LLM has dropped from millions of dollars to thousands! That's a pretty impressive scale drop.

https://www.zdnet.com/article/ibm-open-sources-its-granite-ai-models-and-they-mean-business/
thewayne: (Default)
The organization Women Who Code provided scholarships to tens of thousands of women around the world over its lifetime, and helped women get a proverbial foot in the door through peer networking. The Board announced that, effective almost immediately, the org is dissolving due to its inability to attract funding.

I started working in IT around 1983 or so. I've never seen many women in programming or in IT in general, but the ones I've worked with were always quite good programmers and IT people. I'm sorry to see the organization go.

https://www.bbc.com/news/articles/cw0769446nyo

https://tech.slashdot.org/story/24/04/19/2024202/women-who-code-shuts-down-unexpectedly
thewayne: (Default)
This is a good move: they're migrating some core code libraries from C# to Rust. Despite the name, C# is not directly derived from C and C++; it has characteristics of several languages. The language is pronounced SEE-SHARP, and the designers wanted to use the musical sharp sign (♯), but since that doesn't exist on most keyboards, they compromised: they kept the name and used the hash symbol (#) instead.

The job's responsibilities are described as including "guiding technical direction, design and implementation of Rust component libraries, SDKs, and re-implementation of existing global scale C# based services to Rust."

The goodness is that Rust is a very strict language when it comes to memory safety. Lots of languages are pretty loose about enforcing memory allocation and access; Rust is definitely not, and that looseness is what gives hackers open doors into lots of systems. This is also why the Linux kernel is now accepting Rust code alongside its C, and why Microsoft is doing the same with the Windows operating system. But it's a very slow process: there are far fewer Rust programmers than C/C++ programmers, so it's a slow slog.
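To make that memory-strictness point concrete, here is a tiny Rust sketch (my own illustration, nothing to do with Microsoft's actual code) of the kind of bug the compiler simply refuses to build:

```rust
fn main() {
    let data = vec![1, 2, 3];

    // Ownership of the heap buffer moves to `moved`; `data` is now dead.
    let moved = data;

    // Uncommenting the next line is a compile error (E0382: use of moved
    // value), so a use-after-free or double-free can't even be written:
    // println!("{:?}", data);

    // Borrows are checked too: while `first` (a shared reference) is
    // alive, the compiler forbids mutating or dropping `moved`.
    let first = &moved[0];
    assert_eq!(*first, 1);

    println!("len = {}", moved.len());
}
```

In C or C++ the equivalent dangling-pointer mistakes compile fine and become the exploitable memory-corruption bugs the post is talking about; in Rust they never make it past the compiler.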

If you know anyone who says they want to be a programmer, and they're serious about it, Rust and systems programming would be a very financially rewarding line to explore. Hard work, but well-paying.

https://www.theregister.com/2024/01/31/microsoft_seeks_rust_developers/
thewayne: (Default)
It's called Intel One Mono, and at first glance it's pretty decent, though I'm not too sure I care for the lowercase L. It has an extremely unrestrictive license, and there are plenty of instructions for downloading it from GitHub and incorporating it into most major editors.

https://www.omgubuntu.co.uk/2023/06/intel-one-mono-font

https://github.com/intel/intel-one-mono

https://developers.slashdot.org/story/23/06/10/030224/intel-open-sources-new-one-mono-font-for-programmers

Page generated May. 30th, 2025 03:22 am
Powered by Dreamwidth Studios