thewayne: (Default)
I have this really nifty Energizer battery: it has a digital display, and when you press a button on the side it tells you how charged it is. Two USB-A ports on top, one of those trapezoidal ports (presumably micro-USB) also on top, and a proper round USB-C port on the side.

I'm in Bethesda, Maryland right now at the National Institutes of Health, and yesterday my iPad was a touch low at lunch, around 40%, with another 4+ hours of afternoon appointments and reading while I was idle. So I plugged a C-to-C cable into the side port to charge my iPad.

And it drained the iPad trying to charge the battery!

Fortunately I noticed the alert on the iPad when it hit 10% and warned that power was low. Unfortunately I had left my laptop at the hotel to save weight, since I'd be schlepping things around for 9 or 10 hours at the clinic. I made it through the day okay.

Tonight, in an attempt to determine whether I just don't know how to use the battery or whether it's mis-wired, I plugged an A-to-C cable into one of the USB-A ports and connected it to my iPhone 16. THAT seems to work properly! The battery is down 10% and the phone is up about 3%. A bit disproportionate, I admit.

So I finally looked up the web page for it. Heaven forbid it be on the energizer.com web site! I had to search for the model number, 'energizer ue10068'. And there it confirms that the two USB-A ports on top are the output ports and the other two are the charging ports.

I would say that maybe someday I'll start reading manuals, but we know that ain't gonna happen. :-) At least I now definitely know what sort of cable I need to carry in my backpack if I expect to use my portable battery.
thewayne: (Default)
The legend? The Zilog Z80 CPU.

Talk about a heck of a run! Could you imagine Intel still making Pentium IIs today? But the Z80 kept truckin' along for almost five decades! Talk about an incredible design. While it was a general-purpose CPU like those made by Intel and AMD and others today, its low power consumption and well-understood programming model and foibles made it very popular for embedded device controllers. I told a friend of mine who thinks he's a tech geek and holds a degree in EE, and he'd never heard of it! He's slightly younger than me, but not that much; he was just never a generalist. The Z80 was a backbone of the CP/M and MP/M operating systems and the S-100 bus architecture, which was what much of personal computing ran on in the '70s and early '80s, until the IBM PC and the Mac revolutionized things and brought it all to the rest of us.

From the Techspot article: "Federico Faggin, an Intel engineer, founded Zilog in 1974 after his work on the Intel 4004, the first 4-bit CPU. The Zilog Z80 was then released in July 1976, conceived as a software-compatible "extension" and enhancement of the Intel 8080 processor.

Developed by a team of just 12 people, the Z80 saw remarkable success, leading Zilog to establish its own chip manufacturing plants and expand to over a thousand employees within two years. Like its Intel counterpart, the Z80 was originally designed for embedded systems but went on to become a significant milestone in gaming hardware from the 1970s to the mid-1980s."


and

"Several home computers and gaming consoles were built around the capabilities of the Z80, including Sega's Master System and SG-1000, and Nintendo's Game Boy and Game Boy Color. Many classic arcade games also used the Z80, including the original version of Pac-Man. Additionally, the 8-bit processor was common in military applications, musical synthesizers like the Roland Jupiter-8, and various other electronic devices."

So pour one out for - but not on! - the Z80.

While the Z80 is going away, its legacy lives on in the eZ80 and newer iterations of the classic chip.

https://www.techspot.com/news/102684-zilog-discontinuing-z80-microprocessor-after-almost-50-years.html

https://hardware.slashdot.org/story/24/04/20/1916203/the-legendary-zilog-z80-cpu-is-being-discontinued-after-nearly-50-years
thewayne: (Default)
Storage cards are insane, and I am very appreciative of how the prices have come down. In my new camera I have two 256 gig cards, and I have the camera write the same image to both simultaneously as a backup in case one card fails; since I'm recording in JPEG-only, most of the time it shows room for over 10,000 images.

I don't think I'll be buying a 4 TB card any time soon.

Not to mention, how many batteries would you need to fill a card with that many images?!
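For fun, here's a back-of-envelope sketch in C of that question. Every number in it is an assumption of mine rather than anything from the article: roughly 25 MB per JPEG (which lines up with "over 10,000 images" on a 256 gig card) and a hypothetical ~700 shots per battery charge, which is in the ballpark of typical CIPA ratings for a mirrorless body.

```c
/* Back-of-envelope: how many battery charges to fill a 4 TB card?
 * All numbers are illustrative assumptions, not specs from the article. */
#include <stdio.h>

int main(void) {
    const double card_gb          = 4096.0; /* a 4 TB card */
    const double mb_per_image     = 25.0;   /* assumed JPEG size */
    const double shots_per_charge = 700.0;  /* assumed battery rating */

    double images  = card_gb * 1024.0 / mb_per_image;
    double charges = images / shots_per_charge;

    printf("Images on the card: ~%.0f\n", images);
    printf("Battery charges to shoot that many: ~%.0f\n", charges);
    return 0;
}
```

At those made-up numbers it works out to roughly 167,000 images and about 240 battery charges, which is really the point: the card outlives the shooting day by a couple of orders of magnitude.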

There's a curious thing mentioned in the article that gets roundly ridiculed, and one thing that is very troubling. The curious thing is the transfer speed: it could be better. These cards may not be ideal for high-speed shooters, but we'll know more closer to release, when they start getting tested. The troubling thing is that Western Digital bought SanDisk, and people have been very unhappy with their memory cards of late, experiencing phantom failures where the card just dies for no apparent reason.

Hence my decision: since my camera has two memory card slots, why not give myself some redundancy.

And no, my cards are not SanDisk, they're PNY.

Still, an interesting development in memory cards. Could be very beneficial to people who produce video.

https://arstechnica.com/gadgets/2024/04/sd-cards-finally-expected-to-hit-4tb-in-2025/
thewayne: (Default)
To hell with those decadent Westerners and their XBoxes and PlayStations and their LGBTQ propaganda! We're going to go with solid Russian craftsmanship and story-telling!

Now, I have absolutely no doubt that Russian game developers could create some truly compelling stories. Every nation has great story-tellers. And there's no doubt that they have great programmers, though far too many are involved in cybercrime. I think their goal of producing such a console by a '26-'27 deadline is perhaps overly ambitious, but hey, what do I know?

https://gamerant.com/russia-gaming-consoles/

The Slashdot comments are amusing:
https://games.slashdot.org/story/24/03/29/2244215/russia-is-making-its-own-gaming-consoles


Oh! I know what I know! Russian chip foundries are foundering with a chip packaging defect rate of 50%! And that's PACKAGING the chips, not MAKING them. The Chinese are making the chips and sending them to Russia for PACKAGING. So the Russians are receiving wafers (I assume) that contain hundreds of chips, which have to be precisely cut apart and then packaged into housings with leads attached for later integration into circuit boards and such for use in various electronic devices.

Except they can't do it reliably in large batches. Apparently they can handle small batches okay, but large batches are beyond their ability.

The problem seems to be quality control, calibration of the devices, and workforce skill set.

Clearly first-world problems.

Oh, I forgot. Russia isn't a first-world nation. Except they have nukes, and a seemingly nutso war-monger leader. There are lots of brilliant scientists and engineers in Russia, and I feel sorry for them living in such constraints. We've had several Russian astronomers who've worked at the observatory, and I've worked with Russian programmers before. Brilliant people, once you figure out how to work with the language barriers.

This is why I mourned the turn they took when the nutjob former KGB station chief became the leader. I knew he'd never let go. They had a chance of turning around Russia when the USSR fell apart, they had a chance of becoming a free nation and elevating themselves, but then the criminal class took over and it became a kleptocracy, and it's now a mess.

I really can't see things improving until there's another October Revolution and the people literally seize the state again, which will be a massive bloodbath. Maybe they can start over, maybe the criminal class will simply seize power again.

It's a little unclear whether these chips are strictly consumer-grade or possibly intended for military use. A lot of military applications don't need anything much more sophisticated than an 8088, but when you're talking drones or night-vision goggles, you need much more modern chips and packaging.

https://www.tomshardware.com/pc-components/cpus/half-of-russian-made-chips-are-defective-baikal-struggles-to-meet-russias-demand
thewayne: (Default)
This article appeared about a month ago. It's pretty cool: using an Arduino, some C and JavaScript code, and a little motor control, he has the laser scanning in one axis while being moved along the other, producing a monochrome image.

I haven't watched the video in the Gizmodo article, so I'm curious whether he harvested anything from the Blu-ray player aside from the laser and maybe the power supply. It would probably be better to buy a stepper motor controller for this purpose than to try to repurpose one from the player, but at least you'd know that the player's power supply meets the laser diode's specs, and you could probably tap it to power the rest of your gadget.
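To give a feel for the raster-scan idea, here's a minimal Arduino-style C sketch. This is not his code; the pin numbers, step counts, settle delays, and the photodiode-on-A0 readout are all my assumptions, and the slow second axis is left out entirely.

```c
/* Minimal sketch of the raster-scan idea, not the actual project code.
 * Assumptions: the sled/stage is driven by a stepper through a step/dir
 * driver on pins 2 and 3, and the reflected laser light is read as an
 * analog voltage from a photodiode amplifier on A0.  Each scan line is
 * streamed over Serial; a host-side script assembles the lines into a
 * monochrome image. */
const int STEP_PIN   = 2;
const int DIR_PIN    = 3;
const int SENSOR_PIN = A0;    /* photodiode / sensor amplifier output */
const int LINE_STEPS = 200;   /* samples (pixels) per scan line */

void stepOnce(bool forward) {
  digitalWrite(DIR_PIN, forward ? HIGH : LOW);
  digitalWrite(STEP_PIN, HIGH);
  delayMicroseconds(500);
  digitalWrite(STEP_PIN, LOW);
  delayMicroseconds(500);
}

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(DIR_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  /* Sweep one line: step, let the optics settle, sample brightness. */
  for (int i = 0; i < LINE_STEPS; i++) {
    stepOnce(true);
    delay(2);
    int brightness = analogRead(SENSOR_PIN);  /* 0..1023 */
    Serial.print(brightness);
    if (i < LINE_STEPS - 1) Serial.print(',');
    else Serial.println();
  }
  /* Return to the start of the line; advancing the slow axis between
   * lines (the second motor) is omitted from this sketch. */
  for (int i = 0; i < LINE_STEPS; i++) stepOnce(false);
}
```

A host-side script (the JavaScript half, presumably) could then read that serial stream and paint each line of brightness values into a canvas to build up the monochrome image.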

https://gizmodo.com/blu-ray-player-scanning-laser-microscope-hack-youtube-1849914455

https://tech.slashdot.org/story/22/12/21/2245214/old-blu-ray-players-can-be-turned-into-microscopes
