Hacker News | canucker2016's comments

  The man, who was in his 50s and otherwise healthy, showed up at a hospital after the entire left side of his body abruptly went numb and he was left with clumsy, uncoordinated muscle movements (ataxia). His blood pressure was astonishingly high, at 254/150 mm Hg. For context, a normal reading is under 120/80, while anything over 180/120 is considered a hypertensive crisis, which is a medical emergency.

  Four weeks into his recovery, he was on five different drugs to try to bring down his blood pressure. At that point, doctors pushed for more lifestyle information from the man, who finally revealed that he had a habit of drinking an average of eight high-potency energy drinks every day.

  Each of the 16-ounce drinks was labeled as containing 160 mg of caffeine, a stimulant that can raise blood pressure. For reference, there’s about 90 mg of caffeine in a normal cup of coffee. So eight of these energy drinks would be 1,280 mg, the equivalent of a little more than 14 cups of coffee a day. But the doctors point out that the labeled amount of caffeine was only the “pure caffeine” amount. Such energy drinks contain additional ingredients, like guarana (a plant native to the Amazon used as a stimulant), that can contain “hidden caffeine.” Guarana is “thought to contain caffeine at twice the concentration of a coffee bean,” Coyle and Munshi write.
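The article's arithmetic can be sanity-checked using only the figures quoted above (labeled caffeine per can, cans per day, and caffeine per cup of coffee):

```python
# Check the quoted article's caffeine math.
labeled_mg_per_drink = 160   # labeled caffeine per 16-oz energy drink
drinks_per_day = 8
coffee_mg_per_cup = 90       # article's figure for a normal cup of coffee

total_mg = labeled_mg_per_drink * drinks_per_day   # 1,280 mg
cups_equivalent = total_mg / coffee_mg_per_cup     # a little more than 14 cups

print(f"{total_mg} mg/day, about {cups_equivalent:.1f} cups of coffee")
```

And as the article notes, that 1,280 mg counts only the labeled "pure caffeine," not the hidden caffeine in ingredients like guarana.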

I know a guy who used to drink a lot of frappé[0], more than 10-15 a day IIRC, so friggin' addicted, till the day he had a seizure while driving...

[0]https://en.wikipedia.org/wiki/Frapp%C3%A9_coffee


FTA:

  A CBC Toronto reporter rode the entire 10.3-kilometre line from east to west Monday morning, finding it took roughly 55 minutes to complete. As a reference point, over 400 runners ran this year's Toronto Marathon 10-kilometre event in under 55 minutes.

  CBC Toronto's eastbound return trip to Finch West Station was about eight minutes shorter, clocking in at roughly 47 minutes. Still, several riders Monday told CBC Radio's Metro Morning that the previous bus route on Finch Avenue W. was faster and had more stops along the way, making it easier to access.
Even at the scheduled 47-minute end-to-end time, athletic teens wouldn't have a problem outrunning the LRT.

A fount of knowledge about Microsoft's productivity application group history is Steve Sinofsky's blog, https://hardcoresoftware.learningbyshipping.com/

For p-code references, the relevant blog post is https://hardcoresoftware.learningbyshipping.com/p/003-klunde....

  There was a proprietary programming language called CSL, also named after CharlesS. This language, based on C, had a virtual machine, which made it easier to run on other operating systems (in theory) and also had a good debugger—attributes that were lacking in the relatively immature C product from Microsoft. 
CharlesS is Charles Simonyi, an ex-Xerox PARC employee hired away by Microsoft, where he worked on MS Word and created the Hungarian naming convention (the Apps version is the definitive version, not the bastardized, watered-down Systems version used in the Windows header files) - see https://en.wikipedia.org/wiki/Hungarian_notation.

The blog post included excerpts from internal MS docs for apps developers. An OCR version of one such page in the blog post follows:

  ====

  One of the most important decisions made in the development of Multiplan was the decision to use C compiled to pseudo-code (Pcode). This decision was largely forced by technological constraints. In early 1981, the microcomputer world was mainly composed of Apple II's and CP/M-80 machines; they had 8-bit processors, and 64K of memory was a lot; 128K was about the maximum. In addition, each of the CP/M-80 machines was a little different; programs that ran on one would not automatically run on another. Pcode made the development of ambitious applications possible; compiling to machine code would have resulted in programs too big to fit on the machines (even with Pcode it was necessary to do a goodly amount of swapping). It also allowed us to isolate machine dependencies in one place, the interpreter, making for very portable code (all that was necessary to port from one machine to another was a new interpreter). For Multiplan, this was an extremely successful strategy; it probably runs on more different kinds of machines than any other application ever written, ranging from the TI/99 to the AT&T 3B series.

  Of course, Pcode has its disadvantages as well, and we've certainly run into our share. One disadvantage is that it's slow; many of our products have a reputation for slowness for exactly that reason. There are of course ways to speed up the code, but to get a great deal of speed requires coding a goodly amount in tight hand-crafted assembly language. Another disadvantage is our Pcode's memory model. Since it was originally designed when most machines had very little memory, the original Pcode specification supported only 64K of data; it was not until Multiplan 1.1 was developed in early 1983 that Pcode was extended to support larger data spaces. A final disadvantage of Pcode is that we need our own special tools in order to develop with it; most obviously these include a compiler, linker, and debugger. In order to support these needs, there has been a Tools group within the Applications Development group almost from the beginning, and we have so far been largely unable to take advantage of development effort in other parts of the company in producing better compilers and debuggers. (It should be noted that the Tools group is responsible for considerably more than just Pcode support these days.)

  Although portability was one of the goals of using Pcode, it became apparent fairly early on that simply changing the interpreter was not sufficient for porting to all machines. The major problem lay in the different I/O environments available; for example, a screen-based program designed for use on a 24 by 80 does not adapt well to different screen arrangements. To support radically different environments requires radically rewriting the code; we decided the effort was worth it for two special cases: the TRS-80 Model 100 (first laptop computer) and the Macintosh. In retrospect, the Model 100 was probably not worth the effort we put into it, but the Macintosh proved to be an extremely important market.

  ====
Jon DeVaan's comment on the blog post, https://hardcoresoftware.learningbyshipping.com/p/003-klunde..., mentions:

  P-Code was a very important technology for Microsoft's early apps. Cross platform was one reason, as Steven writes. It was also very important for reducing the memory size of code. When I started, we were writing apps for 128k (K, not m or g) RAM Macs. There were not hard drives, only 400k floppy disks. (Did I mention we all had to live in a lake?)

  P-Code was much smaller than native code so it saved RAM and disk space. Most Macs had only one floppy disk drive. A market risk for shipping Excel was requiring 512k Macs with two disk drives which allowed for the OS and Excel code to live on the first drive and user's data on the second. Mac OS did not have code swapping functions, each app had to roll its own from memory manager routines, so the P-Code interpreter provided that function as well.

  On early Windows versions of Excel the memory savings aspect was extremely important. The size of programs grew as fast as typical RAM and hard disk sizes for many years so saving code size was a primary concern. Eventually Moore's Law won and compilers improved to where the execution trade-off was no longer worth it. When Windows 95 introduced 32 bit code these code size dynamics returned for a different reason – IO bandwidth. 16 bit Excel with P-Code outperformed 32 bit Excel in native code in any scenario where code swapping was needed. Waiting for the hard drive took longer than the ~7x execution overhead of the P-Code interpreter.
Another Jon DeVaan comment, https://hardcoresoftware.learningbyshipping.com/p/008-compet... :

   I am surprised to hear Steven say that the app teams and Excel in particular were looking in any serious way at the Borland tools. The reality was the CSL compiler had a raft of special features and our only hope of moving to a commercial tool was getting the Microsoft C team to add the features we needed. This was the first set of requirements that came from being the earliest GUI app developers. Because of the early performance constraints a lot of "tricks" were used that became barriers to moving to commercial tools. Eventually this was all ironed out, but it was thought to be quite a barrier at the time. About this time the application code size was starting to press the limits of the CSL P-Code system and we really needed commercial tools.
And Steve Sinofsky's reply:

  Technically it was the linker not the compiler. The Excel project was getting big and the apps Tools team was under resource pressure to stop investing in proprietary tools while at the same time the C Tools group was under pressure to win over the internal teams. It was *very* busy with the Systems team, particularly the NT team, on keeping them happy. We’re still 5 years away from Excel and Word getting rid of PCode. Crazy to think about. But the specter of Borland was definitely used by management to torment the Languages team who was given a mission to get Microsoft internally using its tools.

Back in 2014, 75% of Titanfall's disk space (35GB of 48GB) was UNCOMPRESSED audio, to accommodate gamers with only dual-core CPUs.

from https://www.escapistmagazine.com/titanfall-dev-explains-the-...

  “On a higher PC it wouldn’t be an issue. On a medium or moderate PC, it wouldn’t be an issue, it’s that on a two-core [machine] with where our min spec is, we couldn’t dedicate those resources to audio.”

It's also bullshit. Half-Life 2 didn't need uncompressed audio. MP3 decompression is damn near free.

Half-Life 2 also did it with half the minimum required cores, and 1 GHz less clock speed on that CPU. It released a decade before Titanfall 1. Sure, sure, it's got so much more going on, but uh, that much?

For a reference of how trivial it is for CPUs to decode MP3 files: software decoders take tens of MIPS. Remember that unit? That's less than one percent of one of those minimum-spec CPUs.

You know what's funny? The Source engine only supports MP3-compressed audio. Do you know what Titanfall 1 downloads and decompresses to create 30GB of audio data? Lossy-compressed, 160 kb/s OGG audio.
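The expansion ratio here is easy to sanity-check. Assuming the decompressed audio is 44.1 kHz, 16-bit stereo PCM (an assumption; the actual shipped format isn't stated in these comments), the math works out to roughly a 9x blow-up from 160 kb/s OGG:

```python
# Back-of-envelope: uncompressed PCM bitrate vs the 160 kb/s OGG sources.
# 44.1 kHz / 16-bit / stereo PCM is an assumed decompressed format.
sample_rate_hz = 44_100
bits_per_sample = 16
channels = 2

pcm_kbps = sample_rate_hz * bits_per_sample * channels / 1000  # 1411.2 kb/s
ogg_kbps = 160
ratio = pcm_kbps / ogg_kbps                                    # ~8.8x

uncompressed_gb = 35
print(f"PCM is ~{ratio:.1f}x larger; 35 GB of PCM came from "
      f"~{uncompressed_gb / ratio:.1f} GB of OGG downloads")
```

Which is roughly consistent with the downloaded OGG files being a few gigabytes, not tens.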


It was BS, considering countless other games had no problem with sound. Decoding something like Opus takes ~30 MHz of a single CPU core[1], meaning even the unreasonable situation of decoding 16 simultaneous, uninterrupted 128 kbit stereo streams would only eat half of one core.

[1] An iPod Classic (1998-era ARM9) decodes 128 kbps stereo Opus at ~150% real time at stock CPU frequency. Opus is not the lightest choice, either: https://www.rockbox.org/wiki/CodecPerformanceComparison#ARM
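The "half of one core" claim above is easy to check. Using the comment's ~30 MHz-per-stream estimate and a deliberately conservative ~1 GHz core clock (an assumption; a real minimum-spec desktop core would be faster, making the fraction even smaller):

```python
# Back-of-envelope: CPU cost of decoding many Opus streams at once.
# ~30 MHz per 128 kbps stereo stream is the parent comment's estimate;
# the 1 GHz core clock is an assumed, deliberately pessimistic figure.
mhz_per_stream = 30
core_clock_mhz = 1000
streams = 16

total_mhz = streams * mhz_per_stream          # 480 MHz
fraction_of_core = total_mhz / core_clock_mhz # ~0.48 of one core

print(f"{streams} streams: ~{total_mhz} MHz, "
      f"~{fraction_of_core:.0%} of a {core_clock_mhz} MHz core")
```

On a faster core the fraction only shrinks, so audio decoding was never going to be the bottleneck.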


Ah, that explains how the R1Delta guys were able to cut the size on disk to less than 20GB! https://github.com/r1delta/r1delta


Grug is a real dev; those memes weren't merely joking.

Isn't COD infamous for uncompressed assets as well, or am I misremembering?

I just assumed that the keyboard designer was a left-handed accounting/Excel geek (numeric keypad on the left side of the keyboard).


Eddie Murphy talked about the three movies that he regretted turning down:

- Ghostbusters (he was filming Beverly Hills Cop at the same time)

- Rush Hour

- Who Framed Roger Rabbit


Pro athletes also have higher divorce rates than the general population: 60-80% vs. 50% (source: NYTimes/Sports Illustrated).


Eudora was open-sourced in 2018.

see https://computerhistory.org/blog/the-eudora-email-client-sou...

and from https://en.wikipedia.org/wiki/Eudora_(email_client)

  The last 'mainline' (pre-OSE) versions of Eudora for Mac and Windows were open-sourced and preserved as an artefact by the Computer History Museum[2] in 2018; as part of the preservation, the CHM assumed ownership of the Eudora trademark.

  The only actively maintained fork of the software, known as Eudoramail as of June 2024, originates from 'mainline' Eudora for Windows as preserved by the CHM. Hermes, its current maintainers, describe Eudoramail 8.0 as currently being in alpha; Wellington publisher Jack Yan, meanwhile, points out its stability, a number of well-characterised and reproducible display bugs notwithstanding.
from https://en.wikipedia.org/wiki/Eudora_(email_client)#Hiatus_a...

  On May 22, 2018, after five years of discussion with Qualcomm, the Computer History Museum acquired full ownership of the source code, the Eudora trademarks, copyrights, and domain names. The transfer agreement from Qualcomm also allowed the Computer History Museum to publish the source code under the BSD open source license. The Eudora source code distributed by the Computer History Museum is the same except for the addition of the new license, code sanitization of profanity within its comments, and the removal of third-party software whose distribution rights had long expired.
recent news, see https://en.wikipedia.org/wiki/Eudora_(email_client)#Under_He...


The time period under discussion ("before Thunderbird", the heyday of Outlook lock-in, and, I would add, before Gmail) is well before 2018.

I used mutt at the time too, but I don't think it's in the same category as the graphical clients. For a while GNOME's Evolution was also big in free OS circles.


Eventually, and I was glad to see it, but way too late for it to matter much. I would've used Eudora when it was originally offered. Since I couldn't, I got comfortable with Thunderbird. And when my friends who used Eudora had to migrate off of it, I set them up with Thunderbird, too.


My parents like to tell the story about me getting bored at kindergarten class one day.

I grew up in a residential area in a city of several million people.

The teachers had let the kids out for recess. But even that amount of playful distraction didn't diminish my boredom that day.

So I went home.

In the middle of the school day.

Without the teacher finding out...

I had to cross a stroad to get home - two lanes each way. I can't recall if I crossed at the street light or at a crosswalk a few blocks away. But I made my way home unscathed. My mom was surprised when I showed up at home a few hours early.

The next few days at school, I could feel the teacher's eyes boring into my back as I played during recess. Definitely felt like I was being watched for a while. :)


At a party, I heard one person mention that a doctor had told him that he had BOTH North American Lyme disease AND European Lyme disease.

Lucky him.

He was at the doctor's office because of severe symptoms that he suspected were due to Lyme disease, since he had received several tick bites years ago.

