
To any other readers, if nxobject’s post reads as a non sequitur, it is because bitlax edited their post from its original “Only applies to citizens.” to “Citizens certainly do.” shortly after posting.


> Only applies to citizens.

If the right to a fair trial can be denied to any person, it can be denied to every person, simply by claiming that that person is part of a group not afforded a fair trial.

If you were arrested, how would you prove your citizenship without a trial?


You are correct that no such permission is required to use, lend, or resell a book, and it would be unethical for a seller to impose such a requirement. By the poster’s analogy, it is similarly unethical to require permission before the owner may use, lend, or resell a computer. Since Google sells computers that cannot later be used without Google’s permission, Google is imposing exactly such an unethical requirement.


> Pedantism is the lowest form of pseudo-intelligence.

You can’t just lay this bear trap of an opportunity and expect me to not pedantically state that the word is either “pedantry”, the activity performed by pedants, or “pedantic”, to describe such activities.

“Pedantism” would be a philosophy or viewpoint that extols pedantry. Pedantism would be to pedantry as deontology is to rule-following, a justification of an activity. As such, pedantism would be a slightly higher form of pseudo-intelligence than mere pedantry.

But only slightly.


I added Pedantism to my spell checker, so now it's not red anymore. Checkmate.


Not to be confused with pedentation which is

> indenting or quoting yourself in a way that makes it look more authoritative


Nitpick: O(n^2) is quadratic, not exponential. For it to “increase exponentially”, n would need to be in the exponent, such as O(2^n).
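A quick illustration of the difference, as a plain Python sketch with a few arbitrary values of n:

    # Quadratic vs. exponential growth for a handful of input sizes.
    for n in [10, 20, 30, 40]:
        quadratic = n ** 2     # O(n^2): 100, 400, 900, 1600
        exponential = 2 ** n   # O(2^n): ~1e3, ~1e6, ~1e9, ~1e12
        print(n, quadratic, exponential)

By n = 40, the quadratic term has grown 16x from its n = 10 value, while the exponential term has grown by a factor of about a billion.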


To contrast with exponential, the term is power law.


Yup. The most prominent example I can remember is the Beats By Dre headphones, where about a third of the weight comes from metal parts that are either purely decorative or whose functionality does not depend at all on being made of metal, based on a 2020 teardown [0,1].

[0]: https://beneinstein.medium.com/how-it-s-made-series-beats-by...

[1]: https://beneinstein.com/how-it-s-made-series-yup-our-beats-w...


> (note: I have no intention of using gold for beverage containers... I like my skin not blue)

I thought gold was biologically inert. Any chance you’re thinking of argyria, which is caused by exposure to elemental silver?


Gold salts (e.g., old arthritis treatment) can lead to gold poisoning, but I was really just making a joke. If you have a gold chalice in your cupboard, I think you’ll be alright.


> The wear and tear on tarmac is directly related to the weight of the vehicles that use it.

From empirical studies, damage to the road is proportional to the fourth power of axle weight. A bike with rider may weigh 200 pounds, whereas a passenger car weighs around 4000 pounds. That 20x difference in axle weight results in a 160,000x difference in damage to the road.

(That’s not even getting into semi trucks, which are around 40 tons fully loaded. Split across 5 axles rather than 2, that’s roughly 9x the axle load of a passenger car, leading to about 6,500x the damage to the road relative to a passenger car, or over a billion times that of a bike.)

[0] https://en.wikipedia.org/wiki/Fourth_power_law
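For anyone who wants to check the arithmetic, here is a quick Python sketch using the rough weights above (the ~9x truck axle load is taken from the estimate in the parenthetical, not a measured value):

    # Fourth power law: road damage scales roughly with (axle load)^4.
    bike_axle = 200 / 2        # ~200 lb bike + rider, spread over 2 axles
    car_axle = 4000 / 2        # ~4000 lb passenger car, 2 axles
    truck_axle = 9 * car_axle  # ~40 ton semi over 5 axles, roughly 9x a car's axle load

    print((car_axle / bike_axle) ** 4)    # 20^4  = 160,000  (car vs. bike)
    print((truck_axle / car_axle) ** 4)   # 9^4   = 6,561    (truck vs. car)
    print((truck_axle / bike_axle) ** 4)  # 180^4 ≈ 1.05e9   (truck vs. bike)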


Yeah, and that model is just wrong unless wear from axle weight is the dominant limit on a road's lifetime. There are many other lifetime limiters (like tree roots pushing up from below), and when road engineers plug bicycle axle weights into their usual formulas, they get designs that barely last a season, even when the path isn't used at all.


If the contract doesn’t correspond to what the salesperson said, wouldn’t the company still be on the hook for false advertising and/or fraud?


Not a data scientist, but my understanding is that restricting the training data for the initial training run often results in poorer inference, simply because the model sees less data. If you’re training the early layers of a model, you’re often learning rather abstract features, such as boundaries between different colors.

That said, there is a benefit to fine-tuning a model on a reduced dataset after the initial training. The initial training on the larger dataset means the model doesn’t get entirely lost in the smaller one.
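As a rough sketch of that "train on the large dataset first, then fine-tune on the small one" pattern (a minimal PyTorch example with toy random data standing in for both datasets; freezing the early layer during fine-tuning is just one common option, not the only way to do it):

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    # Toy stand-ins: a "large" pretraining set and a "small" fine-tuning set.
    big = TensorDataset(torch.randn(512, 3, 32, 32), torch.randint(0, 10, (512,)))
    small = TensorDataset(torch.randn(64, 3, 32, 32), torch.randint(0, 10, (64,)))

    model = nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),  # early layers: generic features
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(16, 10),                          # task-specific head
    )

    def train(model, dataset, lr, epochs):
        params = [p for p in model.parameters() if p.requires_grad]
        opt = torch.optim.Adam(params, lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for x, y in DataLoader(dataset, batch_size=32, shuffle=True):
                opt.zero_grad()
                loss_fn(model(x), y).backward()
                opt.step()

    train(model, big, lr=1e-3, epochs=2)    # initial training on the larger dataset
    for p in model[0].parameters():         # freeze the early conv layer...
        p.requires_grad = False
    train(model, small, lr=1e-4, epochs=2)  # ...then fine-tune on the reduced dataset

The key point is the ordering: the broad first pass gives the early layers something sensible to settle on before the small dataset gets its say.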

