
Location: San Francisco

Remote: Open to remote / in person / hybrid

Resume: https://davidmnoll.github.io/assets/David_Noll-Resume-982cc7...

Front End: TypeScript, React, Vue, Redux, Prisma, Jest, Cypress

Back End: Python, Node.js, Django, Flask, PHP; Some Scala, Haskell, Java, Rust

AI: LLM Integrations, Clustering with Embeddings, prototyped semantic search

DevOps: AWS, Azure, Docker, SQL, Linux; Some Terraform, Kubernetes, AWS CDK, Ansible

Software Developer with 10+ years of full stack & infrastructure experience. Recent work also includes prototyping and exploring AI applications. Background in cognitive science with a focus on cognitive linguistics.


Location: SF Bay Area

Remote: Open to remote / in person / hybrid / relocation

Resume: https://davidmnoll.github.io/assets/DavidNoll-Resume-1d19078...

Core Skills: Python, Typescript/JS/Node.js, React, SQL, Postgres, PHP, Some Java, Some AI, LLM/GPT integrations

Experienced Full Stack Developer with early startup experience and broad background. Available to start immediately.


Location: SF Bay Area

Remote: Open to remote / in person / hybrid

Resume: https://davidmnoll.github.io/assets/DavidNoll-Resume-1d19078...

Core Skills: Python, Typescript/JS/Node.js, React, SQL, Postgres, PHP, Java, Web3/blockchain, Django, LLM/GPT integrations

Developing Skills: Rust, Functional Programming, Haskell, Scala, C++, Solidity, Data science / statistics

Experienced Full Stack Developer with early startup experience and broad background. I have experience in many domains and am able to learn new domains quickly. I am available to start immediately.


Right but in chemistry class the way it’s taught via Gibbs free energy etc. makes it seem as if it’s an intrinsic property.


Entropy in physics is usually the Shannon entropy of the probability distribution over system microstates given known temperature and pressure. If the system is in equilibrium then this is objective.
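
To make that concrete, here is a small toy sketch of my own (made-up energy levels, nothing from the article): build the Boltzmann distribution over a few microstates at a fixed temperature and compute the Shannon entropy in bits next to the thermodynamic -k_B Σ p ln p. Same distribution, same formula up to the constant k_B and the choice of log base.

    # Toy sketch: Shannon entropy of an equilibrium (Boltzmann) distribution
    # over three hypothetical microstate energies at a fixed temperature.
    import math

    k_B = 1.380649e-23               # Boltzmann constant, J/K
    energies = [0.0, 1e-21, 2e-21]   # made-up microstate energies, J
    T = 300.0                        # temperature, K

    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)                 # partition function
    p = [w / Z for w in weights]     # equilibrium probabilities

    H_bits = -sum(pi * math.log2(pi) for pi in p)         # information entropy, bits
    S_thermo = -k_B * sum(pi * math.log(pi) for pi in p)  # Gibbs entropy, J/K
    print(H_bits, S_thermo)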


Entropy in Physics is usually either the Boltzmann or the Gibbs entropy, and both Boltzmann and Gibbs were dead before Shannon was born.


That's not a problem, as the GP's post is trying to state a mathematical relation, not a historical attribution. Often newer concepts shed light on older ones. As Baez's article says, Gibbs entropy is Shannon's entropy of an associated distribution (multiplied by the constant k).


It is a problem because all three come with baggage. Almost none of the things discussed in this thread are valid when discussing actual physical entropy, even though the equations are superficially similar. And then there are lots of people being confidently wrong because they assume that it’s just one concept. It really is not.


I don't see how the connection is superficial. Even the classical macroscopic definition of entropy as ΔS = ∫ dQ/T can be derived from the information-theory perspective, as Baez shows in the article (using entropy-maximizing distributions and Lagrange multipliers). If you have a more specific critique, it would be good to discuss.
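
For what it's worth, here is a quick numerical check of that Lagrange-multiplier argument (my own sketch, with arbitrary energy levels and an arbitrary mean energy, not Baez's worked example): maximizing -Σ p ln p subject to normalization and a fixed mean energy lands on Boltzmann weights p_i ∝ exp(-βE_i).

    # Maximize entropy subject to sum(p) = 1 and a fixed mean energy; the
    # optimum should match Boltzmann weights p_i ∝ exp(-beta * E_i).
    import numpy as np
    from scipy.optimize import minimize

    E = np.array([0.0, 1.0, 2.0, 3.0])   # arbitrary energy levels
    U = 1.2                              # imposed mean energy (made up)

    def neg_entropy(p):
        return np.sum(p * np.log(p))     # minimizing this maximizes -sum(p ln p)

    cons = [{"type": "eq", "fun": lambda p: np.sum(p) - 1.0},
            {"type": "eq", "fun": lambda p: np.dot(p, E) - U}]
    res = minimize(neg_entropy, np.full(len(E), 0.25), method="SLSQP",
                   bounds=[(1e-9, 1.0)] * len(E), constraints=cons)
    p_opt = res.x

    # Recover beta from the first two levels and rebuild the Boltzmann weights
    beta = np.log(p_opt[0] / p_opt[1]) / (E[1] - E[0])
    p_boltz = np.exp(-beta * E) / np.sum(np.exp(-beta * E))
    print(np.round(p_opt, 4), np.round(p_boltz, 4))   # should agree closely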


In classical physics there is no real objective randomness. Particles have a defined position and momentum and those evolve deterministically. If you somehow learned these, then the Shannon entropy is zero. If entropy is zero then all kinds of things break down.

So now you are forced to consider e.g. temperature an impossibility without quantum-derived randomness, even though temperature does not really seem to be a quantum thing.


> If entropy is zero then all kinds of things break down.

Entropy is a macroscopic variable, and if you allow microscopic information, strange things can happen! One can move from a high-entropy macrostate to a low-entropy macrostate if you choose the initial microstate carefully. But this is not a reliable process which you can reproduce experimentally, i.e. it is not a thermodynamic process.

A thermodynamic process P is something which takes a macrostate A to a macrostate B, independent of which microstate a0, a1, a2, ... in A you started off with. If the process depends on the microstate, then it wouldn't be something we would recognize, as we are looking from the macro perspective.


> Particles have a defined position and momentum

Which we don’t know precisely. Entropy is about not knowing.

> If you somehow learned these, then the Shannon entropy is zero.

Minus infinity. Entropy in classical statistical mechanics is proportional to the logarithm of the volume in phase space. (You need an appropriate extension of Shannon’s entropy to continuous distributions.)

> So now you are forced to consider e.g. temperature an impossibility without quantum-derived randomness

Or you may study statistical mechanics :-)


> Which we don’t know precisely. Entropy is about not knowing.

No, it is not about not knowing. This is an instance where the intuition from Shannon’s entropy does not translate to statistical Physics.

It is about the number of possible microstates, which is completely different. In Physics, entropy is a property of a bit of matter, it is not related to the observer or their knowledge. We can measure the enthalpy change of a material sample and work out its entropy without knowing a thing about its structure.

> Minus infinity. Entropy in classical statistical mechanics is proportional to the logarithm of the volume in phase space.

No, 0. In this case, there is a single state with p = 1, and S = −k Σ p ln(p) = 0.

This is the same if you consider the phase space because then it is reduced to a single point (you need a bit of distribution theory to prove it rigorously but it is somewhat intuitive).

The probability p of a microstate is always between 0 and 1, therefore p ln(p) is never positive and S is never negative.

You get the same using Boltzmann’s approach, in which case Ω = 1 and S = k ln(Ω) is also 0.
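
A trivial check of the discrete case being described here (a single microstate known with certainty, in both the Gibbs and Boltzmann forms):

    # Single microstate known with certainty: both discrete formulas give 0.
    import math

    k_B = 1.380649e-23   # J/K

    p = [1.0]                                              # one state, p = 1
    S_gibbs = -k_B * sum(pi * math.log(pi) for pi in p)    # -k Σ p ln p
    S_boltzmann = k_B * math.log(1)                        # k ln Ω with Ω = 1
    print(S_gibbs, S_boltzmann)                            # 0.0 0.0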

> (You need an appropriate extension of Shannon’s entropy to continuous distributions.)

Gibbs’ entropy.

> Or you may study statistical mechanics

Indeed.


>>> Particles have a defined position and momentum [...] If you somehow learned these, then the Shannon entropy is zero.

>> Entropy in classical statistical mechanics is proportional to the logarithm of the volume in phase space [and diverges to minus infinity if you define precisely the position and momentum of the particles and the volume in phase space goes to zero]

> [It's zero also] if you consider the phase space because then it is reduced to a single point (you need a bit of distribution theory to prove it rigorously but it is somewhat intuitive).

> The probability p of an microstate is always between 0 and 1, therefore p ln(p) is always negative and S is always positive.

The points in the phase space are not "microstates" with probability between 0 and 1. It's a continuous distribution and if it collapses to a point (i.e. you somehow learned the exact positions and momentums) the density at that point is unbounded. The entropy is also unbounded and goes to minus infinity as the volume in phase space collapses to zero.

You can avoid the divergence by dividing the continuous phase space into discrete "microstates" but having a well-defined "microstate" corresponding to some finite volume in phase space is not the same as what was written above about "particles having a defined position and momentum" that is "somehow learned". The microstates do not have precisely defined positions and momentums. The phase space is not reduced to a single point in that case.

If the phase space is reduced to a single point I'd like to see your proof that S(ρ) = −k ∫ ρ(x) log ρ(x) dx = 0
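
To illustrate the divergence being claimed here (my own toy numbers, with a 1-D Gaussian standing in for the phase-space distribution): the differential entropy h = ½ ln(2πeσ²) heads to minus infinity as the distribution collapses toward a point.

    # Differential entropy of a 1-D Gaussian as it collapses toward a point.
    import math

    for sigma in [1.0, 1e-3, 1e-6, 1e-9]:
        h = 0.5 * math.log(2 * math.pi * math.e * sigma**2)
        print(f"sigma={sigma:g}  h={h:.2f} nats")   # -> minus infinity as sigma -> 0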


I hadn't realized that "differential" entropy and Shannon entropy are actually different and incompatible, huh.

So the case I mentioned, where you know all the positions and momentums, has 0 Shannon entropy and -Inf differential entropy. And a typical distribution will instead have Inf Shannon entropy and finite differential entropy.

Wikipedia has some pretty interesting discussion about differential entropy vs. the limiting density of discrete points, but I can't claim to understand it and whether it could bridge the gap here.
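
A rough way to see the gap (my own sketch, using a standard Gaussian, not anything from the Wikipedia article): quantize the continuous density into bins of width Δ and the discrete Shannon entropy grows like h − ln Δ, blowing up as Δ → 0, so the two quantities can't simply be equated.

    # Discretize a standard Gaussian into bins of width delta; the discrete
    # Shannon entropy tracks h - ln(delta) and diverges as delta -> 0.
    import math

    def gaussian_pdf(x, sigma=1.0):
        return math.exp(-x**2 / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))

    h_diff = 0.5 * math.log(2 * math.pi * math.e)   # differential entropy, sigma = 1

    for delta in [1.0, 0.1, 0.01]:
        xs = [i * delta for i in range(int(-10 / delta), int(10 / delta) + 1)]
        probs = [gaussian_pdf(x) * delta for x in xs]   # approximate bin probabilities
        H = -sum(p * math.log(p) for p in probs if p > 0)
        print(f"delta={delta:<5} H_discrete={H:.3f}  h - ln(delta)={h_diff - math.log(delta):.3f}")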


> So the case I mentioned, where you know all the positions and momentums, has 0 Shannon entropy

No, Shannon entropy is not applicable in that case.

https://en.wikipedia.org/wiki/Entropy_(statistical_thermodyn...

Quantum mechanics solves the issue of the continuity of the state space. However, as you probably know, in quantum mechanics all the positions and momentums cannot simultaneously have definite values.


> possible microstates

Conditional on the known macrostate. Because we don’t know the precise microstate - only which microstates are possible.

If your reasoning is that « experimental entropy can be measured so it’s not about that » then it’s not about macrostates and microstates either!


> In Physics, entropy is a property of a bit of matter, it is not related to the observer or their knowledge. We can measure the enthalpy change of a material sample and work out its entropy without knowing a thing about its structure.

Enthalpy is also dependent on your choice of state variables, which is in turn dictated by which observables you want to make predictions about: whether two microstates are distinguishable, and thus whether they are part of the same macrostate, depends on the tools you have for distinguishing them.


A calorimeter does not care about anyone’s choice of state variables. Entropy is not only something that exists in abstract theoretical constructs, it is something we can get experimentally.


That's actually the mainstream view; saying that the information-theoretic and stat-mech entropies are the same is the outlier position, most popularized by Jaynes.


If information-theoretical and statistical mechanics entropies are NOT the same (or at least, deeply connected) then what stops us from having a little guy[0] sort all the particles in a gas to extract more energy from them?

[0] https://en.wikipedia.org/wiki/Maxwell%27s_demon


Sounds like a non-sequitur to me; what are you implying about the Maxwell's demon thought experiment vs the comparison between Shannon and stat-mech entropy?


This matches up well with my hypothesis that language developed as a mediator of institutional behavior and not the other way around. https://spacechimplife.com/institutional-code-and-human-beha...


Stating the mentalese hypothesis as fact is a bit tenuous


Sure; as in most areas involving intelligence, much work is yet to be done. However, mentalese, or the Language of Thought Hypothesis, occupies a much stronger position than the hypothesis that human language itself is our fundamental engine of reason, which is almost assuredly not true. This last fact has serious implications for the valuations of the majority of AI companies.


As I replied to the parent comment: this was at least 200k years before speech, if you go by the hyoid bone evidence.


This was at least 200k years before the advent of speech if you go by the hyoid bone evidence.


What if the first language was a sort of sign language, and vocalizations were only auxiliary, optional? With time, humans who vocalized had a better chance to be understood, and their vocal tract evolved. Humans instinctively gesture to this day when speaking.


This has been hypothesized, but the fact that there aren't many or any purely sign languages still around, and that other primates don't show signs of using signs, makes that seem like a reach, IMO.


>other primates don’t show signs of using signs

other primates don't show signs of using vocal language either, yet humans have it

>there aren’t many or any purely sign languages still around

we're the only human species that survived; maybe Neanderthals/Denisovans etc. used sign language, who knows

maybe the fact that we started using vocal language gave us a big evolutionary advantage, driving other similar species without developed vocal tracts extinct (through our warfare/assimilation)

deaf/mute people around the world have always historically come up with ways to talk using signs (different unrelated systems), i.e. we still have the means to do it, but it's unnecessary when you can produce sounds (freeing your hands to do work)


Yes, these are the arguments in support of the idea. I'm not going to dismiss it completely, but it's not at the top of the list of likelihood, IMO.


I would consider this evidence that language predates human vocalizations. We already know deaf children will invent signs and gestures to communicate, and language is not dependent on a hyoid bone. Do we have any way of dating the relevant neural structures through genetics?


I think we’d have trouble because we’d have to tie the gene to a specific linguistic cognitive function. My hypothesis is that humans configured themselves into self-replicating group structures I’d call institutions, and language evolved as a way to facilitate that. These institutions exhibit all the thermodynamic properties of life, and they have goal directed behavior independent of individual humans.


FOXP2 was thought to be the genetic basis for language, but this has been overturned[1]. As far as I know there isn't a good candidate for a gene selected for language.

[1] https://www.the-scientist.com/language-gene-dethroned-64608


Location: Louisville, KY

Remote: Yes

Willing to relocate: Yes

Resume: https://davidmnoll.github.io/assets/pdf/DavidNoll-Resume.pdf

Email: davidmnoll@gmail.com

Github: https://github.com/davidmnoll

Experienced full stack developer, looking for a position with a collaborative culture in a technical environment.

Front End: Typescript, React, Jest, Cypress

Back End: Python, Django, Flask, Node

DevOps: AWS, Docker, databases, CI, Terraform, Kubernetes


Location: Louisville, KY / SF Bay Area, CA

Remote: Yes

Willing to relocate: Yes

Email: davidmnoll@gmail.com

Resume: https://davidmnoll.github.io/assets/DavidNoll-Resume-9d9913c...

Github: https://github.com/davidmnoll

Experienced full stack developer, looking for a position in a collaborative culture with a strong team.

Previously lead engineer at an NYC start-up. Led a remote team of 3 other developers. Set up the CI/CD pipeline & testing. Worked on problems around scaling, testing, maintainability, and security, as well as new feature delivery.


