Discover and read the best of Twitter Threads about #DNN

Most recent (3)

The impressive deep pattern-recognition abilities of #DNNs, such as #LLMs, are sometimes mistaken for reasoning abilities.

I can learn to guess, with high accuracy, whether a SAT instance is satisfiable or not, but this is not the same as knowing how to solve SAT. Let me explain. 1/
Suppose you train a learner with a large number of Boolean 3-SAT instances labeled with whether or not they are satisfiable. There is no reason to doubt that a modern #DNN-based learner will manage to learn deep features corresponding to the ratio γ = #clauses/#variables .. 2/
..and armed with γ, it can also essentially figure out the sharp-threshold phenomenon w.r.t. γ, and should be able to predict with high confidence that instances with γ < 4.3 are satisfiable and those with γ > 4.3 are unsatisfiable. 3/
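A minimal sketch of the point being made (not from the thread itself): a "predictor" that only computes γ = #clauses/#variables and applies the ~4.3 threshold can score well against ground truth from brute-force checking on small random 3-SAT instances, yet it never produces a satisfying assignment. All names, sizes, and the sweep range below are illustrative assumptions; the threshold is blurred at small n by finite-size effects.

```python
# Hypothetical sketch: compare a gamma-threshold "classifier" against
# brute-force ground truth on small random 3-SAT instances.
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """Generate a random 3-SAT instance as a list of 3-literal clauses."""
    clauses = []
    for _ in range(n_clauses):
        vars_ = rng.sample(range(1, n_vars + 1), 3)
        clauses.append([v if rng.random() < 0.5 else -v for v in vars_])
    return clauses

def brute_force_sat(n_vars, clauses):
    """Ground truth: try all 2^n assignments (only viable for small n)."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def gamma_predictor(n_vars, n_clauses, threshold=4.3):
    """Predict 'satisfiable' iff gamma = m/n is below the sharp threshold."""
    return (n_clauses / n_vars) < threshold

if __name__ == "__main__":
    rng = random.Random(0)
    n_vars, correct, total = 10, 0, 0
    for n_clauses in range(10, 75, 5):        # sweep gamma from 1.0 to 7.0
        for _ in range(10):
            clauses = random_3sat(n_vars, n_clauses, rng)
            truth = brute_force_sat(n_vars, clauses)
            guess = gamma_predictor(n_vars, n_clauses)
            correct += (truth == guess)
            total += 1
    print(f"gamma-threshold accuracy: {correct / total:.2%}")
    # High accuracy here does not mean the predictor "solves" SAT: it never
    # produces a satisfying assignment, and it fails precisely near gamma ~ 4.3.
```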
Thank you to @BloorStreetCap for yesterday's insightful #uranium discussion with some of the pioneers of the industry. A fantastic line-up with in-depth insight into all aspects of the #uranium sector.

Some of our highlights can be found in the thread below:
1/ “The world is recognising the science of nuclear power, not the false ideology.” – #NexGen CEO

“Price of #equities is screaming buy relative to the #spot price” – @uraniuminsider

“We’re on the cusp of a really really big move here [spot price]” – @UraniumEnergy
2/ #Sprott are seeing growing #institutional and #FO interest in the #uranium space. Very early stages of broad adoption by generalist #investors. Ciampaglia believes we are ‘just starting the second inning of this uranium cycle’.
#Highlights2021 for me: our #survey on efficient processing of #sparse and compressed tensors of #ML/#DNN models on #hardware accelerators published in @ProceedingsIEEE.
Paper: dx.doi.org/10.1109/JPROC.…
arXiv: arxiv.org/abs/2007.00864
RT/sharing appreciated. 🧵
Context: tensors of ML/DNN models are compressed by leveraging #sparsity, #quantization, and shape reduction. We summarize several such sources of sparsity & compression (§3). Sparsity can be induced in a structured way during pruning, while sparsity that arises inherently from various applications or sources is unstructured.
Likewise, leveraging value similarity or approximate operations can yield irregularity in processing. Also, size-reduction techniques make tensors asymmetrically shaped. Hence, special mechanisms may be required for efficient processing of sparse and irregular computations.
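To make the sparsity-and-compression idea concrete, here is a minimal sketch (not taken from the paper): unstructured magnitude pruning of a small weight matrix and a simple bitmap-style compressed encoding of the result. The function names, the 50% pruning ratio, and the byte accounting (1 bit per element for a packed bitmap) are illustrative assumptions only.

```python
# Hypothetical sketch: unstructured pruning + bitmap encoding of a weight matrix.
import numpy as np

def prune_unstructured(weights, sparsity=0.5):
    """Zero out the smallest-magnitude weights (unstructured sparsity)."""
    k = int(weights.size * sparsity)
    threshold = np.sort(np.abs(weights), axis=None)[k]
    pruned = weights.copy()
    pruned[np.abs(weights) < threshold] = 0.0
    return pruned

def encode_bitmap(weights):
    """Encode a sparse tensor as (bitmap of nonzeros, packed nonzero values)."""
    bitmap = weights != 0.0
    values = weights[bitmap]
    return bitmap, values

def decode_bitmap(bitmap, values):
    """Reconstruct the dense tensor from the bitmap encoding."""
    dense = np.zeros(bitmap.shape, dtype=values.dtype)
    dense[bitmap] = values
    return dense

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.standard_normal((8, 8)).astype(np.float32)
    w_sparse = prune_unstructured(w, sparsity=0.5)
    bitmap, values = encode_bitmap(w_sparse)
    assert np.allclose(decode_bitmap(bitmap, values), w_sparse)
    dense_bytes = w_sparse.nbytes
    packed_bytes = values.nbytes + bitmap.size // 8   # assumes 1 bit per element
    print(f"dense: {dense_bytes} B, bitmap-encoded: ~{packed_bytes} B")
```

The decode step shows why a hardware accelerator needs extra indexing logic: the packed values alone no longer carry positions, so the bitmap (or an equivalent index structure) must be traversed during computation.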
