I'm going to tell you something that's going to sound so obvious that you're going to wonder why I even bother. The idea is basically this: if something is ordered, it takes less information to describe it. OK, so for example.
OK, so it's obvious to everybody here that the Pollock is more disordered than the Mondrian, right? No question. Now, I can't quantify for you which one you like better, but I can quantify for you which one is more ordered. And the way you do this, naturally, like you do everything nowadays, is to get the answer from your cell phone. But it's not the usual way you'd solve this problem with your cell phone: instead of Googling it or something like that, you simply take pictures of the images with the camera on your cell phone. Then you save them, and depending on the cell phone you have, you can save them without compression. That's the bitmap over here, which is just the number of bits in the picture, and here is an all-white image, because there's some overhead in these programs. Here's a lossy compression program, which is no good; you want something lossless, which means that when you recover the image you get the entire image back: it hasn't smoothed things out, it hasn't lost information. Then you just look at the size of the files. PNG is a lossless compression algorithm. I subtract from the compressed Pollock and the compressed Mondrian the compressed all-white image, and what I find is: the Mondrian is three kilobytes, the Pollock is a megabyte. So I can now quantify for you that this one is more ordered than that one, simply from the information that's needed in order to store it.
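As a sanity check of this recipe (my own sketch, not the speaker's actual phone measurement; the byte arrays standing in for the Pollock, the Mondrian, and the blank reference are invented), you can do the same subtraction with any lossless compressor, for example zlib:

```python
import os
import zlib

SIZE = 256 * 256  # a small stand-in "image": one byte per pixel

blank = bytes(SIZE)                       # all-white reference: pure overhead
mondrian = (bytes(64) + bytes([200]) * 64) * (SIZE // 128)  # big uniform blocks
pollock = os.urandom(SIZE)                # disordered, incompressible splatter

def excess(img: bytes) -> int:
    """Compressed size minus the compressor's overhead on a blank image."""
    overhead = len(zlib.compress(blank, 9))
    return len(zlib.compress(img, 9)) - overhead

# The more ordered image needs fewer bits to describe.
assert excess(mondrian) < excess(pollock)
```

On real photographs the same comparison works with PNG file sizes, as in the talk; the zlib call is just the most portable lossless stand-in.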
OK, so the basic idea, then, is the following: the more organized, the more ordered something is, the less information you need to describe it. People have looked at this problem before, how much information you need, in particular, to encode a sequence, to send a transmission. Shannon worked on that, and you guys know about the Shannon entropy. Let me remind you that there's a much more clever thing that was done (although what Shannon did is pretty clever) by Kolmogorov. This is the same Kolmogorov as the Kolmogorov of turbulence; he's also the guy who wrote the mathematics textbooks that make the Russians much better than the rest of us in the West at math. He said the real measure of the entropy of information should be the size of the smallest computer code you can write to regenerate that information exactly. Now, the problem with this is the word "smallest": it's hard to get the smallest. On the other hand, this leads to the idea that you can write a computer program which encodes your information, and it's always going to be slightly larger than this, or maybe much larger than this. That is, the length of a losslessly compressed data set is an approximation, if you like, to the Kolmogorov complexity. And the point is: if this compressed data set is like the Kolmogorov complexity, which is like the Shannon entropy, then you have a measure of the entropy of your system simply from the length of the compressed file, and that's easy to get. I should, actually, is there a clock around, so I can keep track of time? There isn't, is there?
OK, so I probably won't run way over, but you've got to stop me anyway. Every single file in your computer: you right-click on it, and it gives you a list of options for what to do with it, and one of the options is compress. What I'm telling you is, if you have a data file, say from an experiment, you can take that data file, push compress, and then compare two files and find out which is more ordered, simply by doing that.
That won't give you something completely quantitative, but it will help. So now the question is: how good are these lossless compressors, and why might you expect something like this to work? The answer is, our entire society is built on compression. Compression is used for communication, computing, cell phones, streaming, data storage; everything depends on it. And the people who write these lossless compression programs know what they're doing. You have to have lossless fidelity doing stock trades, right? You don't want to lose information, and money, in a stock trade; there's lots of money resting on this. They know what the limit is: they know the limit of compression is the Shannon entropy, so they want to get as close to it as they can. Here is what the information industry was worth, at least in 2012 (it's worth much more now), something like three, by now maybe five, trillion dollars, and a lot of that is compression. So what I'm going to tell you is work that Dov Levine and Stefano Martiniani are doing.
This is a collaboration with them on this problem. And now you're probably saying to yourself: OK, OK, this is going to give me something rough, I can compare and say this is bigger than that. The answer is no: you can get quantitative information, and you can look at systems and really find out what's going on with them. Here, for example, is a model system, a dynamical system: a sandpile model, the Manna model. The Manna model is a model where you have a grid and you throw particles down on it; if there's more than one per site, that site is active, and you take the particles on an active site, empty that site onto neighboring sites, and keep on doing this until all of your sites are singly occupied, like that.
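A minimal sketch of the Manna rule as just described (my toy implementation; the lattice size, particle number, and random seed are arbitrary choices): sites holding two or more particles are active and dump their particles onto random neighbors until no doubly occupied site remains.

```python
import random

def manna(L, N, max_sweeps=100_000, seed=0):
    """Run the 2D Manna model on an L x L periodic lattice with N particles.

    Returns (occupancy, sweeps), where sweeps is how long it took to reach
    an absorbing state (every site singly occupied or empty)."""
    rng = random.Random(seed)
    occ = [[0] * L for _ in range(L)]
    for _ in range(N):
        occ[rng.randrange(L)][rng.randrange(L)] += 1
    for sweep in range(max_sweeps):
        active = [(i, j) for i in range(L) for j in range(L) if occ[i][j] >= 2]
        if not active:
            return occ, sweep          # absorbing state found
        for i, j in active:            # topple every active site
            grains, occ[i][j] = occ[i][j], 0
            for _ in range(grains):    # each grain goes to a random neighbor
                di, dj = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
                occ[(i + di) % L][(j + dj) % L] += 1
    return occ, max_sweeps             # still active: above the transition

occ, sweeps = manna(L=16, N=64)        # low density: finds an absorbing state
assert sum(map(sum, occ)) == 64        # particle number is conserved
assert max(max(row) for row in occ) <= 1
```

At low density this halts quickly; push the density up and the while-loop runs out its budget, which is the active phase the talk describes.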
Now, this shows you some of the dynamics of that system. What you do is throw the particles down randomly, with different densities over here; you compress the file, and you plot the length of the file normalized per particle. Then you let it evolve with time, looking for a state with no double occupancy, and as a function of time it gives you this curve. After maybe ten to the six iterations or something, over here it finds absorbing states, that is, states that have solved the problem. And over here it finds only active states: at high enough density it can never find a state where everything is singly occupied, even though such states exist. And this cusp over here tells you there's a second-order phase transition, a dynamical phase transition in this case. But this shows you that it's not just something qualitative: you can get quantitative information out of this and plot it. I should mention that also working on this problem, in a different aspect, but using compression to effectively get the entropy of their system, is a group at Tel Aviv; one of the guys was a postdoc in my group, and he's somewhere over there.
Luckily they're working on protein folding and on active systems, so we won't get in each other's way. OK: Shannon defined entropy probabilistically. You guys know this, you will have seen this before; it's the first thing everybody tells you when you do entropy: you can write the entropy as minus the sum of p log p. That's essentially what Shannon showed, and he also showed, from the source coding theorem, that you can't transmit information losslessly at a rate lower than the Shannon entropy. The only problem with this is that it's only meaningful for an infinite ensemble, and for a stationary process, that is, things that don't vary with time. Here's Kolmogorov. Kolmogorov said: instead of a probabilistic definition, let's use an algorithmic definition, the length of the shortest computer code, on a universal computer, that yields the sequence; that specifically yields that sequence.
The problem with this is that there's no way to compute it, because it is the smallest, and there's no way of ever proving that you have the smallest code for writing the state. I should also mention, since this always drives people nuts: it's Kolmogorov-Chaitin. This guy over here, Chaitin (no relation to me), also came up with what's called Kolmogorov complexity, with the idea of the smallest code. He was at Bronx High School of Science when I was at Stuyvesant, crosstown; we were rivals and stuff, and everyone knew about this guy: he figured out this complexity when he was sixteen years old and wrote the papers while in high school. So it's known as Kolmogorov-Chaitin complexity. And these guys, Lempel and Ziv, are the ones who essentially wrote down the now most-used
compression algorithms. And the interesting thing about what they did is that it's computable for any sequence: give it any sequence, any length you want; it doesn't have to be stationary, it doesn't have to be an infinite ensemble, anything like that. And it's computable in a time that scales with the length of the sequence. Let me give you some idea of what compression is about. The first compression, which nobody knows anymore, but I learned it as a Cub Scout or
something like that, is Morse code. You encode your letters, you encode whatever you want to send, so you use a short symbol for the things that occur most frequently. That's variable-length coding: an E is just one dot, a T is one dash, and things that are used very little, like a Z, are dash-dash-dot-dot; they're longer. That has now become what's known as Huffman coding: you make a table from your sequences (you can take subsequences as long as you want), and the ones that are used most frequently get coded with the smallest number of bits.
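Here's the table-building step in miniature (a standard textbook Huffman construction, not anything specific from the talk; the letter frequencies are invented):

```python
import heapq

def huffman_codes(freq):
    """Build a prefix-free binary code: frequent symbols get short codewords."""
    heap = [[weight, [sym, ""]] for sym, weight in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]    # left branch of the merged node
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]    # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return {sym: code for sym, code in heap[0][1:]}

codes = huffman_codes({"e": 127, "t": 91, "a": 82, "z": 1, "q": 1})
assert len(codes["e"]) < len(codes["z"])   # common letter, shorter code
# prefix-free: no codeword is a prefix of another, just like Morse timing gaps
assert all(not b.startswith(a) for a in codes.values()
           for b in codes.values() if a != b)
```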
This is kind of interesting: this is a Playmate, it turns out, and it shows you how misogynistic the computer people were at the start of the field of computer science. What they did is they said, OK, we're going to try encoding this; this is going to be our reference for how well we can code things. This is part of the picture; the rest of it is a Playboy centerfold, so I'm not showing you the rest. But this is what they used, 512 pixels by 512 pixels, known as Lena. One of the ways that you do compression is to use predictive encoding, where you say: I'm going to look at the last couple of pixels and predict what the next pixel will be. In a picture
like this, even though it looks fairly complicated, most of it, or a lot of it, is just saying the next pixel is the previous pixel. So instead of having to use 256 different numbers, essentially all of the numbers in the picture are in a range of around twenty around the value of the pixel before.
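The gain from this predictive step is easy to demonstrate (a toy, not the actual PNG filter machinery; the synthetic scan line is a seeded random walk standing in for smoothly varying pixels):

```python
import random
import zlib

rng = random.Random(42)
# A smooth signal: each "pixel" stays within a few gray levels of the last.
pixels = [128]
for _ in range(9999):
    pixels.append((pixels[-1] + rng.randint(-3, 3)) % 256)

raw = bytes(pixels)
# Predictive encoding: store each pixel as its difference from the previous one.
deltas = bytes((pixels[i] - pixels[i - 1]) % 256 for i in range(1, len(pixels)))

# The residuals use only a handful of values near zero, so they compress
# far better than the raw pixel values do.
assert len(zlib.compress(deltas, 9)) < len(zlib.compress(raw, 9))
```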
So that's one kind of encoding. Now, these guys came up with Lempel-Ziv. There are several forms of Lempel-Ziv; they came up with several methods of doing this, and I'll show you one of them here. I'll skip most of it and just show you the way it's done. The way it's done is that you look back at a buffer, a dictionary that you've built up before, and a look-ahead buffer, and what you do is you say, for instance:
is you say for instance this is my.
This is my look ahead buffer over
here there's nothing before it so
the first thing I write down as I'm
starting I'm going to write down zero
course that's where there's nothing
before and then I write that in a OK I
move the buffer up I write down
scene a before so I go back one and
I copy one and
the next literacy here I go back
my buffer is now over here I look
forward I see a C N A I look back and
I say go back three from
here go back three copy for
copy for you do cyclically So
you actually copy a A C.
A and that gives you that and
then you put the next letter which is B.
and now you are starting to see
that as things read reoccur and
particularly if they or
something like periodic For
instance if it was now a C A A C A E C
a million times you'd write go back three.
Copy three million right and that's
where your compression is going to coat
come from and then you continue right and
you go back and you find out where your
sequences occurred before he just tell
where how far back to go read it and
then put in how one how much you're
in a copy and the next letter and for
this there are fifteen letters here and
there fifteen symbols here so
this is not going to work well for
really short sequences but for
a long we sequences it can really
pay off OK this is used in
the Flavian ceilidh usually use it
in just zip or G.'s IP or in P.N.
she OK When you say photos so
these guys are for photos P.
and G.
and ship.
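The back-reference step just described, including the cyclic "go back three, copy four" trick where the copy length exceeds the offset, decodes like this (my sketch of a textbook LZ77 token stream, not Lempel and Ziv's original code):

```python
def lz77_decode(tokens):
    """Decode (offset, length, next_literal) triples.

    Copying proceeds one symbol at a time, so a length longer than the
    offset cyclically repeats the recent output, as in the talk's example."""
    out = []
    for offset, length, literal in tokens:
        for _ in range(length):
            out.append(out[-offset])   # may re-read symbols written this token
        if literal is not None:
            out.append(literal)
    return "".join(out)

# The worked example: literal 'a'; back 1 copy 1 then 'c'; back 3 copy 4 then 'b'.
assert lz77_decode([(0, 0, "a"), (1, 1, "c"), (3, 4, "b")]) == "aacaacab"
# A periodic tail compresses into a single long back-reference.
assert lz77_decode([(0, 0, "a"), (1, 1, "c"), (3, 9, None)]) == "aacaacaacaac"
```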
Now, a lot of work has been done on this: people know how the Lempel-Ziv compression approaches the Shannon entropy. It turns out it depends on whether your sequence has finite entropy or zero entropy. Zero entropy, for instance, would be a periodic sequence. For zero entropy it converges not so slowly: for a periodic sequence it goes like log N over N. But for a random sequence it converges very slowly, like log log N over log N, and it turns out you don't really know the coefficient in front of this log log N. On the other hand, you can check with a quasiperiodic sequence. This is the length of the compressed code per symbol, in the limit: periodic sequences go as log N over N, quasiperiodic as (log N) squared over N,
and random like that. And you should realize, by the way, that sometimes the Kolmogorov complexity is much smaller than the Shannon entropy. For instance, pi. The digits of pi are effectively a random number; there's no analysis anybody's ever done that tells you they're anything other than random. On the other hand, you can write a short computer code which will generate pi for you to as many digits as you want, so its Kolmogorov complexity is small.
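Kolmogorov's point about pi is easy to check: the program below is a few lines long yet emits as many digits as you like, so the Kolmogorov complexity of the digit string is tiny even though the digits look statistically random. (This uses Machin's arctangent formula with integer arithmetic, one standard choice among many.)

```python
def arccot(x, unity):
    """Integer-arithmetic arctan(1/x) * unity via the alternating Taylor series."""
    total = power = unity // x
    n, sign = 3, -1
    while power:
        power //= x * x
        total += sign * (power // n)
        n, sign = n + 2, -sign
    return total

def pi_digits(n):
    """First n decimal digits of pi via Machin: pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    unity = 10 ** (n + 10)               # guard digits against truncation error
    pi = 4 * (4 * arccot(5, unity) - arccot(239, unity))
    return str(pi)[:n]

assert pi_digits(10) == "3141592653"
```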
For a case we've looked at, this shows what the extrapolation looks like for a random sequence, which I'll explain in a minute. We just take a set of boxes, randomly throw particles into them, and calculate what the compression is; and for that problem we know exactly what the entropy is. Here's a plot of how large a number of boxes, or particles, you have to have in order to get into the region where you can extrapolate. It goes like log log N, and you can extrapolate reasonably well, to within a couple of percent of the actual answer. What it looks like is this: if you stop with maybe ten to the fourth, ten to the fifth particles, something like that, you get this value for the entropy; if you like, you can extrapolate using this log log business to find out what you get at infinity. Here's what you extrapolate to, and this is the exact value, so this isn't so bad if you do an extrapolation. You could also do this another way, which for instance the Tel Aviv group did, and which also works really well: simply bound it and normalize it. That gives you a very good answer as well.
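Here is the boxes-and-particles check in miniature (my own sketch: I use fair coin flips, whose exact Shannon entropy is 1 bit per symbol, and watch the compressed length per symbol creep down toward it as N grows; the slow convergence is exactly why the extrapolation, or the bounding-and-normalizing, is needed):

```python
import random
import zlib

def bits_per_symbol(n, seed=0):
    """Compressed size, in bits per input bit, of n fair coin flips."""
    rng = random.Random(seed)
    bits = [rng.randint(0, 1) for _ in range(n)]
    packed = bytes(
        sum(b << k for k, b in enumerate(bits[i:i + 8]))
        for i in range(0, n, 8)
    )
    return 8 * len(zlib.compress(packed, 9)) / n

estimates = [bits_per_symbol(n) for n in (10**3, 10**4, 10**5)]
# The compressor overhead amortizes away: the estimate decreases with N...
assert estimates[0] > estimates[1] > estimates[2]
# ...and by 10^5 samples it is within a few percent of the exact 1 bit/symbol.
assert abs(estimates[-1] - 1.0) < 0.03
```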
Here's what happens with the natural thing to do, the 2D Ising model. You take the 2D Ising model; if you do the extrapolation using the coefficient you'd expect for a completely random sequence, you get this curve, which is off by a couple of percent. If instead you fit one point here, to find a different value for the coefficient, you get this. The black there is the exact 2D Ising entropy, and this is what you get by fitting, so it works pretty well. The other thing we can show is that when you do the extrapolation, if one extrapolated value is higher than another extrapolated value, then the actual entropies are ordered the same way; it's monotonic, you don't get inversions.
Suppose you want to do not one of these one-dimensional sequences but two-dimensional ones, like over here. You have to figure out how to scan. You can do a raster scan like that, or a serpentine scan like that; it turns out the best scan people have found is the Hilbert scan, which is sort of a fractalization of this problem. In a local region you're more likely to have only a short distance to go to find the similar thing, rather than with a raster scan.
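A compact way to generate the Hilbert scan order (the standard bit-manipulation conversion from distance-along-curve to grid coordinates; this is the usual public-domain algorithm, not code from the talk):

```python
def hilbert_point(order, d):
    """Map distance d along a Hilbert curve to (x, y) on a 2**order grid."""
    x = y = 0
    s, t = 1, d
    side = 1 << order
    while s < side:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:                    # rotate the quadrant as needed
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

side = 8
path = [hilbert_point(3, d) for d in range(side * side)]
assert len(set(path)) == side * side   # visits every cell exactly once
# Consecutive scan positions are always adjacent pixels, unlike a raster
# scan, which jumps a whole row at the end of each line.
assert all(abs(x1 - x2) + abs(y1 - y2) == 1
           for (x1, y1), (x2, y2) in zip(path, path[1:]))
```

That adjacency is exactly why the Hilbert scan preserves local 2D correlations for the 1D compressor to exploit.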
On the other hand, when you take a picture with a camera, most of the time it's a raster scan, because that's how the sensor reads out. And it turns out you pay for this: if you buy a Canon camera, anything you buy for under four hundred dollars does not have lossless compression, and anything above four hundred dollars has their own lossless compression.
now everybody knows the G.I.
Taylor experiment right I don't have to
show this movie is that right right OK so
he showed that that.
At moguls them for if you take a fluid.
Is.
What I show it.
It's OK I will show the movie because
it's nice I also see this show
this movie you know if you haven't seen
this movie this is this is a coed cell
to concentric cylinders a very viscous
fluid in the middle of it OK and
you inject thinking they are and you want
it around and what I should tell you is.
If you haven't seen
this movie movie before
it's the only thing you'll
remember from the stock.
And it may be the only thing you
remember from this conference so
you surly in shop he's whirling around is
this appeared he's not doing it you know
this is actually G.I. Taylor's finger OK
he's going around he was in a certain And
now he's going to wind it back now
he's not at the same rate OK he's not
trying to go the same speed or whatever OK
but when the under unwinds at four turns.
It comes back.
Now that is a beautiful.
OK so.
Now you remember if you haven't seen it
before you've seen it all you have to do
is see it once you will remember it
forever is just a great movie OK now.
Then Dave Pine and Jerry Gollub, whose name should be on here but isn't, did this great experiment, a really great experiment, to see what happens if you have not just a low-Reynolds-number fluid but you put particles in it. It's still low Reynolds number, but there are particles in there, and you reverse the flow and go back. By the way, I should have said what that movie is supposed to show you: that low-Reynolds-number flow is time-reversible, in a real specific sense. The motion of the fluid is slaved to the boundary conditions, so all you have to do is reverse what's going on at the boundary, because that's the only place forces are in this case, and you completely reverse the flow; it's like playing the movie backwards. They wanted to know what happens if you put particles in, so they did the G.I. Taylor experiment, but with particles, in a Couette cell, and they just go back and forth.
And what they found is sort of remarkable. Here is what happens. This is actually a movie; this one you can tell is a movie, because things are moving. This one is also a movie, but it's strobed, so you're coming back to the same point in every cycle as you oscillate, turning it back and forth. And the question is: if G.I. Taylor was right, just like that blob comes back, the particles in this suspension should come back to the same positions. Over here they do: when you strobe it, they essentially come back, which is why this movie looks still. Whereas above some threshold, which is volume-fraction dependent, concentration dependent (this is the amplitude, not the rate, the amplitude of the motion), it's chaotic and diffusive. And below that threshold it's reversible, just like G.I. Taylor said. So here's what it looks like: there's essentially no diffusion, the particles come back every cycle, periodically, to exactly the same place, until you get to the threshold, and then there's motion; it's anisotropic in this case, but that doesn't matter. Now the question is: how the hell do you get that?
They sort of had a quasi-explanation. Their quasi-explanation was this: it's known that if you have low-Reynolds-number flow with two particles, and you shear them past one another, they go around one another, they end up on the same streamlines, and when you come back they do the same thing in reverse; that problem is completely deterministic and reversible. On the other hand, what's also known is that if you have three particles, the motion is exponentially sensitive to the positions of the particles, that is, it's chaotic. And so their explanation was that the bigger the displacement you make, the more likely you are to get three-particle interactions. That sort of sounded OK the first time I heard it, but when I thought about it I didn't like it very much, because the one thing I do know about random systems is that a random system will occasionally have two particles close together, but it will also occasionally have three or four close together, or as many as you want. So it's not a matter of how far you go, it's simply how many particles you have, and that would say you can't possibly get a threshold from this; you'd get a crossover. So I tried to convince them of this, and I sort of gave up.
Because I couldn't convince them, I said, OK, I have to convince you, so I made a simulation. And because I really don't know how to do simulations, and I don't know how to do hydrodynamics, it's the simplest simulation you've ever seen in your life. It consists of the following: you throw particles down in a box, and you put on an affine deformation to represent the shear. Under that deformation the particles may collide; they actually wouldn't, of course, if hydrodynamics were involved, but since I don't know how to do hydrodynamics, I let them collide. If they collide, I'll say something happens. I don't know what happens; it involves friction, it involves hydrodynamics. But when I bring it back, I'll take account of the fact that they collided by bringing them back to their original positions, more or less, but giving each of them a slight random displacement. So they move; that's the whole idea, and then I just repeat this. So I set this thing up.
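Stripped to its essentials, that simulation is something like this (my reconstruction of the rule as described, with the affine shear folded into a simple "came within one diameter this cycle" collision test; all parameters and the seed are invented):

```python
import math
import random

def random_organization(n=40, box=20.0, diam=1.0, kick=0.5,
                        max_cycles=5000, seed=1):
    """Each cycle, every particle that collided (came within one diameter of
    another, with periodic boundaries) gets a small random displacement;
    untouched particles never move. Returns the activity per cycle."""
    rng = random.Random(seed)
    pts = [[rng.uniform(0, box), rng.uniform(0, box)] for _ in range(n)]
    activity = []
    for _ in range(max_cycles):
        colliders = set()
        for i in range(n):
            for j in range(i + 1, n):
                dx = pts[i][0] - pts[j][0]
                dy = pts[i][1] - pts[j][1]
                dx -= box * round(dx / box)      # minimum-image convention
                dy -= box * round(dy / box)
                if dx * dx + dy * dy < diam * diam:
                    colliders |= {i, j}
        activity.append(len(colliders))
        if not colliders:                        # absorbing state: it just stops
            break
        for i in colliders:
            angle = rng.uniform(0.0, 2.0 * math.pi)
            pts[i][0] = (pts[i][0] + kick * math.cos(angle)) % box
            pts[i][1] = (pts[i][1] + kick * math.sin(angle)) % box
    return activity

activity = random_organization()
# At this low density the system finds a configuration with no collisions.
assert activity[-1] == 0
```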
And I let it go, and this is what the strobe looks like. These guys here have made contact, so I move them slightly, and the same with the red ones; the blue guys haven't contacted anything, so I don't move them: every time I strobe, I do nothing to them. It sort of looks like things over here are moving; they get advected overall after a while, and if I follow it, eventually everything here is moving, right. But you'll notice: this one is above a critical strain and this one is below. Below the critical strain, the activity is dying as a function of time; above, it stays on and never stops. Since I'm mostly an experimentalist, the first thing I did is I hit the computer, like that. And it didn't restart, so that's good. Then I ran it again, and it did the same thing (because I'm not sure how to write computer programs), and as soon as that happened I said: aha, I know what happened. Since they're colliding and moving around, they explore new configurations, and if they explore configurations, eventually they may find a configuration where none of the particles collides with the others, and then the problem just stops. It just stops, because there's no more activity. So that's kind of cool, because I didn't know systems could do that.
OK, so the neat thing about it is this. In the simulations, if we do this above threshold, what you find is an activity (the number of active particles per cycle) that starts out high and then goes to a steady state. If you're below the threshold, it has a characteristic time, and beyond that time it eventually just stops; there's no activity. And now, if you plot that characteristic time as a function of the amplitude of the strain, you find that it diverges on both sides of the transition, which means this is a second-order phase transition. It's not a thermodynamic phase transition, it's a dynamic phase transition.
So then with Dave Pine we went after that: they went back to the lab and did the experiments. The real difference between the idea that has to do with chaos and the one that has to do with this organization (which we called random organization, because we didn't know what the organization was) is whether it depends on time, or whether you get it immediately. If it develops after a time, the things all moving and then coming to rest, that's random organization; if it happens from the start, it's maybe chaos or something like that. Here's essentially what happens. This is a log-log plot of the activity for different strains, for different amplitudes of the shear, and you can see there's a time here that it takes to stop, or whatever, and you can measure that. And indeed, what you find in the experiment is a divergence on both sides of the transition. So that seems to be the answer. Now, to me this raised all sorts of questions that I just didn't understand, didn't know how to answer at all. Like: why in the world does this happen at this particular threshold strain? It's not the geometric value you'd expect; geometrically, you could always move things around until you got to some packing density, and then if you reduce it just a little you can move more, and this was nothing like the packing. And the other question is: when it's in the active phase over here, is it ergodic, is it sampling all states? What's going on? It obviously doesn't sample all states in the absorbing region.
OK, so it turns out, unbeknownst to me, there were lots of absorbing-state models that came before we did random organization. Here's a simple one, called the conserved lattice gas, and this one is in one dimension, so it's really easy to conceive of how you take the sequence and compress it: the configuration is just a string of zeros and ones, where the ones are occupied sites and the zeros are unoccupied sites. In the conserved lattice gas you throw the particles down randomly, and if a particle has an occupied neighbor, you say it's active and you move it: at each step you take an active site and move its particle to an unoccupied site. And this is going to go on until it finds a state like this, where there's no more activity. That's sort of the simplest model.
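In one dimension the conserved lattice gas is a few lines (my sketch of the rule as stated: a particle is active if it has an occupied neighbor, and an active particle hops to an adjacent empty site; the parameters and seed are arbitrary):

```python
import random

def conserved_lattice_gas(L=60, n=24, max_steps=100_000, seed=3):
    """Evolve a 1D periodic ring of 0/1 sites until no particle has an
    occupied neighbor (an absorbing state), or until max_steps moves."""
    rng = random.Random(seed)
    s = [1] * n + [0] * (L - n)
    rng.shuffle(s)
    for step in range(max_steps):
        active = [i for i in range(L)
                  if s[i] and (s[(i - 1) % L] or s[(i + 1) % L])]
        if not active:
            return s, step
        i = rng.choice(active)                   # move one active particle
        empty = [j for j in ((i - 1) % L, (i + 1) % L) if not s[j]]
        if empty:                                # blocked particles stay put
            s[i], s[rng.choice(empty)] = 0, 1
    return s, max_steps

s, steps = conserved_lattice_gas()
assert sum(s) == 24                              # particle number is conserved
# Below density 1/2 it finds a state with no adjacent pair of particles.
assert all(not (s[i] and s[(i + 1) % len(s)]) for i in range(len(s)))
```

The absorbing configurations it reaches are exactly the strings you would then feed to the compressor.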
Now you do this, and again what we're going to do is use the compression to look at it. You generate the random states, you compress them, you find the length of the file (or you do the extrapolation, either way), and what you find is a curve that looks like this. Now you let the dynamics happen, and at the end of the day what you find is this. Anything less than one half: the critical point here is a density of one half, which is unusual, because that's actually the geometrical limit here. And what you find, when you compress the absorbing states, when you look at the configurations the dynamics finds, is this blue curve over here. Then you do a Monte Carlo calculation, which you can do, in which you find all of the states with no nearest neighbors, and you compare the two. If the system were ergodic, or something like it, it would find all of the absorbing states. And it turns out, comparing these two, that the information here is less, the entropy is less, which means it hasn't found all the states, which means in some sense it must be more organized, more ordered, than if I had just thrown the particles down randomly.
No one knew this, even for this old model, the conserved lattice gas. But once you find this, you can go and look at what's happening, and sure enough, what you find (which wasn't known before we did this simple calculation) is this: when you look at the dynamics of the correlation function, the states that the dynamics finds at a given density for the conserved lattice gas (these are for different densities; this is the correlation function) have a correlation length that grows much, much larger than it does if you just exclude neighbors. So you do find much more order here. Why? It turns out that in the conserved lattice gas, when you start out with clusters, they spread, and that spreading gives you longer-range correlations. So already we've discovered something here that we didn't know, simply by doing the compression.
For an active system, here's the example I showed you before, which is a more interesting system, of course: in two dimensions, the Manna model. Again, what you do is throw down particles on the two-dimensional lattice; a site is active if it has more than one particle, you move those particles to neighboring sites, and you keep doing this until you have no double occupancy. Here's what it looks like; it doesn't look too dissimilar from the random organization model I showed you. This will evolve, it sort of rearranges itself, and eventually it will stop.
And what that looks like is the following. Here are the initial states, where you throw things down randomly. After you let it evolve for ten to the six steps, you find the black curve, which is mostly hidden over here by this red curve, which is again all of the absorbing states: you just find all the possible configurations where I don't doubly occupy a site. And now, if I blow up this region: the states found by the dynamics, measured with this compression algorithm, are lower, so again it's more organized. We don't know why that is yet, but it's more organized. Here's something else that's kind of interesting. Over here it's active; anything from here up is active. By the way, if you plot the activity, below this you only have absorbing states, and then at this critical value you start to get activity, just like in random organization, and of course these match, right on; I couldn't believe how well they match. Now, what I would have expected is that when you're in the active phase it's going to be ergodic: as you get more and more activity, it's going to sample more and more of the space. So you'd expect that this curve should converge to what your initial random states were; that would tell you that you're ergodic. And it doesn't seem to be doing that. So again, this is something we looked at and said: this is funny, nobody really expects this. Maybe it's true, but why?
Then it turns out, when we looked into it, that Stefano had been doing these simulations with a parallel update: he finds all of the active sites, and in one move he moves all of them. I wouldn't have done it that way; I'm not very good at computing, so I would have just taken one of the active sites, moved it, then looked for another one and moved that, and so on. It turns out that when you do parallel updates, what happens is a checkerboard. It's like an oscillating machine: there are two phases, out of phase with one another, and it goes back and forth between these two ordered phases. Nobody who has studied Manna models for the past ten, fifteen, twenty years, however long they've been around, has ever found anything like this before. But you just do the compression and you discover stuff you haven't seen before. And once you get rid of that, once you do random updates, this curve goes to that curve: you find out that it does go ergodic, and you've got a quantitative way of seeing that.
Something else: if you look not at the whole curve as it progresses in time, but instead take a particular density and follow it as a function of time, it either goes to an absorbing state or to an active state. You can't tell which is absorbing or active from this alone; what you can tell is that the information, the length of the data file, is going down versus time. You can look at that and evaluate what it looks like as a function of time, the same way you would look at an order parameter. So if you want to know how things scale with time, you get a divergence of the time on either side.
OK.
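That procedure can be sketched as follows. This is a stand-in, not the real dynamics: I assume a configuration whose active fraction decays as a power of time, compress snapshots with zlib, and fit the compressed size to a power law, exactly as one would fit an order parameter:

```python
import zlib
import numpy as np

def cid(config):
    """Computable-information-density proxy: compressed size / raw size."""
    raw = np.packbits(config).tobytes()
    return len(zlib.compress(raw, 9)) / len(raw)

rng = np.random.default_rng(1)
N = 1 << 16
times = np.array([1, 2, 4, 8, 16, 32, 64])
# stand-in dynamics: the active fraction decays as t^-0.5
cids = np.array([cid(rng.random(N) < 0.5 * t ** -0.5) for t in times])

# log-log fit, as you would do for an order parameter ~ 1/t^alpha
slope, _ = np.polyfit(np.log(times), np.log(cids), 1)
print(f"apparent exponent: {-slope:.2f}")
```

The fitted exponent from the compressed size tracks the decay of the activity without ever identifying the order parameter explicitly.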
OK, good. So if you want to do that, you can treat it almost like an order parameter and do the scaling: it goes as one over time to a power, and the exponent gives you the correlation time. This is a comparison of what you get if you look at the activity versus what you get if you just look at the information: this is below the transition, this is above the transition, and this is right on it. So you can get critical exponents here without even knowing what the order parameter of the system is, just by looking at the information. And you say: well, what if I choose a different algorithm for my compression? The answer is, you get a different answer, but it's not a very different answer, and it shows you essentially the same physics.
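That robustness can be checked directly. Here is a small sketch using stock zlib, bz2, and lzma (which may or may not be the compressors used in the actual analysis) on an ordered-ish versus a disordered bit string:

```python
import bz2
import lzma
import zlib
import numpy as np

def bits(p, n=1 << 15, seed=3):
    """A bit string whose 1s occur with probability p (small p = more ordered)."""
    rng = np.random.default_rng(seed)
    return np.packbits(rng.random(n) < p).tobytes()

ordered, disordered = bits(0.05), bits(0.5)
algos = {"zlib": lambda b: zlib.compress(b, 9),
         "bz2": bz2.compress,
         "lzma": lzma.compress}
sizes = {name: (len(c(ordered)), len(c(disordered))) for name, c in algos.items()}
for name, (o, d) in sizes.items():
    print(f"{name:5s} ordered: {o:5d} B   disordered: {d:5d} B")
```

The absolute sizes differ from algorithm to algorithm, but every lossless compressor ranks the ordered state below the disordered one, which is all the physics needs.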
OK, this is random organization again: you see the change here in the entropy, and you see that the random-organization states lie well below the other absorbing states, so I won't go through this in detail. Again, you can try different algorithms, and here they lie on top of one another, and again you can do the time scaling. You can look not just at absorbing systems. These are swimmers; this is a model from
Cristina Marchetti, OK, of particles which
don't interact with each other unless they get within one diameter of one another, in which case they repel harmonically with spring constant k; otherwise they swim, and their heading changes by rotational Brownian diffusion in the angular degree of freedom.
And if you look at that system as a function of density: you put in the initial configuration, which looks like this, and now you let them start swimming. What you find (it's better seen here) is that at low density, even though they're swimming, the information doesn't change; the configurations they find look the same, until you get to a certain density. When you get to that certain density, what you find is that they go along with their information unchanged, and then it changes and goes down. You look at the same plot, and that's true for every value above here. And if you look at what your images look like now, you find that over here they're essentially uniformly distributed, and over here they clump.
So what you find here is another active system. And mind you, I should say that this one is not on a lattice; the motion is off-lattice too, so you have to figure out how to digitize it, which you can do. Here you see a dynamical first-order phase transition, and you can tell by the jump rather than just a cusp. Here we change a parameter value, and you find the transition into phase separation, and you can compare that with what Cristina found, because there really isn't an order parameter for this system, by looking at other things that would identify when you get phase separation, things like that. But here you find the transition by compression, without knowing what your order parameter is at all.
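The kind of swimmer model just described can be sketched with the standard active-Brownian-particle update: self-propulsion at speed v0, harmonic repulsion with spring constant k inside one diameter, and rotational diffusion Dr of the heading. The function name and all parameter values below are illustrative assumptions, not taken from the actual study:

```python
import numpy as np

def abp_step(pos, theta, rng, v0=0.05, sigma=1.0, k=1.0, Dr=0.1,
             dt=0.1, L=20.0):
    """One Euler step for active Brownian disks in a periodic box of side L."""
    n = len(pos)
    force = np.zeros_like(pos)
    for i in range(n):                      # O(n^2) pair loop, fine for a demo
        d = pos - pos[i]                    # vectors from disk i to all others
        d -= L * np.round(d / L)            # nearest periodic image
        r = np.hypot(d[:, 0], d[:, 1])
        mask = (r > 0) & (r < sigma)        # overlapping neighbours, not self
        if mask.any():
            f = k * (sigma - r[mask])[:, None] * d[mask] / r[mask][:, None]
            force[i] -= f.sum(axis=0)       # push i away from its overlaps
    drive = v0 * np.stack([np.cos(theta), np.sin(theta)], axis=1)
    pos = (pos + dt * (drive + force)) % L
    theta = theta + np.sqrt(2 * Dr * dt) * rng.standard_normal(n)
    return pos, theta

rng = np.random.default_rng(4)
pos = rng.random((50, 2)) * 20.0            # 50 disks in a 20 x 20 box
theta = rng.random(50) * 2 * np.pi          # random initial headings
for _ in range(10):
    pos, theta = abp_step(pos, theta, rng)
```

Snapshots of `pos` are what would then be digitized and compressed; at high enough density and speed, this class of model clumps (motility-induced phase separation).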
You can do this for an experimental system; that's what we're doing now. Here are some swimmers we have; you can trace them as a function of time while they're swimming. These are light-activated, so you turn on the light and you can see the information, the compressed file size, going down. We haven't analyzed this for lots of different densities or speeds yet, but when you turn off the light you can see the information go up. So we now have to do for the experimental system what Cristina did for the model system. You can look at regular phase transitions as well, OK:
thermodynamic phase transitions, where you normally would have an order parameter, for instance the Ising model, and also an XY model, which is defect-mediated. And what happens in that case is that you go from having lots of defects to few defects as a function of the inverse temperature, or the strength of the interaction.
This transition happens very quickly. You can do the compression of that: just take a picture of the structure and compress it, and you find that you get a roll-off that looks like that. We couldn't find a good reference for what the entropy of the system is, but you can calculate it: one minus the order parameter squared is something like the entropy in this case, and that's what this red curve is; the other curves are what you get from compression. So even for a thermodynamic phase transition, where you know what the order parameter is, you can find it; and if you don't know the order parameter, you can still find the result.
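The comparison between the compression estimate and the one-minus-order-parameter-squared proxy can be illustrated with independent spins at a prescribed magnetization m; this is a stand-in for real simulated configurations, not the actual analysis:

```python
import zlib
import numpy as np

def cid_bits(spins):
    """Compressed size in bits per spin, via zlib at maximum level."""
    raw = np.packbits(spins > 0).tobytes()
    return 8 * len(zlib.compress(raw, 9)) / len(spins)

rng = np.random.default_rng(5)
N = 1 << 15
cids = {}
for m in (0.9, 0.5, 0.0):
    p_up = (1 + m) / 2            # independent spins with mean magnetization m
    spins = np.where(rng.random(N) < p_up, 1, -1)
    cids[m] = cid_bits(spins)
    print(f"m={m:.1f}   1-m^2={1 - m * m:.2f}   CID={cids[m]:.2f} bits/spin")
```

Both quantities grow as the magnetization falls, which is the sense in which the compressed size tracks "something like the entropy" across the transition.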
Dave Weitz, who's left, told us that if we thought this was important, we should name it. We could call it computable information density, but he said you should pick a good name for it. So: it turns out Prince changed his name, he became a symbol, and he became The Artist Formerly Known As Prince. We thought that sounded like a really good idea, so what we'll do is take a symbol and call this The Quantity Formerly Known As Entropy. OK, anyway, I'm done, so
I just want to tell you that I think this is a new way of looking at things. If you've got a system where you want to know whether it's organizing, you just study this information density. Depending on how much information you want out of it, it gets harder, but at lowest order it's easy, and it's a good way of finding out whether your system is ordered or not, even if you don't know what the ordering is. What we're going to do next is look at glasses and correlation lengths. We'll look at the Sloan Sky Survey, which has redshift information, so we can do the entropy of the universe as a function of time. Maybe we'll do some memory as well, I hope. Something like that. Thanks.