30.10.06

The New Science of Networks Review


Taken from here. Please read it; it's very well written and easily understood. I'm sorry about the cheesy pic, by the way, but it was the only one I could easily take from the website. :(

I will not write this with full references (mostly because I hate the touchpad on this laptop and it would take ages to keep swapping back and forth for names), but everything I comment on (unless specified) comes from the work at the above link.

-----

The initial idea that holds this together is the theory that everyone is, in some way, connected to everyone else in the world. In effect we are each 'nodes', and we are 'linked' to each other across social networks. The case used on the website notes that this happens in varying degrees of separation: the most connected people may be only three links away from someone on the other side of the world, while others have little or no link to anyone else.

It seems to me that this is beyond obvious, that such things are bound to happen, but I guess it needs to be qualified by scientific/sociological study so that results can be built upon, as they are later with the connection to getting jobs. Here it is found that you are more likely to get a job through the weaker connections in your social network, the reasoning being that your best friends generally have the same interests (and contacts) as you and so are least in need of your services. An interesting twist would be to look at how many people become 'connected' as a result of a strong connection, i.e. a 'friend of a friend' needing help, where the two would otherwise have no connection to each other at all. In this respect the review underestimates the possible role of stronger acquaintances in facilitating connections to the weaker ones.

Now for the only 'difficult' part of the theory, at least in my head: the '80/20' rule. Though I guess this may simply be a matter of how I sort ideas in my head, which tends towards seeking exceptions rather than accepting generalisations. It is probably best to quote the examples given and then give my spin on things...


  • "20% of landowners own 80% of the land.
  • 20% of workers do 80% of the work.
  • 20% of salespeople make 80% of sales.
  • 20% of criminals carry out 80% of crime.
  • 20% of websites get 80% of the traffic.
  • 20% of the customers create 80% of the calls to techsupport."


Broken down into its constituent components, it seems that in any action there are more and less active components, i.e. links and nodes, and across those processes 20% of the nodes create 80% of the links. Maybe the figures are not to be taken at face value; it seems I need to look at the research in more detail if I am to understand it better. But we shall see. For those who are interested, it began as an economics observation made in the early 1900s by Pareto, an Italian economist.
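The 80/20 split can be played with directly. A minimal sketch (my own, not from the review): draw values from a Pareto distribution, whose shape parameter of roughly 1.16 is the textbook value giving the classic 80/20 outcome, and check what share of the total the top 20% hold. The function name and figures here are my illustration, not the review's.

```python
import random

def top_share(values, frac=0.2):
    """Fraction of the total held by the top `frac` of items."""
    ordered = sorted(values, reverse=True)
    k = int(len(ordered) * frac)
    return sum(ordered[:k]) / sum(ordered)

random.seed(42)
# shape ~1.16 is the Pareto parameter usually quoted for an 80/20 split
sample = [random.paretovariate(1.16) for _ in range(100_000)]
print(f"top 20% hold about {top_share(sample):.0%} of the total")
```

Because the distribution is so heavy-tailed, any single run can land well above or below 80%; the point is only that a small fraction of nodes reliably dominates the total.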

Time for another quote here; I am no mathematician, so it's probably easier if I do it this way. Here is an explanation of the degrees of stratification in the strength of the node linkages:
"Logarithmic Distribution: Instead of random distribution or bell curve distributions, the distribution of links in a network is determined by logarithmic power laws. If you remember log tables from math, log numbers increase by powers of ten. 2 is ten times larger than 1, 3 is 100 times larger than 1, and so on. This means some nodes have all the links and most nodes only have a few links.
Earthquakes are measured by log numbers: A magnitude 2.0 is ten times more severe than a magnitude 1.0, a 3.0 is 100 times stronger, and so on."
Here reference is made to Google's grading of websites for its search engine. Google uses a grading system that separates websites by magnitudes of ten as well... I guess this is simply used as a way of making the earlier and later theories quantifiable in people's heads.
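The log arithmetic in the quote above can be checked in one line: on a base-10 log scale, a difference of d in the log value corresponds to a factor of 10^d in the underlying quantity. A tiny sketch of my own:

```python
def intensity_ratio(m_a, m_b):
    """On a base-10 log scale, each step of 1 is a factor of 10."""
    return 10 ** (m_a - m_b)

# a magnitude-3 quake vs a magnitude-1 quake: two log steps, so a factor of 100
print(intensity_ratio(3.0, 1.0))  # 100.0
print(intensity_ratio(2.0, 1.0))  # 10.0
```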

Moving on to something juicier: the idea that big nodes grow faster because they need to do less to create links to other nodes, simply by virtue of having lots of nodes already linking to them. So a larger website will have more smaller websites referencing it, and the net result is a web that directs your Internet activity towards that single big site (if it is of interest to you). To the point where (assuming the content is informative and relevant) you will actively engage with, make reference to, and create a node-link to that site yourself, without any aggressive 'sociability' needed from that website-node.
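This rich-get-richer growth can be sketched as a toy simulation (my own, essentially the Barabási–Albert preferential-attachment model with one link per new node; all names here are mine): each new node links to an existing node chosen with probability proportional to its current number of links, so early and already-popular nodes snowball into hubs.

```python
import random

def preferential_attachment(n, seed=0):
    """Grow a network where each new node links to one existing node,
    chosen with probability proportional to its current degree."""
    rng = random.Random(seed)
    # endpoint list: each node appears once per link it holds, so a
    # uniform draw from this list is a degree-weighted choice
    endpoints = [0, 1]          # start with two nodes joined by one link
    degree = {0: 1, 1: 1}
    for new in range(2, n):
        target = rng.choice(endpoints)
        degree[new] = 1
        degree[target] += 1
        endpoints += [new, target]
    return degree

deg = preferential_attachment(10_000)
hubs = sorted(deg.values(), reverse=True)[:5]
print("largest hubs' link counts:", hubs)
```

Run it and the top handful of nodes hold vastly more links than the typical node, which is exactly the big-node effect described above.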

I guess this is a point of 'phase transition', even though the website means something else when it uses the term. For me, it is the tipping point where the links being created independently of any effort by the big node-website outnumber the links being degraded (e.g. through time and inactivity). What the website actually refers to is the point where the nodes all become "a single entity" and adopt standards, generalisations and norms. This happens along the 80/20 rule, in that the big node(s) will be the 20% doing 80% of the linking. And it is this acceptance of norms that allows users/smaller nodes to be lazy in their activities, making linkage easy...

An interesting topic that comes up here (in my own thoughts) is the resistance in nodes' ability to link. Connecting demands user interaction and effort, so some nodes are more easily accessed not only because they are more widely available but also because the link is more 'smoothly' created. For example, a website that requires registration puts up a barrier, a resisting force against the creation of social nodal linkages. But are all linkages desirable? The review website says yes, in that even the most minor of connections could turn out to produce a positive outcome. In actuality, though, some linkages are a negative drain on resources and time. The simple example of this is the way some websites create barriers so strong against easy access (payment, or detailed, stringent rules for forums) that it becomes desirable to join such an entity because the 'chaff' is automatically excluded from the node.

-----
This needs more work, but for now this is where I will leave it; I may come back later and finish it up.
