Today I finished reading “The Feynman Lectures on Physics Vol 13: On Fields” by Richard Feynman
Paper – Opinion formation and cyclic dominance in adaptive networks
Today I read a paper titled “Opinion formation and cyclic dominance in adaptive networks”
The abstract is:
The Rock-Paper-Scissors (RPS) game is a paradigmatic model for cyclic dominance in biological systems.
Here we consider this game in the social context of competition between opinions in a networked society.
In our model, every agent has an opinion which is drawn from the three choices: rock, paper or scissors.
In every timestep a link is selected randomly and the game is played between the nodes connected by the link.
The loser either adopts the opinion of the winner or rewires the link.
These rules define an adaptive network on which the agents’ opinions coevolve with the network topology of social contacts.
We show analytically and numerically that nonequilibrium phase transitions occur as a function of the rewiring strength.
The transitions separate four distinct phases which differ in the observed dynamics of opinions and topology.
In particular, there is one phase where the population settles to an arbitrary consensus opinion.
We present a detailed analysis of the corresponding transitions, revealing an apparently paradoxical behavior.
The system approaches consensus states when they are unstable, whereas other dynamics prevail when the consensus states are stable.
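The update rule in this abstract is simple enough to sketch in code. Below is a minimal Python toy of my reading of the model; the encoding of opinions, the rewiring probability `p_rewire`, and the uniformly random choice of rewiring target are my assumptions, not details taken from the paper:

```python
import random

def rps_beats(a, b):
    # Opinions encoded as 0 = rock, 1 = paper, 2 = scissors;
    # paper beats rock, scissors beat paper, rock beats scissors.
    return (a - b) % 3 == 1

def step(opinions, edges, p_rewire, rng):
    # Pick a random link and play the game across it.
    i, j = rng.choice(edges)
    if opinions[i] == opinions[j]:
        return  # a draw: nothing happens
    loser, winner = (j, i) if rps_beats(opinions[i], opinions[j]) else (i, j)
    if rng.random() < p_rewire:
        # The loser abandons the link and rewires to a random other node.
        edges.remove((i, j))
        new = rng.randrange(len(opinions))
        while new in (loser, winner):
            new = rng.randrange(len(opinions))
        edges.append((loser, new))
    else:
        # Otherwise the loser adopts the winner's opinion.
        opinions[loser] = opinions[winner]
```

Sweeping `p_rewire` from 0 to 1 is where the abstract’s phase transitions would show up, for instance in whether and how quickly the population reaches consensus.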
Paper – Quantum Interaction Approach in Cognition, Artificial Intelligence and Robotics
Today I read a paper titled “Quantum Interaction Approach in Cognition, Artificial Intelligence and Robotics”
The abstract is:
The mathematical formalism of quantum mechanics has been successfully employed in recent years to model situations in which the use of classical structures gives rise to problematic situations, and where typically quantum effects, such as ‘contextuality’ and ‘entanglement’, have been recognized.
This ‘Quantum Interaction Approach’ is briefly reviewed in this paper focusing, in particular, on the quantum models that have been elaborated to describe how concepts combine in cognitive science, and on the ensuing identification of a quantum structure in human thought.
We point out that these results provide interesting insights toward the development of a unified theory for meaning and knowledge formalization and representation.
Then, we analyze the technological aspects and implications of our approach, with particular attention devoted to the connections with symbolic artificial intelligence, quantum computation and robotics.
Studying – Creating poseable characters with Illustrator
This month I am studying “Creating poseable characters with Illustrator”
I seem to be on an Adobe Illustrator kick of late. Fourth class in a row dealing with interesting aspects of how to use the software.
Log: 20 hours of practice
Listening – Year Of The Black Rainbow
This week I am listening to “Year Of The Black Rainbow” by Coheed and Cambria
Paper – Using explosive percolation in analysis of real-world networks
Today I read a paper titled “Using explosive percolation in analysis of real-world networks”
The abstract is:
We apply a variant of the explosive percolation procedure to large real-world networks, and show with finite-size scaling that the universality class, ordinary or explosive, of the resulting percolation transition depends on the structural properties of the network as well as the number of unoccupied links considered for comparison in our procedure.
We observe that in our social networks, the percolation clusters close to the critical point are related to the community structure.
This relationship is further highlighted by applying the procedure to model networks with pre-defined communities.
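For context, the canonical explosive percolation procedure is the Achlioptas “product rule”: links are occupied one at a time, and at each step several candidate unoccupied links are compared, keeping the one whose endpoint clusters have the smallest size product. The paper applies a variant to real networks; this sketch only shows the basic loop, with the parameter `m` playing the role of the number of unoccupied links considered for comparison:

```python
import random

class DisjointSet:
    """Union-find with union by size and path halving."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]

def explosive_percolation(n, links, m, rng):
    """Occupy links one at a time; at each step examine up to m candidate
    unoccupied links and occupy the one whose endpoint clusters have the
    smallest size product.  Returns the largest-cluster size after each step."""
    ds = DisjointSet(n)
    unoccupied = list(links)
    giant = []
    while unoccupied:
        k = min(m, len(unoccupied))
        candidates = rng.sample(range(len(unoccupied)), k)
        best = min(candidates,
                   key=lambda idx: ds.size[ds.find(unoccupied[idx][0])] *
                                   ds.size[ds.find(unoccupied[idx][1])])
        a, b = unoccupied.pop(best)
        ds.union(a, b)
        giant.append(max(ds.size[ds.find(v)] for v in range(n)))
    return giant
```

With `m = 1` this reduces to ordinary random (Erdős–Rényi-style) link occupation; larger `m` delays the emergence of the giant cluster and sharpens the transition.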
Read – Conan #0: Born on the Battlefield
Today I finished reading “Conan #0: Born on the Battlefield” by Kurt Busiek
Read – Rework
Today I finished reading “Rework” by Jason Fried
Paper – Dimensionality Reduction and Reconstruction using Mirroring Neural Networks and Object Recognition based on Reduced Dimension Characteristic Vector
Today I read a paper titled “Dimensionality Reduction and Reconstruction using Mirroring Neural Networks and Object Recognition based on Reduced Dimension Characteristic Vector”
The abstract is:
In this paper, we present a Mirroring Neural Network architecture to perform non-linear dimensionality reduction and object recognition using a reduced low-dimensional characteristic vector.
In addition to dimensionality reduction, the network also reconstructs (mirrors) the original high-dimensional input vector from the reduced low-dimensional data.
The Mirroring Neural Network architecture has more processing elements (adalines) in its outer layers and the fewest in the central layer, forming a converging-diverging shape.
Since this network is able to reconstruct the original image from the output of the innermost layer (which contains all the information about the input pattern), these outputs can be used as object signature to classify patterns.
The network is trained to minimize the discrepancy between actual output and the input by back propagating the mean squared error from the output layer to the input layer.
After successfully training the network, it can reduce the dimension of input vectors and mirror the patterns fed to it.
The Mirroring Neural Network architecture gave very good results on various test patterns.
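What the abstract describes is, in modern terms, an autoencoder trained by backpropagating reconstruction error. Here is a pure-Python sketch with a single linear bottleneck unit; the paper’s networks are deeper and use adalines, and the hyperparameters below are arbitrary choices of mine:

```python
import random

def train_autoencoder(data, hidden, epochs=6000, lr=0.01, seed=0):
    """Train a 1-hidden-layer 'mirroring' network: n inputs -> hidden -> n
    outputs, minimizing squared reconstruction error by stochastic gradient
    descent.  Linear units keep the sketch short."""
    rng = random.Random(seed)
    n = len(data[0])
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n)] for _ in range(hidden)]
    W2 = [[rng.uniform(-0.5, 0.5) for _ in range(hidden)] for _ in range(n)]
    for _ in range(epochs):
        for x in data:
            h = [sum(W1[j][i] * x[i] for i in range(n)) for j in range(hidden)]
            y = [sum(W2[k][j] * h[j] for j in range(hidden)) for k in range(n)]
            err = [y[k] - x[k] for k in range(n)]
            # Backpropagate the error from the output layer to the input layer.
            dh = [sum(err[k] * W2[k][j] for k in range(n)) for j in range(hidden)]
            for k in range(n):
                for j in range(hidden):
                    W2[k][j] -= lr * err[k] * h[j]
            for j in range(hidden):
                for i in range(n):
                    W1[j][i] -= lr * dh[j] * x[i]
    return W1, W2

def encode(W1, x):
    # The bottleneck output: the 'object signature' used for classification.
    return [sum(w_i * x_i for w_i, x_i in zip(row, x)) for row in W1]

def decode(W2, h):
    # Mirror the signature back into the original high-dimensional space.
    return [sum(w_j * h_j for w_j, h_j in zip(row, h)) for row in W2]
```

On data that truly lies in a low-dimensional subspace, even this linear toy reconstructs its inputs almost exactly from the bottleneck signature.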
Paper – L2-optimal image interpolation and its applications to medical imaging
Today I read a paper titled “L2-optimal image interpolation and its applications to medical imaging”
The abstract is:
Digital medical images are always displayed scaled to fit particular view.
Interpolation is responsible for this scaling, and if not done properly, can significantly degrade diagnostic image quality.
However, theoretically-optimal interpolation algorithms may also be the most time-consuming and impractical.
We propose a new approach, adapted to the needs of digital medical imaging, to combine high interpolation speed and superior L2-optimal image quality.
Paper – Music By Numbers
Today I read a paper titled “Music By Numbers”
The abstract is:
In this paper we present a mathematical way of defining musical modes, derive a formula for the total number of modes, and define the musicality of a mode as the total number of harmonic chords within the mode.
We also give an algorithm for the construction of a duet of melodic lines given a sequence of numbers and a mode.
We attach the .mus files of the counterpoints obtained by using the sequence of primes and several musical modes.
Listening – Black Sands
This week I am listening to “Black Sands” by Bonobo
Read – C# Game Programming: For Serious Game Creation
Today I finished reading “C# Game Programming: For Serious Game Creation” by Daniel Schuller
Paper – Doubly Robust Policy Evaluation and Learning
Today I read a paper titled “Doubly Robust Policy Evaluation and Learning”
The abstract is:
We study decision making in environments where the reward is only partially observed, but can be modeled as a function of an action and an observed context.
This setting, known as contextual bandits, encompasses a wide variety of applications including health-care policy and Internet advertising.
A central task is evaluation of a new policy given historic data consisting of contexts, actions and received rewards.
The key challenge is that the past data typically does not faithfully represent proportions of actions taken by a new policy.
Previous approaches rely either on models of rewards or models of the past policy.
The former are plagued by a large bias whereas the latter have a large variance.
In this work, we leverage the strength and overcome the weaknesses of the two approaches by applying the doubly robust technique to the problems of policy evaluation and optimization.
We prove that this approach yields accurate value estimates when we have either a good (but not necessarily consistent) model of rewards or a good (but not necessarily consistent) model of past policy.
Extensive empirical comparison demonstrates that the doubly robust approach uniformly improves over existing techniques, achieving both lower variance in value estimation and better policies.
As such, we expect the doubly robust approach to become common practice.
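The doubly robust estimator itself is compact: score every logged context with the reward model, then correct with an importance-weighted residual wherever the logged action agrees with the new policy. A sketch, assuming a deterministic target policy and estimated behavior propensities (the function names are mine):

```python
def doubly_robust_value(logs, target_policy, reward_model, behavior_prob):
    """Doubly robust off-policy value estimate.
    logs: list of (context, action, reward) from the logging policy.
    target_policy(x)   -> action the new policy would take in context x.
    reward_model(x, a) -> estimated reward (may be biased).
    behavior_prob(x, a)-> estimated probability the logging policy took a in x.
    """
    total = 0.0
    for x, a, r in logs:
        pi_a = target_policy(x)
        dr = reward_model(x, pi_a)  # direct-method term
        if a == pi_a:
            # Importance-weighted correction using the actually observed reward.
            dr += (r - reward_model(x, a)) / behavior_prob(x, a)
        total += dr
    return total / len(logs)
```

On a balanced toy log the estimate comes out exact even when the reward model is deliberately biased, which is precisely the robustness the abstract claims: either a good reward model or a good past-policy model suffices.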
Listening – I, Vigilante
This week I am listening to “I, Vigilante” by Crippled Black Phoenix
Read – Ragtime
Today I finished reading “Ragtime” by E.L. Doctorow
Read – Conan: The Spear and Other Stories
Today I finished reading “Conan: The Spear and Other Stories” by Timothy Truman
Paper – Topological Quantum Error Correction with Optimal Encoding Rate
Today I read a paper titled “Topological Quantum Error Correction with Optimal Encoding Rate”
The abstract is:
We prove the existence of topological quantum error correcting codes with encoding rates $k/n$ asymptotically approaching the maximum possible value.
Explicit constructions of these topological codes are presented using surfaces of arbitrary genus.
We find a class of regular toric codes that are optimal.
For physical implementations, we present planar topological codes.
Listening – The Wild Hunt
This week I am listening to “The Wild Hunt” by The Tallest Man On Earth
Paper – Improving Human-Computer Interaction by Developing Culture-sensitive Applications based on Common Sense Knowledge
Today I read a paper titled “Improving Human-Computer Interaction by Developing Culture-sensitive Applications based on Common Sense Knowledge”
The abstract is:
The advent of Web 3.0, calling for personalization in interactive systems (Lassila & Hendler, 2007), and the need for systems capable of interacting in a more natural way in a future society flooded with computer systems and devices (Harper et al., 2008) show that great advances in HCI are needed.
This chapter presents some contributions of LIA to the future of HCI, arguing that the use of common sense knowledge is one way to improve HCI, especially because people assign meaning to their messages based on their common sense; therefore, using this knowledge in developing user interfaces can make them more intuitive to the end-user.
Moreover, as common sense knowledge varies from group to group, it can be used to develop applications capable of giving different feedback to different target groups, as the applications presented throughout this chapter illustrate, thereby allowing interface personalization that takes cultural issues into account.
For the purpose of using common sense knowledge in the development and design of computer systems, it is necessary to provide an architecture that allows it.
This chapter presents LIA’s approaches to common sense knowledge acquisition, representation and use, as well as to natural language processing, offering a starting point to those who intend to enter this challenging field.
Read – The Annotated Turing
Today I finished reading “The Annotated Turing: A Guided Tour Through Alan Turing’s Historic Paper on Computability and the Turing Machine” by Charles Petzold
Listening – Lisbon
This week I am listening to “Lisbon” by The Walkmen
Read – Agile Coaching
Today I finished reading “Agile Coaching” by Rachel Davies
Read – Perfect Phrases for Presenting Business Strategies
Today I finished reading “Perfect Phrases for Presenting Business Strategies” by Don Debelak
Paper – User Experience, Software Interfaces, and The Unconscious
Today I read a paper titled “User Experience, Software Interfaces, and The Unconscious”
The abstract is:
Ideas about how to increase unconscious participation in the interaction between ‘a human’ and ‘a computer’ are developed in this paper.
Evidence of impact of the unconscious functioning is presented.
The unconscious is characterised as being a responsive, contextual, and autonomous participant of human-computer interaction.
The unconscious participation occurs independently of one’s cognitive and educational levels and, if ignored, leads to learning inefficiencies and compulsive behaviours, illustrations of which are provided.
Three practical approaches to the study of subjective user experience are outlined: (a) tracing operant conditioning effects of software, (b) registering signs of brain activity whose psychological or information-processing meaning is well explored, and (c) exploring submodality interfaces.
Implications for improvement of current usability study methods, such as eye-tracking, are generally considered.
The conclusions consider advantages and disadvantages of unconscious-embracing design and warn of a loss of human evolutionary choices if unconscious participation is ignored, complicated or blocked in interaction with computer interfaces and the built environment.
Paper – Simulated annealing for weighted polygon packing
Today I read a paper titled “Simulated annealing for weighted polygon packing”
The abstract is:
In this paper we present a new algorithm for a layout optimization problem: this concerns the placement of weighted polygons inside a circular container, the two objectives being to minimize imbalance of mass and to minimize the radius of the container.
This problem carries real practical significance in industrial applications (such as the design of satellites), as well as being of significant theoretical interest.
Previous work has dealt with circular or rectangular objects, but here we deal with the more realistic case where objects may be represented as polygons and the polygons are allowed to rotate.
We present a solution based on simulated annealing and first test it on instances with known optima.
Our results show that the algorithm obtains container radii that are close to optimal.
We also compare our method with existing algorithms for the (special) rectangular case.
Experimental results show that our approach out-performs these methods in terms of solution quality.
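The full polygon problem is involved, but the simulated-annealing core applied to the first objective (minimizing imbalance of mass) can be sketched by annealing only the angular positions of point masses. Overlap constraints, rotation of polygons and the container radius are all ignored in this toy; the move size, schedule and parameter names are my choices:

```python
import math
import random

def imbalance(angles, weights, radii):
    """Distance of the centre of mass from the container centre."""
    x = sum(w * r * math.cos(a) for a, w, r in zip(angles, weights, radii))
    y = sum(w * r * math.sin(a) for a, w, r in zip(angles, weights, radii))
    return math.hypot(x, y) / sum(weights)

def anneal_balance(weights, radii, steps=20000, t0=1.0, cooling=0.9995, seed=0):
    """Simulated annealing over angular positions of weighted objects,
    minimizing imbalance of mass.  Returns the best imbalance found."""
    rng = random.Random(seed)
    angles = [rng.uniform(0, 2 * math.pi) for _ in weights]
    best = cur = imbalance(angles, weights, radii)
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(angles))
        old = angles[i]
        angles[i] = old + rng.gauss(0, 0.5)  # perturb one object's angle
        new = imbalance(angles, weights, radii)
        # Accept improvements always, uphill moves with Boltzmann probability.
        if new < cur or rng.random() < math.exp((cur - new) / t):
            cur = new
            best = min(best, cur)
        else:
            angles[i] = old  # reject: restore the previous position
        t *= cooling
    return best
```

For three equal weights at equal radii the optimum is an equilateral arrangement with zero imbalance, and the annealer gets very close to it.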
Paper – Cybermatter
Today I read a paper titled “Cybermatter”
The abstract is:
In this paper we examine several aspects of the impact of Cyberworld onto our Reality conceptions, and their social implications.
Studying – Cartography illustrations
This month I am studying “Cartography illustrations”
Had a lot of fun creating a bunch of maps in Illustrator last month so I decided to actually take a class in cartography for this month.
This is an in-person class of about 8 hours split over two days.
Update: Logged 11 hours total between in-person class time and extra practice.
Listening – Man On The Moon II: The Legend Of Mr. Rager
This week I am listening to “Man On The Moon II: The Legend Of Mr. Rager” by Kid Cudi
Read – Foundations on Natural and Artificial Computation
Today I finished reading “Foundations on Natural and Artificial Computation: 4th International Work-Conference on the Interplay Between Natural and Artificial Computation, May 30-June 3, 2011, Proceedings, Part I” by Jose Mira
Paper – Clique Graphs and Overlapping Communities
Today I read a paper titled “Clique Graphs and Overlapping Communities”
The abstract is:
It is shown how to construct a clique graph in which properties of cliques of a fixed order in a given graph are represented by vertices in a weighted graph.
Various definitions and motivations for these weights are given.
The detection of communities or clusters is used to illustrate how a clique graph may be exploited.
In particular a benchmark network is shown where clique graphs find the overlapping communities accurately while vertex partition methods fail.
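The construction is easy to prototype: enumerate the cliques of a fixed order, then connect pairs of cliques with some weight. The shared-vertex count used below is just one plausible weighting; the abstract notes that the paper considers various definitions:

```python
from itertools import combinations

def cliques_of_order(adj, k):
    """All k-cliques of a graph given as {vertex: set(neighbours)}."""
    nodes = sorted(adj)
    return [c for c in combinations(nodes, k)
            if all(v in adj[u] for u, v in combinations(c, 2))]

def clique_graph(adj, k):
    """Clique graph: one vertex per k-clique; two cliques are linked with a
    weight equal to the number of vertices they share."""
    cliques = cliques_of_order(adj, k)
    edges = {}
    for a, b in combinations(range(len(cliques)), 2):
        shared = len(set(cliques[a]) & set(cliques[b]))
        if shared:
            edges[(a, b)] = shared
    return cliques, edges
```

Two triangles sharing an edge, for example, become two clique-graph vertices joined by a weight-2 link; running a standard partitioning method on the clique graph then lets the original vertices belong to several communities at once.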
Paper – Self-organization in social tagging systems
Today I read a paper titled “Self-organization in social tagging systems”
The abstract is:
Individuals often imitate each other to fall into the typical group, leading to a self-organized state of typical behaviors in a community.
In this paper, we model self-organization in social tagging systems and illustrate the underlying interaction and dynamics.
Specifically, we introduce a model in which individuals adjust their own tagging tendency to imitate the average tagging tendency.
We found that when users are of low confidence, they tend to imitate others, leading to a self-organized state with active tagging.
On the other hand, when users are highly confident and resistant to change, tagging becomes inactive.
We observe a phase transition at a critical level of user confidence when the system changes from one regime to the other.
The distributions of post length obtained from the model are compared to real data and show good agreement.
Read – The Demolished Man
Today I finished reading “The Demolished Man” by Alfred Bester
Listening – Beach Fossils
This week I am listening to “Beach Fossils” by Beach Fossils
Paper – Navigation in non-uniform density social networks
Today I read a paper titled “Navigation in non-uniform density social networks”
The abstract is:
Recent empirical investigations suggest a universal scaling law for the spatial structure of social networks.
It is found that the probability density distribution of an individual to have a friend at distance $d$ scales as $P(d)\propto d^{-1}$.
Since population density is non-uniform in real social networks, a scale-invariant friendship network (SIFN) based on the above empirical law is introduced to capture this phenomenon.
We prove the time complexity of navigation in 2-dimensional SIFN is at most $O(\log^4 n)$.
In real-world search experiments, individuals often resort to extra information besides geographic location.
Thus, a real-world search process may be seen as a projection of navigation in a $k$-dimensional SIFN ($k>2$).
Therefore, we also discuss the relationship between high and low dimensional SIFN.
Particularly, we prove a 2-dimensional SIFN is the projection of a 3-dimensional SIFN.
As a matter of fact, this result can also be generalized to any $k$-dimensional SIFN.
Read – City of Golden Shadow
Today I finished reading “City of Golden Shadow” by Tad Williams
Paper – Agent-based Social Psychology: from Neurocognitive Processes to Social Data
Today I read a paper titled “Agent-based Social Psychology: from Neurocognitive Processes to Social Data”
The abstract is:
Moral Foundation Theory states that groups of different observers may rely on partially dissimilar sets of moral foundations, thereby reaching different moral valuations.
The use of functional imaging techniques has revealed a spectrum of cognitive styles with respect to the differential handling of novel or corroborating information that is correlated to political affiliation.
Here we characterize the collective behavior of an agent-based model whose inter-individual interactions, due to information exchange in the form of opinions, are in qualitative agreement with experimental neuroscience data.
The main conclusion connects the existence of diversity in cognitive strategies to the statistics of the sets of moral foundations, and suggests that this connection arises from interactions between agents.
Thus a simple interacting agent model, whose interactions are in accord with empirical data on conformity and learning processes, presents statistical signatures consistent with moral judgment patterns of conservatives and liberals as obtained by survey studies of social psychology.
Read – Climbing Mount Improbable
Today I finished reading “Climbing Mount Improbable” by Richard Dawkins
Paper – Incremental and Transitive Discrete Rotations
Today I read a paper titled “Incremental and Transitive Discrete Rotations”
The abstract is:
A discrete rotation algorithm can be apprehended as a parametric application $f_\alpha$ from $\ZZ[i]$ to $\ZZ[i]$, whose resulting permutation “looks like” the map induced by a Euclidean rotation.
For this kind of algorithm, to be incremental means to compute successively all the intermediate rotated copies of an image for angles in-between 0 and a destination angle.
The discretized rotation consists in the composition of a Euclidean rotation with a discretization; the aim of this article is to describe an algorithm which computes a discretized rotation incrementally.
The suggested method uses only integer arithmetic and does not compute any sine nor any cosine.
More precisely, its design relies on the analysis of the discretized rotation as a step function: the precise description of the discontinuities turns out to be the key ingredient that makes the resulting procedure optimally fast and exact.
A complete description of the incremental rotation process is provided; this result may also be useful in the specification of a consistent set of definitions for discrete geometry.
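For context, the map being studied is just “rotate, then round back to the integer grid”. The paper’s contribution is computing this incrementally with integer arithmetic only; the direct sketch below does not attempt that and simply applies the definition with floating-point sine and cosine:

```python
import math

def discretized_rotation(points, alpha):
    """Euclidean rotation by angle alpha followed by rounding each
    coordinate back to the integer grid Z[i]."""
    c, s = math.cos(alpha), math.sin(alpha)
    return [(round(x * c - y * s), round(x * s + y * c)) for x, y in points]
```

For angles that are multiples of pi/2 the discretized rotation coincides with the exact rotation; for other angles the rounding introduces the discontinuities the paper analyzes.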
Read – Guerrilla Social Media Marketing
Today I finished reading “Guerrilla Social Media Marketing: 100+ Weapons to Grow Your Online Influence, Attract Customers, and Drive Profits” by Jay Conrad Levinson
Listening – Plastic Beach
This week I am listening to “Plastic Beach” by Gorillaz
Read – Perfect Phrases for the Sales Call
Today I finished reading “Perfect Phrases for the Sales Call: Hundreds of Ready-To-Use Phrases for Persuading Customers to Buy Any Product or Service” by Jeb Brooks
Read – Agents For Games And Simulations II
Today I finished reading “Agents For Games And Simulations II: Trends In Techniques, Concepts And Design” by Frank Dignum
Listening – Écailles De Lune
This week I am listening to “Écailles De Lune” by Alcest
Paper – Modular Adaptive System Based on a Multi-Stage Neural Structure for Recognition of 2D Objects of Discontinuous Production
Today I read a paper titled “Modular Adaptive System Based on a Multi-Stage Neural Structure for Recognition of 2D Objects of Discontinuous Production”
The abstract is:
This is a presentation of a new system for invariant recognition of 2D objects with overlapping classes that cannot be effectively recognized with traditional methods.
The translation, scale and partial rotation invariant contour object description is transformed in a DCT spectrum space.
The obtained frequency spectrums are decomposed into frequency bands in order to feed different BPG neural nets (NNs).
The NNs are structured in three stages – filtering and full rotation invariance; partial recognition; general classification.
The designed multi-stage BPG Neural Structure shows very good accuracy and flexibility when tested with 2D objects used in the discontinuous production.
The speed achieved and the opportunity for easy restructuring and reprogramming make the system suitable for application in different real-time applied systems.
Adwords, VentuRocket and low-ball offers
A couple of days ago I came across a blog post about VentuRocket, an alternative way to find a job. This is a neat little startup that is hoping to change how people find jobs and how companies find good employees.
It works very much like Google AdWords: you specify how much you are willing to pay for a particular keyword that advertises your skills to the world, or at least, the world of companies looking for new people.
I thought I would give it a whirl, nothing ventured, nothing gained.
Who knows, maybe I will find some cool start-up that is willing to make me an offer I cannot refuse.
Sign-up was easy, and within minutes I’m listing out some of my choicest skills and how much I am willing to pay, AdWords-style, to get those keywords in front of someone who is hiring.
Now, I consider myself a world-class software developer (who doesn’t consider themselves that, right?). I have 30+ years of commercial product experience and over three-quarters of a billion dollars of shipped entertainment products and commercial websites.
I’m a serial entrepreneur, I’ve personally started five companies, two of which I’ve sold, the others, well, not everyone rolls a natural 20 every time.
I’ve worked for Fortune 50 companies and little two-man start-ups.
I’ve led technical teams of a hundred-plus people and managed multiple multi-million-dollar projects. I’ve been a lowly grunt programmer, project manager, director of business development, director of development, lead engineer, chief technical officer, chief executive officer, managing director and also chief bottle washer.
I’m published.
I’ve got multiple awards.
I teach at colleges.
I speak publicly.
That’s my resume and work experience in a nutshell.
Now, VentuRocket is a neat concept: you put down money matching what you think your skills are worth, and you get contacted by employers who are willing to spend the same amount of dollars to talk to you.
As they claim on the VentuRocket website, “no resumes!” Awesome! Pretty much every job I’ve ever had that I thought was worth a damn didn’t involve the usual resume submission process.
Now imagine my surprise when I get contacted a few hours later by an interested company, “Holy Hell! This thing actually works!”
A little less than $20 spent, and someone wants to talk about a job; it hasn’t even been 24 hours since I created my VentuRocket account. I even got to talk to the main decision maker at the hiring company a few hours after the initial contact.
Awesome and fast!
On VentuRocket’s FAQ they address the question of “What if I am not interested in working for the company?” with the answer “It’s great you can be that picky.”
Unfortunately, I and many other people are going to be that picky but not for the reason VentuRocket thinks.
It just cost me $7 to find out that someone is trying to hire me, and is willing to pay the glorious sum of $30/hr.
Well, see, that’s the problem right there.
I cannot afford to work for $30/hr.
I wouldn’t even consider it.
Why would I work for $30/hr?
That yearly sum won’t even cover my monthly expenditures.
I think I can state with some certainty that this company, which will remain nameless (they meant well, after all), will not be the last company to offer such a low rate of pay. Any look at Cyber Coder or Monster or Craigslist will tell you that.
The fundamental flaw is that it will cost job hunters real money to be offered jobs that are not even in their ballpark price range, because there is always someone out there willing to try a low-ball offer to see whether someone desperate enough will take it.
This is the fundamental flaw in the VentuRocket plan.
I don’t think VentuRocket can fix it with a “job must offer this minimum amount of income before I am willing to talk to the company” filter either. I have been through actual in-person interviews where they knew up front what I was seeking as hourly compensation, and the person on the other side of the desk was eager to hire, but when it came down to negotiation the offer opened at less than a quarter of what I was seeking and didn’t move up very much from there.
I honk for ducks!
Automated cars with no human controls are a complete non-starter where my future mother-in-law is concerned.
Take away the indicator, the horn and the ability to flash the high beams at other drivers and she will have nothing to do whilst I’m driving her around.
Studying – Creating maps in Illustrator
This month I am studying “Creating maps in Illustrator”
I love maps. I am looking forward to this class on how to make them in Illustrator.
This seems to be a hybrid class. It has an in-person one-on-one workshop component and then a bunch of video tutorials and exercises to work through. Should be interesting to see how the format plays out.
Update: 27 hours logged between the half-day workshop, the video tutorials, the exercises and the extra work I did.
Read – Betrayer of Worlds
Today I finished reading “Betrayer of Worlds” by Larry Niven
Paper – Relativity as a Consequence of Quantum Entanglement: A Quantum Logic Gate Space Model for the Universe
Today I read a paper titled “Relativity as a Consequence of Quantum Entanglement: A Quantum Logic Gate Space Model for the Universe”
The abstract is:
Everything in the Universe is assumed to be composed of pure reversible quantum Toffoli gates, including empty space itself.
Empty space can be configured into photon or matter gates simply by swapping logic input information with these entities through the phenomenon of quantum mechanical entanglement between photons and empty space Toffoli gates.
The essential difference between empty space, photons and matter gates are the logic input values of their respective Toffoli gates.
Empty space is characterized by an inability for the logic inputs to influence the internal logic state of its Toffoli gates since the control lines are set to logic 0.
Photons and matter are characterized by Toffoli gates where the control lines are set to logic 1 enabling their logic inputs to control their internal logic states allowing for their interaction according to the laws of physics associated with reality.
Photons swapping logic input information with empty space results in the propagation of light.
Photons facilitating the swapping of information between matter and empty space gates leads to the laws of motion including relativity.
This model enables the derivation of many physical laws from purely quantum mechanical considerations including the Heisenberg Uncertainty Principle, the Lorentz transformations of special relativity, and the relationship between relativistic energy and mass.
The model provides a possible explanation for many physical phenomena including dark matter, anti-matter, and an inflationary Universe.