This week I am listening to “Mylo Xyloto” by Coldplay
Archives for 2012
Read – Groo: The Hogs of Horder
Today I finished reading “Groo: The Hogs of Horder” by Sergio Aragones
Read – Against All Things Ending
Today I finished reading “Against All Things Ending” by Stephen R. Donaldson
Paper – Predictors of short-term decay of cell phone contacts in a large scale communication network
Today I read a paper titled “Predictors of short-term decay of cell phone contacts in a large scale communication network”
The abstract is:
Under what conditions is an edge present in a social network at time t likely to decay or persist by some future time t + Δt? Previous research addressing this issue suggests that the network range of the people involved in the edge, the extent to which the edge is embedded in a surrounding structure, and the age of the edge all play a role in edge decay.
This paper uses weighted data from a large-scale social network built from cell-phone calls in an 8-week period to determine the importance of edge weight for the decay/persistence process.
In particular, we study the relative predictive power of directed weight, embeddedness, newness, and range (measured as outdegree) with respect to edge decay and assess the effectiveness with which a simple decision tree and logistic regression classifier can accurately predict whether an edge that was active in one time period continues to be so in a future time period.
We find that directed edge weight, weighted reciprocity and time-dependent measures of edge longevity are highly predictive of whether we classify an edge as persistent or decayed, relative to the other types of factors at the dyad and neighborhood level.
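Out of curiosity, here is a minimal sketch of the kind of classifier the authors evaluate: a hand-rolled logistic regression over per-edge features (directed weight, reciprocity, edge age). The synthetic data and the generating rule are entirely my own invention for illustration; nothing here comes from the paper's cell-phone dataset.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    z = max(-30.0, min(30.0, z))  # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical synthetic edges: (directed weight, reciprocity, age in weeks)
# plus a persistence label generated by an assumed noisy linear rule.
def make_edge():
    w = random.expovariate(0.5)      # directed call weight
    r = random.random()              # weighted reciprocity in [0, 1]
    age = random.randint(1, 8)       # weeks the edge has been active
    score = 0.8 * w + 2.0 * r + 0.3 * age - 3.0
    return [w, r, age], 1 if score + random.gauss(0, 1) > 0 else 0

data = [make_edge() for _ in range(1000)]

# Minimal logistic regression trained by stochastic gradient descent.
weights, bias, lr = [0.0, 0.0, 0.0], 0.0, 0.05
for _ in range(100):
    for x, y in data:
        p = sigmoid(bias + sum(wi * xi for wi, xi in zip(weights, x)))
        g = p - y
        bias -= lr * g
        weights = [wi - lr * g * xi for wi, xi in zip(weights, x)]

accuracy = sum(
    (sigmoid(bias + sum(wi * xi for wi, xi in zip(weights, x))) > 0.5) == (y == 1)
    for x, y in data
) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

On this toy data the fitted weights end up roughly proportional to the generating rule, which is the sense in which the paper's features are "predictive" of decay versus persistence.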
Paper – The thermodynamics of human reaction times
Today I read a paper titled “The thermodynamics of human reaction times”
The abstract is:
I present a new approach for the interpretation of reaction time (RT) data from behavioral experiments.
From a physical perspective, the entropy of the RT distribution provides a model-free estimate of the amount of processing performed by the cognitive system.
In this way, the focus is shifted from the conventional interpretation of individual RTs being either long or short, into their distribution being more or less complex in terms of entropy.
The new approach enables the estimation of the cognitive processing load without reference to the informational content of the stimuli themselves, thus providing a more appropriate estimate of the cognitive impact of different sources of information that are carried by experimental stimuli or tasks.
The paper introduces the formulation of the theory, followed by an empirical validation using a database of human RTs in lexical tasks (visual lexical decision and word naming).
The results show that this new interpretation of RTs is more powerful than the traditional one.
The method provides theoretical estimates of the processing loads elicited by individual stimuli.
These loads sharply distinguish the responses from different tasks.
In addition, it provides upper-bound estimates for the speed at which the system processes information.
Finally, I argue that the theoretical proposal, and the associated empirical evidence, provide strong arguments for an adaptive system that systematically adjusts its operational processing speed to the particular demands of each stimulus.
This finding is in contradiction with Hick’s law, which posits a relatively constant processing speed within an experimental context.
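The core idea, treating the entropy of the RT distribution as a model-free measure of processing, can be illustrated with a toy sketch. The reaction-time distributions below are invented for illustration (an "easy" narrow distribution versus a "hard" broad one), not taken from the paper's lexical-task database.

```python
import math
import random

random.seed(1)

def entropy_bits(samples, bin_width=0.01):
    """Shannon entropy (in bits) of the samples binned at fixed width."""
    counts = {}
    for s in samples:
        b = int(s / bin_width)
        counts[b] = counts.get(b, 0) + 1
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical reaction-time samples (seconds): a narrow distribution for an
# easy task, a broader and more variable one for a hard task.
easy = [random.gauss(0.45, 0.03) for _ in range(5000)]
hard = [random.gauss(0.70, 0.15) for _ in range(5000)]

print(f"easy task: {entropy_bits(easy):.2f} bits")
print(f"hard task: {entropy_bits(hard):.2f} bits")
```

With fixed-width bins, the broader distribution carries more entropy, which is the shift in focus the abstract describes: from individual RTs being long or short to the distribution being more or less complex.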
Read – Dilbert and the Way of the Weasel
Today I finished reading “Dilbert and the Way of the Weasel” by Scott Adams
Paper – Rumor Evolution in Social Networks
Today I read a paper titled “Rumor Evolution in Social Networks”
The abstract is:
Social networks are a main channel of rumor spreading.
Previous studies have concentrated on static rumor spreading, in which the content of the rumor is invariable during the whole spreading process.
In reality, a rumor evolves constantly as it spreads, growing shorter, more concise, and more easily grasped and told.
In an early psychological experiment, researchers found that about 70% of the details in a rumor were lost in the first six mouth-to-mouth transmissions [TPR].
Based on these facts, we investigate rumor spreading on social networks, where the content of the rumor is modified by individuals with a certain probability.
In this scenario, individuals have two choices: to forward or to modify.
As a forwarder, an individual disseminates the rumor directly to its neighbors.
As a modifier, conversely, an individual revises the rumor before spreading it out.
When the rumor spreads on social networks such as scale-free networks and small-world networks, the majority of individuals are ultimately infected by a multiply revised version of the rumor if modifiers dominate the network.
Our observation indicates that the original rumor may lose its influence in the spreading process.
Similarly, true information may turn into a rumor as well.
Our results suggest that rumor evolution is not a negligible question, and may provide a better understanding of the generation and destruction of rumors.
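A toy simulation of the forward-or-modify mechanic makes the point vividly. The substrate graph and the modification probability below are my own guesses at a minimal version, not the authors' model: a ring with random long-range shortcuts, and each spreader revising the rumor with probability 0.7 before passing it on.

```python
import random

random.seed(2)

N = 1000
# Hypothetical small-world-ish substrate: a ring where every node also gets
# a couple of random long-range contacts.
neighbors = {i: {(i - 1) % N, (i + 1) % N} for i in range(N)}
for i in range(N):
    for _ in range(2):
        j = random.randrange(N)
        if j != i:
            neighbors[i].add(j)
            neighbors[j].add(i)

p_modify = 0.7      # probability an individual revises before spreading
version = {0: 0}    # node -> number of revisions in the copy it received
frontier = [0]
while frontier:
    nxt = []
    for u in frontier:
        # A modifier passes the same revised copy to all of its neighbors.
        v_out = version[u] + (1 if random.random() < p_modify else 0)
        for w in neighbors[u]:
            if w not in version:
                version[w] = v_out
                nxt.append(w)
    frontier = nxt

original = sum(1 for v in version.values() if v == 0) / N
print(f"fraction reached by the unmodified rumor: {original:.3f}")
```

Even in this crude sketch, only the seed's immediate vicinity ever hears the original wording; everyone else receives some multiply revised version, which is the "original rumor loses its influence" effect the abstract describes.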
Studying – Designing templates with Illustrator
This month I am studying “Designing templates with Illustrator”
Listening – Days
This week I am listening to “Days” by Real Estate
Read – The 15 Invaluable Laws of Growth
Today I finished reading “The 15 Invaluable Laws of Growth: Live Them and Reach Your Potential” by John Maxwell
Paper – Time-Dependent 2-D Vector Field Topology: An Approach Inspired by Lagrangian Coherent Structures
Today I read a paper titled “Time-Dependent 2-D Vector Field Topology: An Approach Inspired by Lagrangian Coherent Structures”
The abstract is:
This paper presents an approach to a time-dependent variant of the concept of vector field topology for 2-D vector fields.
Vector field topology is defined for steady vector fields and aims at discriminating the domain of a vector field into regions of qualitatively different behaviour.
The presented approach generalizes saddle-type critical points and their separatrices to unsteady vector fields based on generalized streak lines, with classical vector field topology as its special case for steady vector fields.
The concept is closely related to that of Lagrangian coherent structures obtained as ridges in the finite-time Lyapunov exponent field.
The proposed approach is evaluated on both synthetic 2-D time-dependent vector fields and vector fields from computational fluid dynamics.
Listening – The English Riviera
This week I am listening to “The English Riviera” by Metronomy
Read – Mining the Social Web
Today I finished reading “Mining the Social Web: Analyzing Data from Facebook, Twitter, LinkedIn, and Other Social Media Sites” by Matthew Russell
Paper – Computer Model of a “Sense of Humour”. II. Realization in Neural Networks
Today I read a paper titled “Computer Model of a “Sense of Humour”. II. Realization in Neural Networks”
The abstract is:
The computer realization of a “sense of humour” requires the creation of an algorithm for solving the “linguistic problem”, i.e., the problem of recognizing a continuous sequence of polysemantic images.
Such an algorithm may be realized in the Hopfield model of a neural network after a proper modification.
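For reference, a bare-bones Hopfield recall loop looks like this. This is the standard textbook construction (Hebbian weights, sign-threshold updates), not the authors' modified model; the stored patterns are arbitrary.

```python
# Minimal Hopfield network: store two patterns, recover one from a noisy cue.
patterns = [
    [1, 1, 1, 1, -1, -1, -1, -1],
    [1, -1, 1, -1, 1, -1, 1, -1],
]
n = len(patterns[0])

# Hebbian weight matrix with zero diagonal.
W = [[0.0] * n for _ in range(n)]
for p in patterns:
    for i in range(n):
        for j in range(n):
            if i != j:
                W[i][j] += p[i] * p[j] / n

def recall(state, sweeps=20):
    """Sequential sign-threshold updates until (in practice) a fixed point."""
    state = list(state)
    for _ in range(sweeps):
        for i in range(n):
            h = sum(W[i][j] * state[j] for j in range(n))
            state[i] = 1 if h >= 0 else -1
    return state

noisy = [1, 1, -1, 1, -1, -1, -1, -1]   # pattern 0 with one bit flipped
print(recall(noisy) == patterns[0])     # the network restores the stored pattern
```

The network acts as a content-addressable memory: a corrupted cue falls into the basin of attraction of the nearest stored pattern, which is the property the paper builds on.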
Paper – 3D Face Recognition with Sparse Spherical Representations
Today I read a paper titled “3D Face Recognition with Sparse Spherical Representations”
The abstract is:
This paper addresses the problem of 3D face recognition using simultaneous sparse approximations on the sphere.
The 3D face point clouds are first aligned with a novel and fully automated registration process.
They are then represented as signals on the 2D sphere in order to preserve depth and geometry information.
Next, we implement a dimensionality reduction process with simultaneous sparse approximations and subspace projection.
This permits each 3D face to be represented by only a few spherical functions that capture the salient facial characteristics, and hence preserves the discriminant facial information.
We eventually perform recognition by effective matching in the reduced space, where Linear Discriminant Analysis can be further applied for improved recognition performance.
The 3D face recognition algorithm is evaluated on the FRGC v.1.0 data set, where it is shown to outperform classical state-of-the-art solutions that work with depth images.
Paper – Learning Unification-Based Natural Language Grammars
Today I read a paper titled “Learning Unification-Based Natural Language Grammars”
The abstract is:
When parsing unrestricted language, wide-coverage grammars often undergenerate.
Undergeneration can be tackled either by sentence correction, or by grammar correction.
This thesis concentrates upon automatic grammar correction (or machine learning of grammar) as a solution to the problem of undergeneration.
Broadly speaking, grammar correction approaches can be classified as being either {\it data-driven}, or {\it model-based}.
Data-driven learners use data-intensive methods to acquire grammar.
They typically use grammar formalisms unsuited to the needs of practical text processing and cannot guarantee that the resulting grammar is adequate for subsequent semantic interpretation.
That is, data-driven learners acquire grammars that generate strings that humans would judge to be grammatically ill-formed (they {\it overgenerate}) and fail to assign linguistically plausible parses.
Model-based learners are knowledge-intensive and are reliant for success upon the completeness of a {\it model of grammaticality}.
But in practice, the model will be incomplete.
Given that in this thesis we deal with undergeneration by learning, we hypothesise that the combined use of data-driven and model-based learning would allow data-driven learning to compensate for model-based learning’s incompleteness, whilst model-based learning would compensate for data-driven learning’s unsoundness.
We describe a system that we have used to test the hypothesis empirically.
The system combines data-driven and model-based learning to acquire unification-based grammars that are more suitable for practical text parsing.
Using the Spoken English Corpus as data, and by quantitatively measuring undergeneration, overgeneration and parse plausibility, we show that this hypothesis is correct.
Listening – Wasting Light
This week I am listening to “Wasting Light” by Foo Fighters
Paper – It’s Not What You Have, But How You Use It: Compromises in Mobile Device Use
Today I read a paper titled “It’s Not What You Have, But How You Use It: Compromises in Mobile Device Use”
The abstract is:
As users begin to use many more devices for personal information management (PIM) than just the traditional desktop computer, it is essential for HCI researchers to understand how these devices are being used in the wild and their roles in users’ information environments.
We conducted a study of 220 knowledge workers about their devices, the activities they performed on each, and the groups of devices used together.
Our findings indicate that several devices are often used in groups; integrated multi-function portable devices have begun to replace single-function devices for communication (e.g., email and IM).
Users adopt certain features opportunistically because they happen to be carrying a multi-function device with them.
The use of multiple devices and multi-function devices is fraught with compromises as users must choose and make trade-offs among various factors.
Paper – Gesture Recognition with a Focus on Important Actions by Using a Path Searching Method in Weighted Graph
Today I read a paper titled “Gesture Recognition with a Focus on Important Actions by Using a Path Searching Method in Weighted Graph”
The abstract is:
This paper proposes a method of gesture recognition with a focus on important actions for distinguishing similar gestures.
The method generates a partial action sequence using optical flow images, expresses the sequence in an eigenspace, and checks the feature vector sequence by applying an optimum path-searching method on a weighted graph to focus on the important actions.
Also presented are the results of an experiment on the recognition of similar sign language words.
Listening – Live.Love.A$AP
This week I am listening to “Live.Love.A$AP” by A$AP Rocky
Paper – A Framework for Designing 3D Virtual Environments
Today I read a paper titled “A Framework for Designing 3D Virtual Environments”
The abstract is:
The process of designing and developing virtual environments can be supported by tools and frameworks that save time on technical aspects and allow designers to focus on the content.
In this paper we present an academic framework which provides several levels of abstraction to ease this work.
It includes state-of-the-art components that we devised or integrated, adopting open-source solutions in order to address specific problems.
Its architecture is modular and customizable, and the code is open-source.
Read – Lucky Luke #29 – Des Barbelés sur la Prairie
Today I finished reading “Lucky Luke #29 – Des Barbelés sur la Prairie” by Rene Goscinny
Read – The 10X Rule
Today I finished reading “The 10X Rule: The Only Difference Between Success and Failure” by Grant Cardone
Paper – Singularity Analysis of Limited-dof Parallel Manipulators using Grassmann-Cayley Algebra
Today I read a paper titled “Singularity Analysis of Limited-dof Parallel Manipulators using Grassmann-Cayley Algebra”
The abstract is:
This paper characterizes geometrically the singularities of limited DOF parallel manipulators.
The geometric conditions associated with the dependency of the six Plücker vectors of lines (finite and infinite) constituting the rows of the inverse Jacobian matrix are formulated using Grassmann-Cayley algebra.
Manipulators under consideration do not need to have a passive spherical joint somewhere in each leg.
This study is illustrated with three example robots.
Listening – Every Kingdom
This week I am listening to “Every Kingdom” by Ben Howard
Paper – Robot Swarms in an Uncertain World: Controllable Adaptability
Today I read a paper titled “Robot Swarms in an Uncertain World: Controllable Adaptability”
The abstract is:
There is a belief that complexity and chaos are essential for adaptability.
But life deals with complexity every moment, without the chaos that engineers so fear, by invoking goal-directed behaviour.
Goals can be programmed.
That is why living organisms give us hope to achieve adaptability in robots.
In this paper we describe a method for modelling goal-directed, or programmed, behaviour interacting with the uncertainty of the environment.
We suggest reducing the structural components (goals, intentions) and stochastic components (the probability of realising the goal) of individual behaviour to random variables with nominal values, in order to apply a probabilistic approach.
This allows us to use a Normalized Entropy Index to detect the system state by estimating the contribution of each agent to the group behaviour.
The number of possible group states is 27.
We argue that adaptation has a limited number of possible paths between these 27 states.
Paths and states can be programmed so that after adjustment to any particular case of task and conditions, adaptability will never involve chaos.
We suggest the application of the model to operation of robots or other devices in remote and/or dangerous places.
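A Normalized Entropy Index is, in essence, Shannon entropy scaled by its maximum value. Here is a quick sketch with invented agent counts (the three behavioural states and the group sizes are my assumptions, not the paper's) showing how it flags whether a group is concentrated in one state or spread across several:

```python
import math

def normalized_entropy(counts):
    """Shannon entropy of a discrete distribution, divided by its maximum
    possible value (log of the number of categories), giving [0, 1]."""
    total = sum(counts)
    probs = [c / total for c in counts if c]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(counts)) if len(counts) > 1 else 0.0

# Hypothetical example: 10 robots distributed over 3 behavioural states.
print(normalized_entropy([10, 0, 0]))   # all agents in one state -> 0.0
print(normalized_entropy([4, 3, 3]))    # agents spread across states -> near 1
```

An index near 0 means the swarm is locked into a single state; an index near 1 means the agents are spread evenly, which is the kind of system-state signal the authors use to track the group's position among its possible states.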
Read – The Feynman Lectures on Physics Vol 6: On Fundamentals/Kinetics & Heat
Today I finished reading “The Feynman Lectures on Physics Vol 6: On Fundamentals/Kinetics & Heat” by Richard Feynman
Paper – An Estimation of the Shortest and Largest Average Path Length in Graphs of Given Density
Today I read a paper titled “An Estimation of the Shortest and Largest Average Path Length in Graphs of Given Density”
The abstract is:
Many real world networks (graphs) are observed to be ‘small worlds’, i.e., the average path length among nodes is small.
On the other hand, it is somewhat unclear what other average path length values networks can produce.
In particular, it is not known what the maximum and the minimum average path length values are.
In this paper we provide an estimate of the shortest possible average path length (l) in connected networks, and of the largest possible average path length in networks of given size and density.
To the latter end, we construct a special family of graphs and calculate their average path lengths.
We also demonstrate the correctness of our estimates by simulation.
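As a sanity check on the intuition, average path length is easy to compute by BFS, and at the sparsest connected density (n − 1 edges) the extremes are well known: a path graph maximizes the average distance at (n + 1)/3, while a star minimizes it. The sketch below is my own illustration of those extremes, not the authors' graph family.

```python
from collections import deque

def average_path_length(adj):
    """Mean shortest-path distance over all ordered node pairs (BFS per node)."""
    n = len(adj)
    total = 0
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
    return total / (n * (n - 1))

n = 30
# Two connected graphs with exactly n - 1 edges:
path = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}
star = {0: list(range(1, n)), **{i: [0] for i in range(1, n)}}

print(average_path_length(path))   # equals (n + 1) / 3 for a path graph
print(average_path_length(star))   # close to 2 for a star
```

The gap between the two values shows how much room density alone leaves open, which is why the paper needs a constructed family of graphs to pin down the largest possible value at a given density.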
Studying – Hand lettering with Illustrator
This month I am studying “Hand lettering with Illustrator”
Listening – Ceremonials
This week I am listening to “Ceremonials” by Florence And The Machine