Studying – Tribal illustration with Illustrator
This month I am studying “Tribal illustration with Illustrator”
Listening – The Year Of Hibernation
This week I am listening to “The Year Of Hibernation” by Youth Lagoon
Paper – Dynamics of Majority Rule with Differential Latencies
Today I read a paper titled “Dynamics of Majority Rule with Differential Latencies”
The abstract is:
We investigate the dynamics of the majority-rule opinion formation model when voters experience differential latencies.
With this extension, voters that just adopted an opinion go into a latent state during which they are excluded from the opinion formation process.
The duration of the latent state depends on the opinion adopted by the voter.
The net result is a bias towards consensus on the opinion that is associated with the shorter latency.
We determine the exit probability and time to consensus for systems of $N$ voters.
Additionally, we derive an asymptotic characterisation of the time to consensus by means of a continuum model.
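The latency bias is easy to play with in a toy simulation. Here's my own sketch of the model as I understand it from the abstract (group size, latency values, population size and trial count are all made up, not taken from the paper): opinion 0 carries the shorter latency, so it should win consensus more often.

```python
import random

def majority_rule_latency(n=20, latency=(0, 50), steps=10000, rng=None):
    """Majority rule in groups of 3; a voter that switches to opinion k
    becomes latent (excluded from groups) for latency[k] steps.
    Returns the consensus opinion (0 or 1), or None if none is reached."""
    rng = rng or random.Random()
    opinion = [rng.randrange(2) for _ in range(n)]
    wake = [0] * n                       # step at which each voter is active again
    for t in range(steps):
        active = [i for i in range(n) if wake[i] <= t]
        if len(active) < 3:
            continue
        group = rng.sample(active, 3)
        maj = 1 if sum(opinion[i] for i in group) >= 2 else 0
        for i in group:
            if opinion[i] != maj:        # just adopted an opinion -> go latent
                opinion[i] = maj
                wake[i] = t + latency[maj]
        if all(o == opinion[0] for o in opinion):
            return opinion[0]
    return None

rng = random.Random(42)
wins = [0, 0]
for _ in range(300):
    winner = majority_rule_latency(rng=rng)
    if winner is not None:
        wins[winner] += 1
```

With these numbers, opinion 0 (zero latency) should reach consensus clearly more often than opinion 1, matching the bias toward the shorter latency.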
Paper – An Intelligent Multi-Agent Recommender System for Human Capacity Building
Today I read a paper titled “An Intelligent Multi-Agent Recommender System for Human Capacity Building”
The abstract is:
This paper presents a Multi-Agent approach to the problem of recommending training courses to engineering professionals.
The recommendation system is built as a proof of concept and limited to the electrical and mechanical engineering disciplines.
Through user modelling and data collection from a survey, collaborative filtering recommendation is implemented using intelligent agents.
The agents work together in recommending meaningful training courses and updating the course information.
The system uses a user’s profile and keywords from courses to rank courses.
A course-ranking accuracy of 90% is achieved, while flexibility comes from an agent that autonomously retrieves information from websites using data-mining techniques.
This manner of recommendation is scalable and adaptable.
Further improvements can be made using clustering and recording user feedback.
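The collaborative-filtering part is easy to sketch. The ratings below are invented (the paper's survey data isn't public), and this user-based cosine-similarity scheme is just one standard way to do it, not necessarily what the agents actually implement.

```python
from math import sqrt

# Invented user-course ratings; the paper's survey data is not public.
ratings = {
    "alice": {"plc_basics": 5, "circuit_design": 4, "cad_intro": 1},
    "bob":   {"plc_basics": 4, "circuit_design": 5, "thermo": 2},
    "carol": {"cad_intro": 5, "thermo": 4, "plc_basics": 1},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors."""
    if not set(u) & set(v):
        return 0.0
    num = sum(u[c] * v.get(c, 0) for c in u)
    den = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return num / den

def recommend(user, k=2):
    """Rank courses the user has not rated by similarity-weighted scores."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        s = cosine(ratings[user], ratings[other])
        for course, r in ratings[other].items():
            if course not in ratings[user]:
                scores[course] = scores.get(course, 0.0) + s * r
    return sorted(scores, key=scores.get, reverse=True)[:k]
```

Calling `recommend("alice")` ranks the courses alice hasn't rated by how similar raters scored them.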
Paper – Harmonic Functions for Data Reconstruction on 3D Manifolds
Today I read a paper titled “Harmonic Functions for Data Reconstruction on 3D Manifolds”
The abstract is:
In computer graphics, smooth data reconstruction on 2D or 3D manifolds usually refers to subdivision problems.
Such a method is valid only when the sample points are dense.
The manifold usually needs to be triangulated into meshes (or patches), and each node on the mesh is given an initial value.
As the mesh is refined, the algorithm provides a smooth function on the refined manifold.
However, when data points are not dense and the original mesh is not allowed to be changed, how is the “continuous and/or smooth” reconstruction possible? This paper will present a new method using harmonic functions to solve the problem.
Our method contains the following steps: (1) Partition the boundary surfaces of the 3D manifold based on sample points so that each sample point is on the edge of the partition.
(2) Use gradually varied interpolation on the edges so that each point on edge will be assigned a value.
In addition, all values on the edge are gradually varied.
(3) Use a discrete harmonic function to fit the unknown points, i.e. the points inside each partition patch.
The fitted function will be a harmonic or a local harmonic function in each partitioned area.
The function on edge will be “near” continuous (or “near” gradually varied).
If we need a smoothed surface on the manifold, we can apply subdivision algorithms.
This paper also has a philosophical advantage over triangulation meshes.
People usually use triangulation for data reconstruction.
This paper employs harmonic functions, a generalization of triangulation, since linear functions are a special case of harmonic functions.
Therefore, local harmonic initialization is more sophisticated than triangulation.
This paper is a conceptual and methodological paper.
This paper does not focus on detailed mathematical analysis nor fine algorithm design.
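Step (3), fitting interior points with a discrete harmonic function, can be sketched with plain Gauss-Seidel relaxation of the 5-point Laplacian. This is my own toy version on a grid rather than on a triangulated manifold. Since linear functions are harmonic, fixing the boundary to f(i, j) = i + j should reproduce i + j in the interior.

```python
def harmonic_fill(grid, mask, iters=2000):
    """Gauss-Seidel relaxation of the 5-point Laplacian: cells where
    mask is True are repeatedly replaced by the average of their four
    neighbours, converging to the discrete harmonic function that
    matches the fixed (mask False) cells."""
    rows, cols = len(grid), len(grid[0])
    for _ in range(iters):
        for i in range(1, rows - 1):
            for j in range(1, cols - 1):
                if mask[i][j]:
                    grid[i][j] = (grid[i - 1][j] + grid[i + 1][j] +
                                  grid[i][j - 1] + grid[i][j + 1]) / 4.0
    return grid

# Boundary fixed to f(i, j) = i + j; linear functions are harmonic,
# so the interior should converge to i + j as well.
n = 9
boundary = lambda i, j: i in (0, n - 1) or j in (0, n - 1)
grid = [[float(i + j) if boundary(i, j) else 0.0 for j in range(n)] for i in range(n)]
mask = [[not boundary(i, j) for j in range(n)] for i in range(n)]
harmonic_fill(grid, mask)
```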
Listening – The King is Dead
This week I am listening to “The King is Dead” by The Decemberists
Paper – Stratified economic exchange on networks
Today I read a paper titled “Stratified economic exchange on networks”
The abstract is:
We investigate a model of stratified economic interactions between agents when the notion of spatial location is introduced.
The agents are placed on a network with near-neighbor connections.
Interactions between neighbors can occur only if the difference in their wealth is less than a threshold value that defines the width of the economic classes.
By employing concepts from spatiotemporal dynamical systems, three types of patterns can be identified in the system as parameters are varied: laminar, intermittent and turbulent states.
The transition from the laminar state to the turbulent state is characterized by the activity of the system, a quantity that measures the average exchange of wealth over long times.
The degree of inequality in the wealth distribution for different parameter values is characterized by the Gini Coefficient.
High levels of activity are associated with low values of the Gini coefficient.
It is found that the topological properties of the network have little effect on the activity of the system, but the Gini coefficient increases when the clustering coefficient of the network is increased.
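Here's a stdlib-only toy version of the exchange rule as I read it, with made-up parameters (the paper's exchange rule and network details surely differ): ring neighbors trade a fixed amount only when their wealth gap is under the class-width threshold, and the Gini coefficient summarizes the resulting inequality.

```python
import random

def gini(w):
    """Gini coefficient of a wealth list (0 = perfect equality)."""
    w = sorted(w)
    n = len(w)
    cum = sum((i + 1) * x for i, x in enumerate(w))
    return (2.0 * cum) / (n * sum(w)) - (n + 1.0) / n

def exchange_on_ring(n=100, threshold=0.5, steps=50000, dw=0.1, seed=1):
    """Agents on a ring exchange a fixed amount dw with a random
    near-neighbor, but only if their wealth difference is below
    `threshold` (the width of the economic classes)."""
    rng = random.Random(seed)
    wealth = [1.0] * n
    for _ in range(steps):
        i = rng.randrange(n)
        j = (i + rng.choice((-1, 1))) % n        # near-neighbor on the ring
        if abs(wealth[i] - wealth[j]) < threshold and wealth[j] >= dw:
            wealth[i] += dw
            wealth[j] -= dw
    return wealth

wealth = exchange_on_ring()
g = gini(wealth)
```

Total wealth is conserved by construction, so only its distribution (and hence `g`) changes.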
Paper – Modeling Microscopic Chemical Sensors in Capillaries
Today I read a paper titled “Modeling Microscopic Chemical Sensors in Capillaries”
The abstract is:
Nanotechnology-based microscopic robots could provide accurate in vivo measurement of chemicals in the bloodstream for detailed biological research and as an aid to medical treatment.
Quantitative performance estimates of such devices require models of how chemicals in the blood diffuse to the devices.
This paper models microscopic robots and red blood cells (erythrocytes) in capillaries using realistic distorted cell shapes.
The models evaluate two sensing scenarios: robots moving with the cells past a chemical source on the vessel wall, and robots attached to the wall for longer-term chemical monitoring.
Using axial symmetric geometry with realistic flow speeds and diffusion coefficients, we compare detection performance with a simpler model that does not include the cells.
The average chemical absorption is quantitatively similar in both models, indicating the simpler model is an adequate design guide to sensor performance in capillaries.
However, determining the variation in forces and absorption as cells move requires the full model.
Watching – Revenge of the Electric Car
Today I watched “Revenge of the Electric Car”
Read – Everyone Communicates, Few Connect
Today I finished reading “Everyone Communicates, Few Connect: What the Most Effective People Do Differently” by John C. Maxwell
Paper – Virtual Environments for Training: From Individual Learning to Collaboration with Humanoids
Today I read a paper titled “Virtual Environments for Training: From Individual Learning to Collaboration with Humanoids”
The abstract is:
The next generation of virtual environments for training is oriented towards collaborative aspects.
Therefore, we have decided to enhance our platform for virtual training environments, adding collaboration opportunities and integrating humanoids.
In this paper we put forward a model of humanoid that suits both virtual humans and representations of real users, according to collaborative training activities.
We suggest adaptations to the scenario model of our platform making it possible to write collaborative procedures.
We introduce a mechanism of action selection made up of a global repartition and an individual choice.
These models are currently being integrated and validated in GVT, a virtual training tool for the maintenance of military equipment, developed in collaboration with the French company NEXTER-Group.
Listening – Nine Types Of Light
This week I am listening to “Nine Types Of Light” by TV On The Radio
Read – Game Development Essentials: Mobile Game Development
Today I finished reading “Game Development Essentials: Mobile Game Development” by Kimberly Unger
Paper – Efficient IRIS Recognition through Improvement of Feature Extraction and subset Selection
Today I read a paper titled “Efficient IRIS Recognition through Improvement of Feature Extraction and subset Selection”
The abstract is:
The selection of the optimal feature subset and the classification has become an important issue in the field of iris recognition.
In this paper we propose several methods for iris feature subset selection and vector creation.
The deterministic feature sequence is extracted from the iris image by using the contourlet transform technique.
Contourlet transform captures the intrinsic geometrical structures of iris image.
It decomposes the iris image into a set of directional sub-bands, with texture details captured in different orientations at various scales; to reduce the feature-vector dimensions, we use a method that extracts only the significant bits of information from normalized iris images.
In this method we ignore fragile bits.
Finally, we use an SVM (Support Vector Machine) classifier to estimate identification performance in our proposed system.
Experimental results show that the proposed method reduces processing time, increases classification accuracy, and yields a much shorter iris feature vector than other methods.
Paper – On-Line Tests
Today I read a paper titled “On-Line Tests”
The abstract is:
This paper presents an interactive implementation that links a human operator with an administration system for MySQL relational databases.
This application, conceived as a multimedia presentation, illustrates how the transfer and reshaping of information between the human operator, the data-processing module, and the database that stores the information can be handled (with the help of the PHP language and the web).
Listening – Heritage
This week I am listening to “Heritage” by Opeth
Paper – Design of moveable and resizable graphics
Today I read a paper titled “Design of moveable and resizable graphics”
The abstract is:
We communicate with computers on two different levels.
On the upper level we have a very flexible system of windows: we can move them, resize them, overlap them or put them side by side.
At any moment we decide what the best view would be and reorganize the whole view easily.
Then we start an application, go to the inner level, and everything changes.
Here we are stripped of all that flexibility and can work only inside the scenario developed by the designer of the program.
The interface will allow us to change some tiny details, but in general everything is fixed: graphics is neither moveable nor resizable, and the same is true of controls.
The author designed an extremely powerful mechanism for turning graphical objects and controls moveable and resizable.
This can not only significantly improve existing applications, but can bring applications to another level.
(To estimate the possible difference, try to imagine the Windows system without its flexibility and compare it with the current one.) This article explains in detail the construction and use of moveable and resizable graphical objects.
Read – Yotsuba! Vol. 11
Today I finished reading “Yotsuba! Vol. 11” by Kiyohiko Azuma
Paper – Dynamic Logic of Common Knowledge in a Proof Assistant
Today I read a paper titled “Dynamic Logic of Common Knowledge in a Proof Assistant”
The abstract is:
Common Knowledge Logic is meant to describe situations of the real world where a group of agents is involved.
These agents share knowledge and make strong statements on the knowledge of the other agents (the so called \emph{common knowledge}).
But as we know, the real world changes and overall information on what is known about the world changes as well.
The changes are described by dynamic logic.
To describe knowledge changes, dynamic logic should be combined with logic of common knowledge.
In this paper we describe experiments which we have made about the integration in a unique framework of common knowledge logic and dynamic logic in the proof assistant \Coq.
This results in a set of fully checked proofs for readable statements.
We describe the framework and how a proof can be built.
Watching – Serenity
Today I watched “Serenity”
Studying – Lettering comic books with Illustrator
This month I am studying “Lettering comic books with Illustrator”
Listening – Watch The Throne
This week I am listening to “Watch The Throne” by Jay-Z and Kanye West
Read – Usagi Yojimbo #25: Fox Hunt
Today I finished reading “Usagi Yojimbo #25: Fox Hunt” by Stan Sakai
Paper – Optimal Sensor Configurations for Rectangular Target Detection
Today I read a paper titled “Optimal Sensor Configurations for Rectangular Target Detection”
The abstract is:
Optimal search strategies where targets are observed at several different angles are found.
Targets are assumed to exhibit rectangular symmetry and have a uniformly-distributed orientation.
By rectangular symmetry, it is meant that one side of a target is the mirror image of its opposite side.
Finding an optimal solution is generally a hard problem.
Fortunately, symmetry principles allow analytical and intuitive solutions to be found.
One such optimal search strategy consists of choosing n angles evenly separated on the half-circle and leads to a lower bound of the probability of not detecting targets.
As no prior knowledge of the target orientation is required, such search strategies are also robust, a desirable feature in search and detection missions.
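The optimal configuration the abstract describes, n angles evenly separated on the half-circle, is a one-liner (assuming angles in radians; for a target with rectangular symmetry, views at theta and theta + pi are equivalent, which is why the half-circle suffices).

```python
from math import pi

def evenly_spaced_angles(n):
    """n observation angles evenly separated on the half-circle [0, pi).
    For targets with rectangular symmetry, angles theta and theta + pi
    give equivalent views, so the half-circle covers all distinct ones."""
    return [k * pi / n for k in range(n)]

angles = evenly_spaced_angles(4)
```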
Paper – The Information Theory of Emotions of Musical Chords
Today I read a paper titled “The Information Theory of Emotions of Musical Chords”
The abstract is:
The paper offers a solution to the centuries-old puzzle – why the major chords are perceived as happy and the minor chords as sad – based on the information theory of emotions.
A theory and a formula of musical emotions were created.
They define the sign and the amplitude of the utilitarian emotional coloration of separate major and minor chords through relative pitches of constituent sounds.
Keywords: chord, major, minor, the formula of musical emotions, the information theory of emotions.
Paper – Size matters: performance declines if your pixels are too big or too small
Today I read a paper titled “Size matters: performance declines if your pixels are too big or too small”
The abstract is:
We present a conceptual model that describes the effect of pixel size on target acquisition.
We demonstrate the use of our conceptual model by applying it to predict and explain the results of an experiment to evaluate users’ performance in a target acquisition task involving three distinct display sizes: standard desktop, small and large displays.
The results indicate that users are fastest on standard desktop displays, undershoots are the most common error on small displays and overshoots are the most common error on large displays.
We propose heuristics to maintain usability when changing displays.
Finally, we contribute to the growing body of evidence that amplitude does affect performance in a display-based pointing task.
Paper – SimDialog: A visual game dialog editor
Today I read a paper titled “SimDialog: A visual game dialog editor”
The abstract is:
SimDialog is a visual editor for dialog in computer games.
This paper presents the design of SimDialog, illustrating how script writers and non-programmers can easily create dialog for video games with complex branching structures and dynamic response characteristics.
The system creates dialog as a directed graph.
This allows for play using the dialog with a state-based cause and effect system that controls selection of non-player character responses and can provide a basic scoring mechanism for games.
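A dialog graph with state-based response selection can be sketched in a few lines. The class and field names here are my own invention for illustration, not SimDialog's actual data model.

```python
class DialogNode:
    """One line of game dialog; edges carry conditions on game state."""
    def __init__(self, text, score=0):
        self.text = text
        self.score = score          # basic scoring hook, as in the abstract
        self.edges = []             # list of (condition, target) pairs

    def add_edge(self, target, condition=lambda state: True):
        self.edges.append((condition, target))

    def next_node(self, state):
        """Follow the first outgoing edge whose condition holds."""
        for condition, target in self.edges:
            if condition(state):
                return target
        return None

greet = DialogNode("Hello, traveller.")
friendly = DialogNode("Good to see you again!", score=10)
hostile = DialogNode("You again. Leave.", score=-5)
greet.add_edge(friendly, lambda s: s.get("reputation", 0) >= 0)
greet.add_edge(hostile)             # unconditional fallback response

node = greet.next_node({"reputation": -3})
```

The game state (here just a reputation value) drives which non-player response is chosen, and the per-node `score` gives the basic scoring mechanism.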
Paper – Persistent Robotic Tasks: Monitoring and Sweeping in Changing Environments
Today I read a paper titled “Persistent Robotic Tasks: Monitoring and Sweeping in Changing Environments”
The abstract is:
We present controllers that enable mobile robots to persistently monitor or sweep a changing environment.
The changing environment is modeled as a field which grows in locations that are not within range of a robot, and decreases in locations that are within range of a robot.
We assume that the robots travel on given closed paths.
The speed of each robot along its path is controlled to prevent the field from growing unbounded at any location.
We consider the space of speed controllers that can be parametrized by a finite set of basis functions.
For a single robot, we develop a linear program that is guaranteed to compute a speed controller in this space to keep the field bounded, if such a controller exists.
Another linear program is then derived whose solution is the speed controller that minimizes the maximum field value over the environment.
We extend our linear program formulation to develop a multi-robot controller that keeps the field bounded.
The multi-robot controller has the unique feature that it does not require communication among the robots.
Simulation studies demonstrate the robustness of the controllers to modeling errors, and to stochasticity in the environment.
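As a much-simplified stand-in for the paper's linear-programming controllers, here is a toy discrete simulation of the field model: the field grows everywhere except at the robot's current position on a closed path, and a faster robot keeps the peak field lower. All constants are invented, and the controller here is just a constant speed, not the paper's optimized one.

```python
def simulate_field(speed, growth=1.0, decay=25.0, cells=20, steps=4000, dt=0.05):
    """One robot moving at constant `speed` around a closed path of
    `cells` locations.  The field grows at rate `growth` everywhere
    except the robot's current cell, where it decreases at rate `decay`
    (clamped at zero).  Returns the peak field value seen after the
    transient (second half of the run)."""
    field = [0.0] * cells
    pos = 0.0
    worst = 0.0
    for t in range(steps):
        pos = (pos + speed * dt) % cells
        r = int(pos)
        for i in range(cells):
            if i == r:
                field[i] = max(0.0, field[i] - decay * dt)
            else:
                field[i] += growth * dt
        if t > steps // 2:
            worst = max(worst, max(field))
    return worst
```

With `decay` large enough the field stays bounded, and a faster robot achieves a lower worst-case field, the quantity the paper's second linear program minimizes.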
Listening – Submarine
This week I am listening to “Submarine” by Alex Turner
Read – Game Programming Patterns
Today I finished reading “Game Programming Patterns” by Robert Nystrom
Read – The Walking Dead, Book Seven
Today I finished reading “The Walking Dead, Book Seven” by Robert Kirkman
Paper – Narrative Bridging – a specification of a modelling method for game design
Today I read a paper titled “Narrative Bridging – a specification of a modelling method for game design”
The abstract is:
Very little has been explored about the narrative as a process when constructing entertainment for interactive media.
Simultaneously, interest in narrative vehicles increases, while certain preoccupations that see the narrative as a structure obscure the examination of the process of selecting, arranging and rendering story material.
To correct this deficiency, a method of narrative bridging is proposed that encourages research and design while exploring narration as a process, with the aim of not diminishing the properties of the interactive media.
This method focuses on the initial phase where establishing and handling the information takes place and creates a foundation that precedes its systematization and computation.
The aim is to give designers a comfortable design tool that firmly aids the design without interfering with creativity, and at the same time aids the construction of interplay between narration, spatiality and interactivity.
The method aided the practise of a discipline that was established through a qualitative study conducted as part of a university course in rapid prototyping.
The results demonstrated that the method aided time-constrained design processes, simultaneously detecting inconsistencies that would prevent the team from making improvements.
The method gave the team a shared vocabulary and outlook, allowing them to progress without interfering with the creative flow.
This enabled the team to reason about the process and easily advise design stakeholders.
The study also provides directions for future developments within research of narrative processes in game design.
Paper – Affective Ludology, Flow and Immersion in a First-Person Shooter: Measurement of Player Experience
Today I read a paper titled “Affective Ludology, Flow and Immersion in a First-Person Shooter: Measurement of Player Experience”
The abstract is:
Gameplay research about experiential phenomena is a challenging undertaking, given the variety of experiences that gamers encounter when playing and which currently do not have a formal taxonomy, such as flow, immersion, boredom, and fun.
These informal terms require a scientific explanation.
Ludologists also acknowledge the need to understand cognition, emotion, and goal-oriented behavior of players from a psychological perspective by establishing rigorous methodologies.
This paper builds upon and extends prior work in an area for which we would like to coin the term “affective ludology.” The area is concerned with the affective measurement of player-game interaction.
The experimental study reported here investigated different traits of gameplay experience using subjective (i.e., questionnaires) and objective (i.e., psychophysiological) measures.
Participants played three Half-Life 2 game level design modifications while measures such as electromyography (EMG), electrodermal activity (EDA) were taken and questionnaire responses were collected.
A level designed for combat-oriented flow experience demonstrated significant high-arousal positive affect emotions.
This method shows that emotional patterns emerge from different level designs, which has great potential for providing real-time emotional profiles of gameplay that may be generated together with self-reported subjective player experience descriptions.
Read – Chemistry Demystified
Today I finished reading “Chemistry Demystified” by Linda Williams
Paper – Quarantine generated phase transition in epidemic spreading
Today I read a paper titled “Quarantine generated phase transition in epidemic spreading”
The abstract is:
We study the critical effect of quarantine on the propagation of epidemics on an adaptive network of social contacts.
For this purpose, we analyze the susceptible-infected-recovered (SIR) model in the presence of quarantine, where susceptible individuals protect themselves by disconnecting their links to infected neighbors with probability w, and reconnecting them to other susceptible individuals chosen at random.
Starting from a single infected individual, we show by an analytical approach and simulations that there is a phase transition at a critical rewiring (quarantine) threshold w_c, separating a phase (w < w_c) in which the epidemic spreads over a large fraction of the population from a phase (w >= w_c) in which the epidemic dies out.
We find that in our model the topology of the network strongly affects the size of the propagation, and that w_c increases with the mean degree and heterogeneity of the network.
We also find that w_c is reduced if we perform a preferential rewiring, in which the rewiring probability is proportional to the degree of infected nodes.
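The adaptive SIR model is straightforward to simulate with the stdlib. This is my own toy implementation on a ring lattice (the paper likely uses other network ensembles and a different update scheme): susceptible nodes break links to infected neighbors with probability w and reconnect to random susceptibles.

```python
import random

def sir_rewire(n=500, k=4, w=0.0, beta=1.0, recover=0.2, seed=7):
    """SIR on a ring lattice where each node links to its k nearest
    neighbours.  Susceptible nodes break a link to an infected neighbour
    with probability w and reconnect to a random susceptible node.
    Returns the final fraction of the population ever infected."""
    rng = random.Random(seed)
    nbrs = {i: set((i + d) % n for d in range(-k // 2, k // 2 + 1) if d)
            for i in range(n)}
    state = ["S"] * n
    state[0] = "I"
    infected = {0}
    while infected:
        new_inf, new_rec = set(), set()
        for i in list(infected):
            for j in list(nbrs[i]):
                if state[j] == "S":
                    if rng.random() < w:           # quarantine: rewire away
                        nbrs[i].discard(j); nbrs[j].discard(i)
                        cands = [x for x in range(n) if state[x] == "S" and x != j]
                        if cands:
                            x = rng.choice(cands)
                            nbrs[j].add(x); nbrs[x].add(j)
                    elif rng.random() < beta:      # otherwise, possible infection
                        new_inf.add(j)
            if rng.random() < recover:
                new_rec.add(i)
        for j in new_inf:
            state[j] = "I"
        for i in new_rec:
            state[i] = "R"
        infected = (infected | new_inf) - new_rec
    return sum(1 for s in state if s != "S") / n
```

Without rewiring the epidemic sweeps the lattice; heavy rewiring should starve it, illustrating the two phases on either side of w_c.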
Listening – Knife Man
This week I am listening to “Knife Man” by Andrew Jackson Jihad
Facebooked
Some days I wish I worked at Facebook so it never looked like I was procrastinating
Paper – On-the-fly erasure coding for real-time video applications
Today I read a paper titled “On-the-fly erasure coding for real-time video applications”
The abstract is:
This paper introduces a robust point-to-point transmission scheme: Tetrys, that relies on a novel on-the-fly erasure coding concept which reduces the delay for recovering lost data at the receiver side.
In current erasure coding schemes, the packets that are not rebuilt at the receiver side are either lost or delayed by at least one RTT before transmission to the application.
The present contribution aims at demonstrating that Tetrys coding scheme can fill the gap between real-time applications requirements and full reliability.
Indeed, we show that in several cases, Tetrys can recover lost packets below one RTT over lossy and best-effort networks.
We also show that Tetrys enables full reliability without compromising delay and, as a result, significantly improves the performance of time-constrained applications.
For instance, our evaluations show that video-conferencing applications obtain a PSNR gain of up to 7 dB compared to classic block-based erasure codes.
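The on-the-fly idea, repair packets built from the still-unacknowledged window so a loss can be rebuilt without waiting an RTT, can be illustrated with plain XOR. Tetrys actually uses coefficients over a finite field to handle multiple losses; single-loss XOR is just the simplest instance.

```python
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def repair_packet(window):
    """Repair packet = XOR of every source packet still unacknowledged.
    (Tetrys uses finite-field coefficients; plain XOR recovers one loss.)"""
    out = bytes(len(window[0]))
    for p in window:
        out = xor_bytes(out, p)
    return out

def recover(received, repair):
    """Rebuild the single missing packet from the others plus the repair."""
    out = repair
    for p in received:
        out = xor_bytes(out, p)
    return out

packets = [b"pkt0", b"pkt1", b"pkt2", b"pkt3"]
rep = repair_packet(packets)
# packet 2 is lost in transit; rebuild it without any retransmission:
rebuilt = recover([packets[0], packets[1], packets[3]], rep)
```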
Read – Artificial Intelligence for Computer Games
Today I finished reading “Artificial Intelligence for Computer Games” by Pedro Antonio Gonzalez Calero
Listening – Mylo Xyloto
This week I am listening to “Mylo Xyloto” by Coldplay
Read – Groo: The Hogs of Horder
Today I finished reading “Groo: The Hogs of Horder” by Sergio Aragones
Read – Against All Things Ending
Today I finished reading “Against All Things Ending” by Stephen R. Donaldson
Paper – Predictors of short-term decay of cell phone contacts in a large scale communication network
Today I read a paper titled “Predictors of short-term decay of cell phone contacts in a large scale communication network”
The abstract is:
Under what conditions is an edge present in a social network at time t likely to decay or persist by some future time t + Delta(t)? Previous research addressing this issue suggests that the network range of the people involved in the edge, the extent to which the edge is embedded in a surrounding structure, and the age of the edge all play a role in edge decay.
This paper uses weighted data from a large-scale social network built from cell-phone calls in an 8-week period to determine the importance of edge weight for the decay/persistence process.
In particular, we study the relative predictive power of directed weight, embeddedness, newness, and range (measured as outdegree) with respect to edge decay and assess the effectiveness with which a simple decision tree and logistic regression classifier can accurately predict whether an edge that was active in one time period continues to be so in a future time period.
We find that directed edge weight, weighted reciprocity and time-dependent measures of edge longevity are highly predictive of whether we classify an edge as persistent or decayed, relative to the other types of factors at the dyad and neighborhood level.
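One of the predictors, embeddedness, is simple to compute: the number of common neighbors the two endpoints of an edge share. The toy call graph below is invented; the paper works on a far larger weighted network.

```python
from collections import defaultdict

def build_adjacency(calls):
    """Undirected adjacency from a list of (caller, callee) pairs."""
    adj = defaultdict(set)
    for a, b in calls:
        adj[a].add(b)
        adj[b].add(a)
    return adj

def embeddedness(adj, a, b):
    """Common neighbours of a and b, a classic edge-persistence predictor."""
    return len((adj[a] & adj[b]) - {a, b})

# Invented toy call graph.
calls = [("ann", "bob"), ("ann", "cal"), ("bob", "cal"),
         ("bob", "dee"), ("cal", "dee"), ("ann", "eve")]
adj = build_adjacency(calls)
```

The ann-eve edge has embeddedness 0, so on the paper's account it would be a better candidate for decay than the well-embedded bob-cal edge.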
Paper – The thermodynamics of human reaction times
Today I read a paper titled “The thermodynamics of human reaction times”
The abstract is:
I present a new approach for the interpretation of reaction time (RT) data from behavioral experiments.
From a physical perspective, the entropy of the RT distribution provides a model-free estimate of the amount of processing performed by the cognitive system.
In this way, the focus is shifted from the conventional interpretation of individual RTs being either long or short, into their distribution being more or less complex in terms of entropy.
The new approach enables the estimation of the cognitive processing load without reference to the informational content of the stimuli themselves, thus providing a more appropriate estimate of the cognitive impact of different sources of information that are carried by experimental stimuli or tasks.
The paper introduces the formulation of the theory, followed by an empirical validation using a database of human RTs in lexical tasks (visual lexical decision and word naming).
The results show that this new interpretation of RTs is more powerful than the traditional one.
The method provides theoretical estimates of the processing loads elicited by individual stimuli.
These loads sharply distinguish the responses from different tasks.
In addition, it provides upper-bound estimates for the speed at which the system processes information.
Finally, I argue that the theoretical proposal, and the associated empirical evidence, provide strong arguments for an adaptive system that systematically adjusts its operational processing speed to the particular demands of each stimulus.
This finding is in contradiction with Hick’s law, which posits a relatively constant processing speed within an experimental context.
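The core move, treating the entropy of the RT distribution as a model-free estimate of processing load, is easy to sketch: bin the RTs and compute Shannon entropy. The bin width and data below are my own choices, not the paper's.

```python
from math import log2

def rt_entropy(rts, bin_ms=50):
    """Shannon entropy (bits) of a reaction-time distribution after
    binning into bin_ms-wide bins: a rough, model-free proxy for the
    amount of processing, in the spirit of the paper."""
    counts = {}
    for rt in rts:
        b = int(rt // bin_ms)
        counts[b] = counts.get(b, 0) + 1
    n = len(rts)
    return -sum((c / n) * log2(c / n) for c in counts.values())

uniform_two_bins = [400, 420, 460, 480]   # two bins, two samples each -> 1 bit
single_bin = [410, 420, 430, 440]         # all in one 50 ms bin -> 0 bits
```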
Read – Dilbert and the Way of the Weasel
Today I finished reading “Dilbert and the Way of the Weasel” by Scott Adams
Paper – Rumor Evolution in Social Networks
Today I read a paper titled “Rumor Evolution in Social Networks”
The abstract is:
Social networks are a main channel of rumor spreading.
Previous studies have concentrated on static rumor spreading, in which the content of the rumor is invariable during the whole spreading process.
In reality, a rumor evolves constantly as it spreads, growing shorter, more concise, and more easily grasped and told.
In an early psychological experiment, researchers found about 70% of details in a rumor were lost in the first 6 mouth-to-mouth transmissions \cite{TPR}.
Based on the facts, we investigate rumor spreading on social networks, where the content of the rumor is modified by the individuals with a certain probability.
In the scenario, they have two choices, to forward or to modify.
As a forwarder, an individual disseminates the rumor directly to its neighbors.
As a modifier, conversely, an individual revises the rumor before spreading it out.
When the rumor spreads on the social networks, for instance, scale-free networks and small-world networks, the majority of individuals actually are infected by the multi-revised version of the rumor, if the modifiers dominate the networks.
Our observation indicates that the original rumor may lose its influence in the spreading process.
Similarly, true information may turn into a rumor as well.
Our results suggest that rumor evolution is not a negligible question, and may provide a better understanding of the generation and destruction of rumors.
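The forward-or-modify process is a small BFS-style simulation. This is my own toy version (random contact graph, a version counter standing in for content changes, all parameters invented): with many modifiers, most informed individuals end up holding a revised version of the rumor.

```python
import random

def spread_rumor(n=200, p_modify=0.3, seed=3):
    """Rumor spreading on a random contact graph where each spreader
    either forwards the version it holds or, with probability p_modify,
    modifies it first (incrementing a version counter).  Returns the
    fraction of informed individuals holding a modified (version > 0)
    rumor."""
    rng = random.Random(seed)
    nbrs = {i: set() for i in range(n)}
    for i in range(n):                      # sparse random contacts
        for _ in range(3):
            j = rng.randrange(n)
            if j != i:
                nbrs[i].add(j)
                nbrs[j].add(i)
    version = {0: 0}                        # node -> rumor version held
    frontier = [0]
    while frontier:
        nxt = []
        for i in frontier:
            v = version[i]
            if rng.random() < p_modify:     # modifier: revise before telling
                v += 1
            for j in nbrs[i]:
                if j not in version:
                    version[j] = v
                    nxt.append(j)
        frontier = nxt
    mods = sum(1 for v in version.values() if v > 0)
    return mods / len(version)
```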
Studying – Designing templates with Illustrator
This month I am studying “Designing templates with Illustrator”
Listening – Days
This week I am listening to “Days” by Real Estate
Read – The 15 Invaluable Laws of Growth
Today I finished reading “The 15 Invaluable Laws of Growth: Live Them and Reach Your Potential” by John Maxwell
Paper – Time-Dependent 2-D Vector Field Topology: An Approach Inspired by Lagrangian Coherent Structures
Today I read a paper titled “Time-Dependent 2-D Vector Field Topology: An Approach Inspired by Lagrangian Coherent Structures”
The abstract is:
This paper presents an approach to a time-dependent variant of the concept of vector field topology for 2-D vector fields.
Vector field topology is defined for steady vector fields and aims at discriminating the domain of a vector field into regions of qualitatively different behaviour.
The presented approach is a generalization of saddle-type critical points and their separatrices to unsteady vector fields, based on generalized streak lines, with classical vector field topology as its special case for steady vector fields.
The concept is closely related to that of Lagrangian coherent structures obtained as ridges in the finite-time Lyapunov exponent field.
The proposed approach is evaluated on both synthetic 2-D time-dependent vector fields and vector fields from computational fluid dynamics.
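The FTLE machinery the approach is related to can be sketched end to end for a steady saddle flow v = (x, -y), whose finite-time Lyapunov exponent is exactly 1, so the numerics are easy to check. RK4 step count and the finite-difference epsilon are arbitrary choices of mine.

```python
from math import log, sqrt

def f(p):
    """Steady saddle velocity field v = (x, -y)."""
    return (p[0], -p[1])

def flow_map(x, y, T=1.0, steps=200):
    """Advect a particle for time T with classical RK4."""
    dt = T / steps
    for _ in range(steps):
        p = (x, y)
        k1 = f(p)
        k2 = f((p[0] + dt / 2 * k1[0], p[1] + dt / 2 * k1[1]))
        k3 = f((p[0] + dt / 2 * k2[0], p[1] + dt / 2 * k2[1]))
        k4 = f((p[0] + dt * k3[0], p[1] + dt * k3[1]))
        x = p[0] + dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        y = p[1] + dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
    return x, y

def ftle(x, y, T=1.0, eps=1e-4):
    """Finite-time Lyapunov exponent via central differences of the flow map."""
    xr, yr = flow_map(x + eps, y, T); xl, yl = flow_map(x - eps, y, T)
    xu, yu = flow_map(x, y + eps, T); xd, yd = flow_map(x, y - eps, T)
    a = (xr - xl) / (2 * eps); b = (xu - xd) / (2 * eps)
    c = (yr - yl) / (2 * eps); d = (yu - yd) / (2 * eps)
    # largest eigenvalue of the Cauchy-Green tensor of [[a, b], [c, d]]
    g11 = a * a + c * c; g12 = a * b + c * d; g22 = b * b + d * d
    lam = (g11 + g22) / 2 + sqrt(((g11 - g22) / 2) ** 2 + g12 ** 2)
    return log(sqrt(lam)) / T
```

Ridges of this FTLE field are the Lagrangian coherent structures the paper relates its time-dependent topology to.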