This week I am listening to “The Dark Knight” by Hans Zimmer & James Newton Howard
Archives for 2009
Read – Living the 80/20 Way
Today I finished reading “Living the 80/20 Way: Work Less, Worry Less, Succeed More, Enjoy More” by Richard Koch
Finest Bone China
I was sitting on the couch at my future mother-in-law’s house when, out of the blue, she said: “Another new laptop? You only just bought one. You should not go throwing your money away like that.”
I turned and pointed at the china cabinet over her shoulder. “Not once has anybody in this household ever been allowed to use or even lift one of those decorative plates.”
“Those bring me comfort!” she exclaimed. “And they’re for guests. What does the laptop get you?”
“About a half-million dollars a year,” I responded.
Read – More Joel on Software
Today I finished reading “More Joel on Software: Further Thoughts on Diverse and Occasionally Related Matters That Will Prove of Interest to Software Developers, Designers, and Managers, and to Those Who, Whether by Good Fortune or Ill Luck, Work with Them in Some Capacity” by Joel Spolsky
Studying – Creating art nouveau posters
This month I am studying “Creating art nouveau posters”
This month I am studying how to create posters in the art nouveau style in a class called, oddly enough, “Creating art nouveau posters.”
Every time I think of art nouveau I think “art deco.”
The two aren’t exactly related and are worlds apart in their aesthetic.
Art nouveau dates to the late 19th century, circa 1880, and arose in response to the industrial revolution and the societal upheaval it wrought.
Art deco came in the early 20th century, circa 1920, and was partly a response to the depression of the World War I era.
Art nouveau is the flowery one.
Art deco is the streamlined one.
But it is still fixed in my head that when I hear “art nouveau” my brain flits to the “art deco” lamps in our bedroom.
Either way, it will be interesting to study the drawing style and create some overly decorated posters in the appropriate style.
Paper – Facing the Facts
Today I read a paper titled “Facing the Facts”
The abstract is:
Human error research on overconfidence supports the benefits of early visibility of defects and disciplined development.
If risk to the enterprise is to be reduced, individuals need to become aware of the reality of the quality of their work.
Several cycles of inspection and defect removal are inevitable.
Software Quality Management measurements of defect density and removal efficiency are applicable.
Research on actual spreadsheet error rates shows data consistent with other software, depending on the extent to which the work product was reviewed before inspection.
The paper argues that the payback for an investment in early review time is justified by the saving in project delay and expensive errors in use.
‘If debugging is the process of removing bugs, then programming must be the process of putting them in’ – Anon.
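The defect removal efficiency measurement the abstract refers to is a simple ratio; here is a minimal sketch with made-up counts (the numbers below are purely illustrative, not from the paper):

```python
# Defect Removal Efficiency (DRE): the fraction of all known defects that were
# caught before release. The counts are hypothetical, purely for illustration.
found_in_review = 45        # defects caught during inspection/review cycles
found_after_release = 5     # defects that escaped to users

dre = found_in_review / (found_in_review + found_after_release)
print(f"Defect removal efficiency: {dre:.0%}")   # -> 90%
```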
Paper – Algorithms for Rapidly Dispersing Robot Swarms in Unknown Environments
Today I read a paper titled “Algorithms for Rapidly Dispersing Robot Swarms in Unknown Environments”
The abstract is:
We develop and analyze algorithms for dispersing a swarm of primitive robots in an unknown environment, R.
The primary objective is to minimize the makespan, that is, the time to fill the entire region.
An environment is composed of pixels that form a connected subset of the integer grid.
There is at most one robot per pixel and robots move horizontally or vertically at unit speed.
Robots enter R by means of k >= 1 door pixels. Robots are primitive finite automata, only having local communication, local sensors, and a constant-sized memory.
We first give algorithms for the single-door case (i.e., k=1), analyzing the algorithms both theoretically and experimentally.
We prove that our algorithms have optimal makespan 2A-1, where A is the area of R.
We next give an algorithm for the multi-door case (k>1), based on a wall-following version of the leader-follower strategy.
We prove that our strategy is O(log(k+1))-competitive, and that this bound is tight for our strategy and other related strategies.
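The setting is concrete enough to play with. Below is a toy simulation of the single-door case under a naive local rule (robots greedily step into a free neighbouring pixel that is farther from the door in BFS distance), just to see how long filling a region takes relative to its area. This is not the paper's algorithm, and it cheats by precomputing BFS distances, which the paper's constant-memory robots could not do.

```python
# Toy fill-the-region simulation (not the paper's algorithm): robots enter one
# at a time through a single door pixel and greedily step to a free neighbour
# that is farther (in BFS distance) from the door. We measure how many time
# steps it takes to occupy every pixel of the region.
from collections import deque

def bfs_distances(region, door):
    dist = {door: 0}
    queue = deque([door])
    while queue:
        x, y = queue.popleft()
        for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if n in region and n not in dist:
                dist[n] = dist[(x, y)] + 1
                queue.append(n)
    return dist

def disperse(region, door):
    dist = bfs_distances(region, door)
    occupied, robots, t = set(), [], 0
    while len(occupied) < len(region):
        t += 1
        for i, pos in enumerate(robots):             # robots move away from the door
            x, y = pos
            for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if n in region and n not in occupied and dist[n] > dist[pos]:
                    occupied.remove(pos)
                    occupied.add(n)
                    robots[i] = n
                    break
        if door not in occupied:                     # a new robot enters if the door is free
            occupied.add(door)
            robots.append(door)
    return t

region = {(x, y) for x in range(6) for y in range(6)}    # a 6x6 square room
print("area:", len(region), "makespan:", disperse(region, door=(0, 0)))
```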
Paper – Multiagent Approach for the Representation of Information in a Decision Support System
Today I read a paper titled “Multiagent Approach for the Representation of Information in a Decision Support System”
The abstract is:
In an emergency situation, the actors need assistance that allows them to react swiftly and efficiently.
With this in mind, we present in this paper a decision support system that aims to prepare actors for a crisis situation by providing decision-making support.
The global architecture of this system is presented in the first part.
Then we focus on a part of this system which is designed to represent the information of the current situation.
This part is composed of a multiagent system that is made of factual agents.
Each agent carries a semantic feature and aims to represent part of a situation.
The agents evolve through their interactions, comparing their semantic features using proximity measures and according to specific ontologies.
Read – Drive
Today I finished reading “Drive: The Surprising Truth About What Motivates Us” by Daniel Pink
Read – Infinity and the Mind
Today I finished reading “Infinity and the Mind: The Science and Philosophy of the Infinite” by Rudy Rucker
Read – Onward
Today I finished reading “Onward: How Starbucks Fought for Its Life without Losing Its Soul” by Howard Schultz
Listening – Deathconsciousness
This week I am listening to “Deathconsciousness” by Have A Nice Life
Paper – Wavelet and Curvelet Moments for Image Classification: Application to Aggregate Mixture Grading
Today I read a paper titled “Wavelet and Curvelet Moments for Image Classification: Application to Aggregate Mixture Grading”
The abstract is:
We show the potential for classifying images of mixtures of aggregate, based themselves on varying, albeit well-defined, sizes and shapes, in order to provide a far more effective approach compared to the classification of individual sizes and shapes.
While a dominant (additive, stationary) Gaussian noise component in image data will ensure that wavelet coefficients are of Gaussian distribution, long tailed distributions (symptomatic, for example, of extreme values) may well hold in practice for wavelet coefficients.
Energy (2nd order moment) has often been used for image characterization for image content-based retrieval, and higher order moments may be important also, not least for capturing long tailed distributional behavior.
In this work, we assess 2nd, 3rd and 4th order moments of multiresolution transform — wavelet and curvelet transform — coefficients as features.
As analysis methodology, taking account of image types, multiresolution transforms, and moments of coefficients in the scales or bands, we use correspondence analysis as well as k-nearest neighbors supervised classification.
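A rough sketch of the kind of feature vector the abstract describes: decompose the image with a wavelet transform (PyWavelets here) and keep the 2nd, 3rd, and 4th order central moments of the coefficients in each band. The wavelet, the number of levels, and the use of plain central moments are my own choices, not the paper's.

```python
# Minimal sketch: multiresolution (wavelet) decomposition of an image, with
# the 2nd, 3rd and 4th order central moments of each band's coefficients used
# as a feature vector. Wavelet 'db2' and 3 levels are arbitrary choices.
import numpy as np
import pywt

def band_moments(coeffs):
    c = np.asarray(coeffs, dtype=float).ravel()
    mu = c.mean()
    return [np.mean((c - mu) ** k) for k in (2, 3, 4)]   # variance, 3rd, 4th central moments

def wavelet_moment_features(image, wavelet="db2", levels=3):
    features = []
    decomposition = pywt.wavedec2(image, wavelet, level=levels)
    features += band_moments(decomposition[0])            # approximation band
    for detail_bands in decomposition[1:]:                # (horizontal, vertical, diagonal) per level
        for band in detail_bands:
            features += band_moments(band)
    return np.array(features)

image = np.random.rand(128, 128)                          # stand-in for an aggregate-mixture image
print(wavelet_moment_features(image).shape)               # 3 moments per band
```

These vectors would then feed something like the k-nearest-neighbours classifier mentioned in the abstract.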
Paper – Lattice Gas Cellular Automata for Computational Fluid Animation
Today I read a paper titled “Lattice Gas Cellular Automata for Computational Fluid Animation”
The abstract is:
The past two decades have seen rapid growth in physically-based modeling of fluids for computer graphics applications.
In this area, a common top-down approach is to model the fluid dynamics with the Navier-Stokes equations and apply a numerical technique such as Finite Differences or Finite Elements for the simulation.
In this paper we focus on fluid modeling through Lattice Gas Cellular Automata (LGCA) for computer graphics applications.
LGCA are discrete models based on point particles that move on a lattice, according to suitable and simple rules in order to mimic a fully molecular dynamics.
By Chapman-Enskog expansion, a known multiscale technique in this area, it can be demonstrated that the Navier-Stokes model can be reproduced by the LGCA technique.
Thus, with LGCA we get a fluid model that does not require solution of complicated equations.
Therefore, we combine the advantage of the low computational cost of LGCA and its ability to mimic the realistic fluid dynamics to develop a new animating framework for computer graphics applications.
In this work, we discuss the theoretical elements of our proposal and show experimental results.
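The simplest lattice gas in this family is the HPP model: boolean particles on a square grid with four velocities, a collide step and a stream step. The sketch below implements that; the paper's setting relies on richer lattices (FHP-style) to actually recover Navier-Stokes behaviour, so treat this purely as an illustration of the collide-and-stream idea.

```python
# A minimal HPP lattice-gas step: four velocity channels per node, head-on
# collisions rotate 90 degrees, then every particle streams one node.
import numpy as np

E, W, N, S = 0, 1, 2, 3          # four unit velocities on the square lattice

def hpp_step(cells):
    """cells: boolean array (4, H, W); at most one particle per direction per node."""
    e, w, n, s = cells
    # Collision: head-on pairs with the perpendicular channels empty rotate 90 degrees.
    ew = e & w & ~n & ~s
    ns = n & s & ~e & ~w
    e, w = (e ^ ew) | ns, (w ^ ew) | ns
    n, s = (n ^ ns) | ew, (s ^ ns) | ew
    # Streaming: each particle moves one node along its velocity (periodic boundaries).
    out = np.empty_like(cells)
    out[E] = np.roll(e, 1, axis=1)
    out[W] = np.roll(w, -1, axis=1)
    out[N] = np.roll(n, -1, axis=0)
    out[S] = np.roll(s, 1, axis=0)
    return out

rng = np.random.default_rng(0)
state = rng.random((4, 64, 64)) < 0.2       # random initial gas, 20% occupancy
for _ in range(100):
    state = hpp_step(state)
print("particle count (conserved):", int(state.sum()))
```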
Read – Einstein: The Life of a Genius
Today I finished reading “Einstein: The Life of a Genius” by Walter Isaacson
Read – Yotsuba #8
Today I finished reading “Yotsuba #8” by Kiyohiko Azuma
Listening – The Bedlam In Goliath
This week I am listening to “The Bedlam In Goliath” by The Mars Volta
Read – City at the End of Time
Today I finished reading “City at the End of Time” by Greg Bear
Studying – Foundations of drawing light and shadow
This month I am studying “Foundations of drawing light and shadow”
Read – Little Teal Book of Trust
Today I finished reading “Little Teal Book of Trust: How to Earn It, Grow It, and Keep It to Become a Trusted Advisor in Sales, Business and Life” by Jeffrey Gitomer
Listening – Un Dia
This week I am listening to “Un Dia” by Juana Molina
Paper – Covering selfish machines
Today I read a paper titled “Covering selfish machines”
The abstract is:
We consider the machine covering problem for selfish related machines.
For a constant number of machines, m, we show a monotone polynomial time approximation scheme (PTAS) with running time that is linear in the number of jobs.
It uses a new technique for reducing the number of jobs while remaining close to the optimal solution.
We also present an FPTAS for the classical machine covering problem (the previous best result was a PTAS) and use this to give a monotone FPTAS.
Additionally, we give a monotone approximation algorithm with approximation ratio min(m, (2 + ε)·s_1/s_m), where ε > 0 can be chosen arbitrarily small and s_i is the (real) speed of machine i.
Finally we give improved results for two machines.
Our paper presents the first results for this problem in the context of selfish machines.
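For intuition about the covering objective (maximise the finish time of the least loaded machine), here is a plain greedy heuristic on related machines: give the largest remaining job to the machine that currently finishes earliest. It is not the monotone PTAS or FPTAS from the paper, just a baseline that makes the objective concrete.

```python
# Greedy heuristic for machine covering on related machines: assign jobs in
# decreasing size to the machine with the earliest current completion time,
# then report the minimum completion time (the "cover" value).
import heapq

def greedy_cover(jobs, speeds):
    """Return sorted per-machine completion times; machine i runs at speeds[i]."""
    heap = [(0.0, i, 0.0) for i in range(len(speeds))]   # (finish time, machine, assigned work)
    heapq.heapify(heap)
    for size in sorted(jobs, reverse=True):
        finish, i, work = heapq.heappop(heap)
        work += size
        heapq.heappush(heap, (work / speeds[i], i, work))
    return sorted(time for time, _, _ in heap)

loads = greedy_cover(jobs=[9, 7, 6, 5, 4, 3, 2, 2, 1], speeds=[1.0, 1.0, 2.0])
print("cover value (minimum completion time):", loads[0])
```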
Paper – A Recommender System based on the Immune Network
Today I read a paper titled “A Recommender System based on the Immune Network”
The abstract is:
The immune system is a complex biological system with a highly distributed, adaptive and self-organising nature.
This paper presents an artificial immune system (AIS) that exploits some of these characteristics and is applied to the task of film recommendation by collaborative filtering (CF).
Natural evolution and in particular the immune system have not been designed for classical optimisation.
However, for this problem, we are not interested in finding a single optimum.
Rather we intend to identify a sub-set of good matches on which recommendations can be based.
It is our hypothesis that an AIS built on two central aspects of the biological immune system will be an ideal candidate to achieve this: antigen-antibody interaction for matching and antibody-antibody interaction for diversity.
Computational results are presented in support of this conjecture and compared to those found by other CF techniques.
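A very loose sketch of the two interactions named in the hypothesis, applied to user-based collaborative filtering: antigen-antibody affinity picks neighbours that match the target user, and antibody-antibody suppression keeps the neighbour pool diverse. The affinity measure, the suppression threshold, and the toy ratings are my own placeholders, not the paper's immune-network dynamics.

```python
# Antigen = target user's ratings; antibodies = candidate neighbour users.
# Matching uses a correlation affinity; diversity is enforced by skipping
# candidates that are too similar to neighbours already in the pool.
import numpy as np

def affinity(u, v):
    """Correlation over co-rated items (a rating of 0 means 'not rated')."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    a, b = u[mask], v[mask]
    if a.std() == 0 or b.std() == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def select_antibodies(target, users, pool_size=3, suppression=0.9):
    ranked = sorted(range(len(users)), key=lambda i: affinity(target, users[i]), reverse=True)
    pool = []
    for i in ranked:
        # antibody-antibody interaction: skip candidates too similar to the pool
        if all(affinity(users[i], users[j]) < suppression for j in pool):
            pool.append(i)
        if len(pool) == pool_size:
            break
    return pool

ratings = np.array([            # rows = users, columns = films, 0 = unrated
    [5, 4, 0, 1, 0],
    [4, 5, 1, 0, 2],
    [5, 5, 0, 1, 1],
    [1, 0, 5, 4, 5],
])
target = np.array([5, 0, 0, 1, 2])
print("neighbour pool:", select_antibodies(target, ratings))
```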
Paper – Uncovering Plagiarism Networks
Today I read a paper titled “Uncovering Plagiarism Networks”
The abstract is:
Plagiarism detection in educational programming assignments is still a problematic issue in terms of resource waste, ethical controversy, legal risks, and technical complexity.
This paper presents AC, a modular plagiarism detection system.
The design is portable across platforms and assignment formats and provides easy extraction into the internal assignment representation.
Multiple similarity measures have been incorporated, both existing and newly-developed.
Statistical analysis and several graphical visualizations aid in the interpretation of analysis results.
The system has been evaluated with a survey that encompasses several academic semesters of use at the authors’ institution.
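As a flavour of what one of the "multiple similarity measures" in such a system can look like, here is a generic token n-gram overlap between two submissions. I am not claiming this is one of the measures actually shipped with AC.

```python
# Jaccard overlap of token n-grams between two code submissions: a simple,
# order-sensitive-ish similarity that survives renamed identifiers poorly but
# catches copy-paste structure.
import re

def token_ngrams(source, n=4):
    tokens = re.findall(r"[A-Za-z_]\w*|\S", source)     # identifiers/keywords or single symbols
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(a, b, n=4):
    x, y = token_ngrams(a, n), token_ngrams(b, n)
    return len(x & y) / len(x | y) if (x or y) else 0.0

sub1 = "for (int i = 0; i < n; i++) total += data[i];"
sub2 = "for (int j = 0; j < n; j++) sum += values[j];"
print(f"similarity: {similarity(sub1, sub2):.2f}")
```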
Read – Conan #5: Rogues in the House and Other Stories
Today I finished reading “Conan #5: Rogues in the House and Other Stories” by Timothy Truman
Paper – Design of an Electro-Hydraulic System Using Neuro-Fuzzy Techniques
Today I read a paper titled “Design of an Electro-Hydraulic System Using Neuro-Fuzzy Techniques”
The abstract is:
Increasing demands in performance and quality make drive systems fundamental parts in the progressive automation of industrial processes.
Their conventional models become inappropriate and have limited scope if one requires precise and fast performance.
So, it is important to incorporate learning capabilities into drive systems in such a way that they improve their accuracy in real time, becoming more autonomous agents with some degree of intelligence.
To investigate this challenge, this chapter presents the development of a learning control system that uses neuro-fuzzy techniques in the design of a tracking controller for an experimental electro-hydraulic actuator.
We begin the chapter by presenting the neuro-fuzzy modeling process of the actuator.
This part surveys the learning algorithm, describes the laboratory system, and presents the modeling steps, such as the choice of representative actuator variables, the acquisition of training and testing data sets, and the acquisition of the neuro-fuzzy inverse model of the actuator.
In the second part of the chapter, we use the extracted neuro-fuzzy model and its learning capabilities to design the actuator position controller based on the feedback-error-learning technique.
Through a set of experimental results, we show the generalization properties of the controller, its ability to update the initial neuro-fuzzy inverse model in real time, and its compensating action, which improves the electro-hydraulic tracking performance.
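The feedback-error-learning scheme the abstract mentions is easy to demonstrate on a toy plant: the feedforward term is a learned model driven by the desired trajectory, and the feedback controller's own output serves as the training error for it. The sketch below uses an invented first-order linear plant and a linear feedforward model, not the paper's neuro-fuzzy inverse model of a real electro-hydraulic actuator.

```python
# Generic feedback-error-learning sketch on a toy first-order plant.
# As the feedforward model learns, the feedback controller has less to do.
import numpy as np

dt, a, b = 0.01, -2.0, 1.5            # toy plant: x' = a*x + b*u
kp, eta = 8.0, 0.002                  # feedback gain, learning rate
theta = np.zeros(3)                   # weights of the linear feedforward model

def features(x_des, x_des_dot):
    return np.array([x_des, x_des_dot, 1.0])

x, log_fb = 0.0, []
for k in range(20000):
    t = k * dt
    x_des, x_des_dot = np.sin(t), np.cos(t)            # desired trajectory
    u_fb = kp * (x_des - x)                             # feedback controller
    u_ff = theta @ features(x_des, x_des_dot)           # learned feedforward
    u = u_ff + u_fb
    theta += eta * u_fb * features(x_des, x_des_dot)    # feedback-error learning update
    x += dt * (a * x + b * u)                           # simulate the plant
    log_fb.append(abs(u_fb))

print("mean |feedback effort| early:", round(float(np.mean(log_fb[:2000])), 3),
      "late:", round(float(np.mean(log_fb[-2000:])), 3))
```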
Listening – Alas, I Cannot Swim
This week I am listening to “Alas, I Cannot Swim” by Laura Marling
Read – Regular Expressions Cookbook
Today I finished reading “Regular Expressions Cookbook” by Jan Goyvaerts
Paper – Camera motion estimation through planar deformation determination
Today I read a paper titled “Camera motion estimation through planar deformation determination”
The abstract is:
In this paper, we propose a global method for estimating the motion of a camera which films a static scene.
Our approach is direct, fast and robust, and deals with adjacent frames of a sequence.
It is based on a quadratic approximation of the deformation between two images, in the case of a scene with constant depth in the camera coordinate system.
This condition is very restrictive, but we show that provided the translation and the variations of inverse depth are small enough, the error in optical flow caused by approximating the depths by a constant is small.
In this context, we propose a new model of camera motion that allows the image deformation to be separated into a similarity and a “purely” projective component, due to the change of optical axis direction.
This model leads to a quadratic approximation of the image deformation, which we estimate with an M-estimator; from it we can immediately deduce the camera motion parameters.
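The estimation step amounts to a robust fit of a parametric deformation to matched points. The sketch below fits only the similarity part (scale, rotation, translation) with iteratively reweighted least squares and a Huber weight; the paper's model additionally carries the quadratic, "purely" projective terms, so this is a simplified stand-in rather than the authors' method.

```python
# Robust (M-estimator) fit of a 2D similarity transform to point matches,
# via iteratively reweighted least squares with Huber weights.
import numpy as np

def fit_similarity_irls(src, dst, iters=10, delta=1.0):
    """src, dst: (N, 2) matched points. Model: dst ~ [[a, -b], [b, a]] @ src + [tx, ty]."""
    n = len(src)
    A = np.zeros((2 * n, 4))
    A[0::2] = np.c_[src[:, 0], -src[:, 1], np.ones(n), np.zeros(n)]
    A[1::2] = np.c_[src[:, 1],  src[:, 0], np.zeros(n), np.ones(n)]
    y = dst.reshape(-1)
    w = np.ones(2 * n)
    params = np.zeros(4)
    for _ in range(iters):
        sw = np.sqrt(w)
        params, *_ = np.linalg.lstsq(A * sw[:, None], sw * y, rcond=None)
        abs_r = np.abs(y - A @ params)
        w = np.where(abs_r <= delta, 1.0, delta / np.maximum(abs_r, 1e-12))   # Huber weights
    return params                                                             # a, b, tx, ty

rng = np.random.default_rng(1)
src = rng.uniform(0, 100, (50, 2))
dst = np.c_[0.98 * src[:, 0] - 0.05 * src[:, 1] + 3.0,       # known slight rotation + shift
            0.05 * src[:, 0] + 0.98 * src[:, 1] - 2.0]
dst[:5] += rng.normal(0, 20, (5, 2))                          # a few gross outlier matches
print("estimated (a, b, tx, ty):", np.round(fit_similarity_irls(src, dst), 3))
```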
Read – The Drunkard’s Walk
Today I finished reading “The Drunkard’s Walk: How Randomness Rules Our Lives” by Leonard Mlodinow
Read – Cracking the Coding Interview
Today I finished reading “Cracking the Coding Interview: 150 Programming Questions and Solutions” by Gayle Laakmann McDowell
Listening – HAARP
This week I am listening to “HAARP” by Muse
Read – Usagi Yojimbo #22: Tomoe’s Story
Today I finished reading “Usagi Yojimbo #22: Tomoe’s Story” by Stan Sakai
Read – Discrete Mathematics Demystified
Today I finished reading “Discrete Mathematics Demystified” by Steven Krantz
Read – Tribes
Today I finished reading “Tribes: We Need You to Lead Us” by Seth Godin
Paper – Qualitative Study of a Robot Arm as a Hamiltonian System
Today I read a paper titled “Qualitative Study of a Robot Arm as a Hamiltonian System”
The abstract is:
A double pendulum subject to external torques is used as a model to study the stability of a planar manipulator with two links and two rotational driven joints.
The Hamiltonian equations of motion and the fixed points (stationary solutions) in phase space are determined.
Under suitable conditions, the presence of constant torques does not change the number of fixed points, and preserves the topology of orbits in their linear neighborhoods; two equivalent invariant manifolds are observed, each corresponding to a saddle-center fixed point.
Listening – 19
This week I am listening to “19” by Adele
Read – Speak to Win
Today I finished reading “Speak to Win: How to Present with Power in Any Situation” by Brian Tracy
Paper – Agent-Based Perception of an Environment in an Emergency Situation
Today I read a paper titled “Agent-Based Perception of an Environment in an Emergency Situation”
The abstract is:
We are interested in the problem of developing multiagent systems for risk detection and emergency response in an uncertain and partially perceived environment.
The evaluation of the current situation goes through three stages inside the multiagent system.
First, the situation is represented in a dynamic way.
Second, the situation is characterised, and finally it is compared with other similar known situations.
In this paper, we present an information model of an observed environment, which we have applied to the RoboCupRescue Simulation System.
Information coming from the environment is formatted according to a taxonomy and using semantic features.
The latter are defined using a fine-grained ontology of the domain and are managed by factual agents that aim to represent the current situation dynamically.
It is part of our outrageous culture
Have we reached peak outrage yet?
Paper – Bin Packing Under Multiple Objectives – a Heuristic Approximation Approach
Today I read a paper titled “Bin Packing Under Multiple Objectives – a Heuristic Approximation Approach”
The abstract is:
The article proposes a heuristic approximation approach to the bin packing problem under multiple objectives.
In addition to the traditional objective of minimizing the number of bins, the heterogeneousness of the elements in each bin is minimized, leading to a biobjective formulation of the problem with a tradeoff between the number of bins and their heterogeneousness.
An extension of the Best-Fit approximation algorithm is presented to solve the problem.
Experimental investigations have been carried out on benchmark instances of different size, ranging from 100 to 1000 items.
Encouraging results have been obtained, showing the applicability of the heuristic approach to the described problem.
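A toy version of the trade-off: a Best-Fit style rule that scores each candidate bin on both leftover capacity and how much the new item would increase the bin's mix of element types, with a weight to slide between the two objectives. The scoring rule and weights are mine, not the paper's extension of Best-Fit.

```python
# Best-Fit style heuristic with a biobjective score: leftover capacity vs.
# added heterogeneousness (distinct element types already in the bin).
def pack(items, capacity=10, alpha=0.5):
    """items: list of (size, type); alpha trades capacity fit against homogeneity."""
    bins = []                                        # each bin: {"load": float, "types": set}
    for size, kind in sorted(items, reverse=True):
        best, best_score = None, None
        for b in bins:
            if b["load"] + size > capacity:
                continue
            leftover = capacity - b["load"] - size
            mixing = 0 if kind in b["types"] else len(b["types"])
            score = (1 - alpha) * leftover + alpha * mixing
            if best is None or score < best_score:
                best, best_score = b, score
        if best is None:                             # open a new bin if nothing fits
            best = {"load": 0.0, "types": set()}
            bins.append(best)
        best["load"] += size
        best["types"].add(kind)
    return bins

items = [(6, "A"), (5, "B"), (4, "A"), (4, "B"), (3, "A"), (3, "B"), (2, "A")]
result = pack(items)
print(len(result), "bins:", [(b["load"], sorted(b["types"])) for b in result])
```

Sweeping alpha from 0 to 1 traces out the trade-off between using fewer bins and keeping each bin homogeneous.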
Read – Biochemistry Demystified
Today I finished reading “Biochemistry Demystified” by Sharon Walker
Read – Rich Dad’s Increase Your Financial IQ
Today I finished reading “Rich Dad’s Increase Your Financial IQ: Get Smarter with Your Money” by Robert T. Kiyosaki
Listening – Shallow Grave
This week I am listening to “Shallow Grave” by The Tallest Man On Earth
Read – Outliers
Today I finished reading “Outliers: The Story of Success” by Malcolm Gladwell
Read – Complex Variables Demystified
Today I finished reading “Complex Variables Demystified” by David McMahon
Read – Too Big to Fail
Today I finished reading “Too Big to Fail: The Inside Story of How Wall Street and Washington Fought to Save the Financial System from Crisis and Themselves” by Andrew Ross Sorkin
Paper – Predicting the Path of an Open System
Today I read a paper titled “Predicting the Path of an Open System”
The abstract is:
The expected path of an open system, which is a big Poincare system, has been found in this paper. This path has been obtained from the actual and from the expected droop of the open system. The actual droop has been reconstructed from the variations in the power and in the frequency of the open system. The expected droop has been found as a function of rotation from the expected potential energy of the open system under synchronization of that system.
Studying – Creating icons with Photoshop
This month I am studying “Creating icons with Photoshop”
Listening – Insurgentes
This week I am listening to “Insurgentes” by Steven Wilson