This week I am listening to “The Whirlwind” by Transatlantic
Archives for 2010
Read – Earthlight & Other Stories
Today I finished reading “Earthlight & Other Stories” by Arthur C. Clarke
Paper – On Affinity Measures for Artificial Immune System Movie Recommenders
Today I read a paper titled “On Affinity Measures for Artificial Immune System Movie Recommenders”
The abstract is:
We combine Artificial Immune Systems (AIS) technology with Collaborative Filtering (CF) and use it to build a movie recommendation system.
We already know that Artificial Immune Systems work well as movie recommenders from previous work by Cayzer and Aickelin [3, 4, 5].
Here our aim is to investigate the effect of different affinity measure algorithms for the AIS.
Two different affinity measures, Kendall's Tau and Weighted Kappa, are used to calculate the correlation coefficients for the movie recommender.
We compare the results with those published previously and show that Weighted Kappa is more suitable than the other measures for movie problems.
We also show that AIS are generally robust movie recommenders and that, as long as a suitable affinity measure is chosen, results are good.
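As a quick illustration of the affinity-measure idea (my own sketch, not the authors' code), Kendall's Tau between two users' ratings of the same movies can be computed by counting concordant and discordant rating pairs:

```python
from itertools import combinations

def kendall_tau(ratings_a, ratings_b):
    """Kendall's Tau rank correlation between two users' ratings of
    the same movies (a minimal sketch; tied pairs are simply skipped)."""
    concordant = discordant = 0
    for i, j in combinations(range(len(ratings_a)), 2):
        da = ratings_a[i] - ratings_a[j]
        db = ratings_b[i] - ratings_b[j]
        if da * db > 0:      # both users rank the pair the same way
            concordant += 1
        elif da * db < 0:    # the users rank the pair oppositely
            discordant += 1
    pairs = concordant + discordant
    return (concordant - discordant) / pairs if pairs else 0.0

# Identical rankings give tau = 1.0; exactly reversed rankings give -1.0.
print(kendall_tau([5, 4, 3, 2, 1], [5, 4, 3, 2, 1]))  # -> 1.0
print(kendall_tau([1, 2, 3], [3, 2, 1]))              # -> -1.0
```

In the paper this correlation plays the role of antibody-antigen affinity: users whose tau with the active user is high contribute more to the recommendation.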
Listening – Daisy
This week I am listening to “Daisy” by Brand New
Paper – Camera distortion self-calibration using the plumb-line constraint and minimal Hough entropy
Today I read a paper titled “Camera distortion self-calibration using the plumb-line constraint and minimal Hough entropy”
The abstract is:
In this paper we present a simple and robust method for self-correction of camera distortion using single images of scenes which contain straight lines.
Since the most common distortion can be modelled as radial distortion, we illustrate the method using the Harris radial distortion model, but the method is applicable to any distortion model.
The method is based on transforming the edgels of the distorted image to a 1-D angular Hough space, and optimizing the distortion correction parameters which minimize the entropy of the corresponding normalized histogram.
Properly corrected imagery will have fewer curved lines, and therefore less spread in Hough space.
Since the method does not rely on any image structure beyond the existence of edgels sharing some common orientations and does not use edge fitting, it is applicable to a wide variety of image types.
For instance, it can be applied equally well to images of texture with weak but dominant orientations, or images with strong vanishing points.
Finally, the method is performed on both synthetic and real data revealing that it is particularly robust to noise.
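The entropy criterion at the heart of the method is easy to sketch (my own illustration, not the paper's code): bin edgel orientations into a histogram, normalize it, and measure its entropy. A well-corrected image concentrates edgels into few orientation bins, so entropy drops:

```python
import math

def orientation_entropy(angles_deg, bins=36):
    """Entropy of the normalized histogram of edgel orientations.
    Straighter lines occupy fewer bins, giving lower entropy, which
    is the quantity the distortion parameters are optimized against."""
    hist = [0] * bins
    for a in angles_deg:
        hist[int(a % 180 / (180 / bins)) % bins] += 1
    total = sum(hist)
    probs = [h / total for h in hist if h > 0]
    return max(0.0, -sum(p * math.log2(p) for p in probs))

# Edgels sharing one orientation (an ideal straight line) give zero
# entropy; orientations spread by distortion give a higher value.
print(orientation_entropy([45.0] * 100))                     # -> 0.0
print(orientation_entropy([i * 1.7 % 180 for i in range(100)]))
```

A distortion-correction parameter search would re-warp the edgels for each candidate parameter value and keep the one minimizing this entropy.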
Read – How to Feel Confident
Today I finished reading “How to Feel Confident: Simple Tools for Instant Success” by Leil Lowndes
Paper – Personalizing Image Search Results on Flickr
Today I read a paper titled “Personalizing Image Search Results on Flickr”
The abstract is:
The social media site Flickr allows users to upload their photos, annotate them with tags, submit them to groups, and also to form social networks by adding other users as contacts.
Flickr offers multiple ways of browsing or searching it.
One option is tag search, which returns all images tagged with a specific keyword.
If the keyword is ambiguous, e.g., “beetle” could mean an insect or a car, tag search results will include many images that are not relevant to the sense the user had in mind when executing the query.
We claim that users express their photography interests through the metadata they add in the form of contacts and image annotations.
We show how to exploit this metadata to personalize search results for the user, thereby improving search performance.
First, we show that we can significantly improve search precision by filtering tag search results by the user’s contacts or a larger social network that includes those contacts’ contacts.
Secondly, we describe a probabilistic model that takes advantage of tag information to discover latent topics contained in the search results.
The users’ interests can similarly be described by the tags they used for annotating their images.
The latent topics found by the model are then used to personalize search results by finding images on topics that are of interest to the user.
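The first, contact-based filtering step is simple enough to sketch (field names here are illustrative, not Flickr's actual API): keep only results owned by the user's contacts or by contacts of those contacts.

```python
def personalized_search(results, user_contacts, contact_graph):
    """Filter tag-search results to images owned by someone in the
    user's extended social network (contacts plus their contacts).
    A sketch of the paper's first filtering idea."""
    extended = set(user_contacts)
    for c in user_contacts:
        extended.update(contact_graph.get(c, []))
    return [img for img in results if img["owner"] in extended]

results = [{"id": 1, "owner": "ann"}, {"id": 2, "owner": "zoe"},
           {"id": 3, "owner": "bob"}]
contacts = {"ann"}
graph = {"ann": ["bob"]}  # bob is a contact's contact
ids = [img["id"] for img in personalized_search(results, contacts, graph)]
print(ids)  # -> [1, 3]
```

The probabilistic topic model in the second part of the paper then re-ranks within this filtered set by the user's inferred interests.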
Listening – Veckatimest
This week I am listening to “Veckatimest” by Grizzly Bear
Paper – The Semiotic Machine
Today I read a paper titled “The Semiotic Machine”
The abstract is:
A semiotic model of the user interface in human-computer interaction.
Paper – Player co-modelling in a strategy board game: discovering how to play fast
Today I read a paper titled “Player co-modelling in a strategy board game: discovering how to play fast”
The abstract is:
In this paper we experiment with a 2-player strategy board game where playing models are evolved using reinforcement learning and neural networks.
The models are evolved to speed up automatic game development, drawing on human involvement at varying levels of sophistication and density, and are compared against fully autonomous play.
The experimental results suggest a clear and measurable association between the ability to win games and the ability to do that fast, while at the same time demonstrating that there is a minimum level of human involvement beyond which no learning really occurs.
Read – Perfect Phrases for the Sales Call
Today I finished reading “Perfect Phrases for the Sales Call” by William T. Brooks
Studying – Recolorizing photographs with Photoshop
This month I am studying “Recolorizing photographs with Photoshop”
Listening – Primary Colours
This week I am listening to “Primary Colours” by The Horrors
Paper – A Library-Based Synthesis Methodology for Reversible Logic
Today I read a paper titled “A Library-Based Synthesis Methodology for Reversible Logic”
The abstract is:
In this paper, a library-based synthesis methodology for reversible circuits is proposed where a reversible specification is considered as a permutation comprising a set of cycles.
To this end, a pre-synthesis optimization step is introduced to construct a reversible specification from an irreversible function.
In addition, a cycle-based representation model is presented to be used as an intermediate format in the proposed synthesis methodology.
The selected intermediate format serves as a focal point for all potential representation models.
In order to synthesize a given function, a library containing seven building blocks is used where each building block is a cycle of length less than 6.
To synthesize large cycles, we also propose a decomposition algorithm which produces all possible minimal and inequivalent factorizations for a given cycle of length greater than 5.
All decompositions contain the maximum number of disjoint cycles.
The generated decompositions are used in conjunction with a novel cycle assignment algorithm which is proposed based on the graph matching problem to select the best possible cycle pairs.
Then, each pair is synthesized by using the available components of the library.
The decomposition algorithm together with the cycle assignment method are considered as a binding method which selects a building block from the library for each cycle.
Finally, a post-synthesis optimization step is introduced to optimize the synthesis results in terms of different costs.
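The cycle representation the paper builds on is standard permutation-group machinery; a minimal sketch of decomposing a reversible specification (a permutation of input states) into disjoint cycles looks like this (my own illustration):

```python
def cycles(perm):
    """Decompose a permutation, given as a list where perm[i] is the
    image of state i, into its disjoint cycles. Fixed points are
    omitted, matching the cycle-based intermediate format idea."""
    seen = [False] * len(perm)
    out = []
    for start in range(len(perm)):
        if seen[start] or perm[start] == start:
            seen[start] = True
            continue
        cycle, i = [], start
        while not seen[i]:
            seen[i] = True
            cycle.append(i)
            i = perm[i]
        out.append(tuple(cycle))
    return out

# A specification that swaps states 0<->1 and rotates 2->3->4->2
print(cycles([1, 0, 3, 4, 2]))  # -> [(0, 1), (2, 3, 4)]
```

The paper's library then covers cycles of length less than 6 directly and factorizes longer cycles before binding them to building blocks.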
Paper – Performance Bounds for Lambda Policy Iteration and Application to the Game of Tetris
Today I read a paper titled “Performance Bounds for Lambda Policy Iteration and Application to the Game of Tetris”
The abstract is:
We consider the discrete-time infinite-horizon optimal control problem formalized by Markov Decision Processes.
We revisit the work of Bertsekas and Ioffe, that introduced $\lambda$ Policy Iteration, a family of algorithms parameterized by $\lambda$ that generalizes the standard algorithms Value Iteration and Policy Iteration, and has some deep connections with the Temporal Differences algorithm TD($\lambda$) described by Sutton and Barto.
We deepen the original theory developed by the authors by providing convergence rate bounds which generalize standard bounds for Value Iteration described for instance by Puterman.
Then, the main contribution of this paper is to develop the theory of this algorithm when it is used in an approximate form and show that this is sound.
Doing so, we extend and unify the separate analyses developed by Munos for Approximate Value Iteration and Approximate Policy Iteration.
Eventually, we revisit the use of this algorithm in the training of a Tetris playing controller as originally done by Bertsekas and Ioffe.
We provide an original performance bound that can be applied to such an undiscounted control problem.
Our empirical results are different from those of Bertsekas and Ioffe (which were originally qualified as “paradoxical” and “intriguing”), and conform much more to what one would expect from a learning experiment.
We discuss the possible reason for such a difference.
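For context, $\lambda$ Policy Iteration interpolates between the two standard algorithms; here is a minimal sketch of the Value Iteration end of that spectrum on a toy two-state MDP (the MDP is mine, purely illustrative):

```python
def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Standard Value Iteration (the lambda = 0 end of lambda Policy
    Iteration). P[s][a] is a list of (prob, next_state) pairs and
    R[s][a] the immediate reward; all numbers are illustrative."""
    V = [0.0] * len(P)
    while True:
        V_new = [max(R[s][a] + gamma * sum(p * V[s2] for p, s2 in P[s][a])
                     for a in range(len(P[s])))
                 for s in range(len(P))]
        if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
            return V_new
        V = V_new

# Two states, two actions each: "stay" (reward 0) or "move" to the
# other state (reward 1 when leaving state 0, reward 0 otherwise).
P = [[[(1.0, 0)], [(1.0, 1)]], [[(1.0, 1)], [(1.0, 0)]]]
R = [[0.0, 1.0], [0.0, 0.0]]
V = value_iteration(P, R)
print([round(v, 3) for v in V])  # -> [5.263, 4.737]
```

Policy Iteration would instead fully evaluate each greedy policy before improving it; $\lambda$ trades off between these two extremes.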
Read – Perfect Phrases for Communicating Change
Today I finished reading “Perfect Phrases for Communicating Change” by Lawrence Polsky
Read – I Shall Wear Midnight
Today I finished reading “I Shall Wear Midnight” by Terry Pratchett
Paper – Wavelet Based Iterative Learning Control with Fuzzy PD Feedback for Position Tracking of A Pneumatic Servo System
Today I read a paper titled “Wavelet Based Iterative Learning Control with Fuzzy PD Feedback for Position Tracking of A Pneumatic Servo System”
The abstract is:
In this paper, a wavelet-based iterative learning control (WILC) scheme with Fuzzy PD feedback is presented for a pneumatic control system with nonsmooth nonlinearities and uncertain parameters.
The wavelet transform is employed to extract the learnable dynamics from measured output signal before it can be used to update the control profile.
The wavelet transform is adopted to decompose the original signal into many low-resolution signals that contain the learnable and unlearnable parts.
The desired control profile is then compared with the learnable part of the transformed signal.
Thus, the effects from unlearnable dynamics on the controlled system can be attenuated by a Fuzzy PD feedback controller.
As for the rules of Fuzzy PD controller in the feedback loop, a genetic algorithm (GA) is employed to search for the inference rules of optimization.
A proportional-valve controlled pneumatic cylinder actuator system is used as the control target for simulation.
Simulation results have shown a much-improved position-tracking performance.
Listening – Cosmic Egg
This week I am listening to “Cosmic Egg” by Wolfmother
Paper – Laser Actuated Presentation System
Today I read a paper titled “Laser Actuated Presentation System”
The abstract is:
We present here a pattern-sensitive PowerPoint presentation scheme.
The presentation is actuated by simple patterns drawn on the presentation screen by a laser pointer.
A specific pattern corresponds to a particular command required to operate the presentation.
The laser spot on the screen is captured by an RGB webcam with a red filter mounted, and its location is identified in the blue layer of each captured frame by estimating the mean position of the pixels whose intensity is above a given threshold value.
Measured Reliability, Accuracy and Latency of our system are 90%, 10 pixels (in the worst case) and 38 ms respectively.
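The localization step is pleasantly simple; a sketch of the threshold-and-mean idea on a single channel (my own toy reconstruction, not the authors' code):

```python
def spot_position(blue_channel, threshold=200):
    """Mean position of pixels whose intensity exceeds a threshold,
    mirroring the paper's spot-localization step. blue_channel is a
    2-D list of per-pixel intensities."""
    xs, ys = [], []
    for y, row in enumerate(blue_channel):
        for x, v in enumerate(row):
            if v > threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # no laser spot visible in this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

frame = [[0] * 5 for _ in range(5)]
frame[2][1] = frame[2][3] = 255  # two bright pixels centred on (2, 2)
print(spot_position(frame))  # -> (2.0, 2.0)
```

Tracking the sequence of spot positions over frames then yields the drawn pattern to match against commands.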
Paper – Fitness landscape of the cellular automata majority problem: View from the Olympus
Today I read a paper titled “Fitness landscape of the cellular automata majority problem: View from the Olympus”
The abstract is:
In this paper we study cellular automata (CAs) that perform the computational Majority task.
This task is a good example of what the phenomenon of emergence in complex systems is.
We take an interest in the reasons that make this particular fitness landscape a difficult one.
The first goal is to study the landscape as such, and thus it is ideally independent from the actual heuristics used to search the space.
However, a second goal is to understand the features a good search technique for this particular problem space should possess.
We statistically quantify in various ways the degree of difficulty of searching this landscape.
Due to neutrality, investigations based on sampling techniques on the whole landscape are difficult to conduct.
So, we go exploring the landscape from the top.
Although it has been proved that no CA can perform the task perfectly, several efficient CAs for this task have been found.
Exploiting similarities between these CAs and symmetries in the landscape, we define the Olympus landscape which is regarded as the “heavenly home” of the best local optima known (blok).
Then we measure several properties of this subspace.
Although it is easier to find relevant CAs in this subspace than in the overall landscape, there are structural reasons that prevent a searcher from finding overfitted CAs in the Olympus.
Finally, we study dynamics and performance of genetic algorithms on the Olympus in order to confirm our analysis and to find efficient CAs for the Majority problem with low computational cost.
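To make the task concrete (my own sketch): each cell of a binary ring CA must settle on the global majority bit using only local information. A naive radius-3 local-majority rule works on easy inputs but is far from a perfect solver, which is exactly why evolved rules are interesting:

```python
def majority_step(cells, radius=3):
    """One synchronous step of a radius-3 binary CA on a ring under a
    naive local-majority rule (illustrative; not one of the evolved
    rules studied in the paper)."""
    n = len(cells)
    return [int(sum(cells[(i + d) % n] for d in range(-radius, radius + 1))
                > radius) for i in range(n)]

def run(cells, steps=20):
    for _ in range(steps):
        cells = majority_step(cells)
    return cells

# 8 ones vs 3 zeros: every radius-3 window already holds a majority of
# ones, so the ring converges to all ones, the correct answer here.
print(run([1, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0]))
```

The fitness of a candidate rule is the fraction of random initial configurations it classifies correctly, and it is that landscape the paper dissects.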
Listening – Logos
This week I am listening to “Logos” by Atlas Sound
Read – Usagi Yojimbo: Yokai
Today I finished reading “Usagi Yojimbo: Yokai” by Stan Sakai
Paper – Finding Cliques of a Graph using Prime Numbers
Today I read a paper titled “Finding Cliques of a Graph using Prime Numbers”
The abstract is:
This paper proposes a new algorithm for solving maximal cliques for simple undirected graphs using the theory of prime numbers.
A novel approach using prime numbers is used to find cliques and ends with a discussion of the algorithm.
Paper – Algorithms for Image Analysis and Combination of Pattern Classifiers with Application to Medical Diagnosis
Today I read a paper titled “Algorithms for Image Analysis and Combination of Pattern Classifiers with Application to Medical Diagnosis”
The abstract is:
Medical Informatics and the application of modern signal processing in the assistance of the diagnostic process in medical imaging is one of the more recent and active research areas today.
This thesis addresses a variety of issues related to the general problem of medical image analysis, specifically in mammography, and presents a series of algorithms and design approaches for all the intermediate levels of a modern system for computer-aided diagnosis (CAD).
The diagnostic problem is analyzed with a systematic approach, first defining the imaging characteristics and features that are relevant to probable pathology in mammograms.
Next, these features are quantified and fused into new, integrated radiological systems that exhibit embedded digital signal processing, in order to improve the final result and minimize the radiological dose for the patient.
At a higher level, special algorithms are designed for detecting and encoding these clinically interesting imaging features, in order to be used as input to advanced pattern classifiers and machine learning models.
Finally, these approaches are extended to multi-classifier models under the scope of Game Theory and optimal collective decision, in order to produce efficient solutions for combining classifiers with minimum computational cost for advanced diagnostic systems.
The material covered in this thesis is related to a total of 18 published papers, 6 in scientific journals and 12 in international conferences.
Listening – Horehound
This week I am listening to “Horehound” by The Dead Weather
Read – The Feynman Lectures on Physics Vol 7
Today I finished reading “The Feynman Lectures on Physics Vol 7” by Richard Feynman
Studying – Autodesk Maya scripting with MEL
This month I am studying “Autodesk Maya scripting with MEL”
Listening – To Lose My Life
This week I am listening to “To Lose My Life” by White Lies
Paper – Region-based active contour with noise and shape priors
Today I read a paper titled “Region-based active contour with noise and shape priors”
The abstract is:
In this paper, we propose to combine formally noise and shape priors in region-based active contours.
On the one hand, we use the general framework of exponential family as a prior model for noise.
On the other hand, translation and scale invariant Legendre moments are considered to incorporate the shape prior (e.g., fidelity to a reference shape).
The combination of the two prior terms in the active contour functional yields the final evolution equation whose evolution speed is rigorously derived using shape derivative tools.
Experimental results on both synthetic images and real life cardiac echography data clearly demonstrate the robustness to initialization and noise, flexibility and large potential applicability of our segmentation algorithm.
Paper – Generalized Kernel-based Visual Tracking
Today I read a paper titled “Generalized Kernel-based Visual Tracking”
The abstract is:
In this work we generalize plain mean shift (MS) trackers and attempt to overcome two limitations of standard mean shift trackers.
It is well known that modeling and maintaining a representation of a target object is an important component of a successful visual tracker.
However, little work has been done on building a robust template model for kernel-based MS tracking.
In contrast to building a template from a single frame, we train a robust object representation model from a large amount of data.
Tracking is viewed as a binary classification problem, and a discriminative classification rule is learned to distinguish between the object and background.
We adopt a support vector machine (SVM) for training.
The tracker is then implemented by maximizing the classification score.
An iterative optimization scheme very similar to MS is derived for this purpose.
Read – The Feynman Lectures on Physics Vol 12
Today I finished reading “The Feynman Lectures on Physics Vol 12” by Richard Feynman
Read – The Adventure of Charles Augustus Milverton
Today I finished reading “The Adventure of Charles Augustus Milverton” by Arthur Conan Doyle
Listening – The Hazards Of Love
This week I am listening to “The Hazards Of Love” by The Decemberists
Read – The Law of Success, Volume II: Principles of Personal Power
Today I finished reading “The Law of Success, Volume II: Principles of Personal Power” by Napoleon Hill
Read – The Accidental Time Machine
Today I finished reading “The Accidental Time Machine” by Joe Haldeman
Paper – Variations of the Turing Test in the Age of Internet and Virtual Reality
Today I read a paper titled “Variations of the Turing Test in the Age of Internet and Virtual Reality”
The abstract is:
Inspired by Hofstadter’s Coffee-House Conversation (1982) and by the science fiction short story SAM by Schattschneider (1988), we propose and discuss criteria for non-mechanical intelligence.
Firstly, we emphasize the practical need for such tests in view of massively multiuser online role-playing games (MMORPGs) and virtual reality systems like Second Life.
Secondly, we demonstrate Second Life as a useful framework for implementing (some iterations of) that test.
Listening – Tonight: Franz Ferdinand
This week I am listening to “Tonight: Franz Ferdinand” by Franz Ferdinand
Paper – Perfect Hashing for Data Management Applications
Today I read a paper titled “Perfect Hashing for Data Management Applications”
The abstract is:
Perfect hash functions can potentially be used to compress data in connection with a variety of data management tasks.
Though there has been considerable work on how to construct good perfect hash functions, there is a gap between theory and practice among all previous methods on minimal perfect hashing.
On one side, there are good theoretical results without experimentally proven practicality for large key sets.
On the other side, there are algorithms with theoretically analyzed time and space usage that assume truly random hash functions are available for free, which is an unrealistic assumption.
In this paper we attempt to bridge this gap between theory and practice, using a number of techniques from the literature to obtain a novel scheme that is theoretically well-understood and at the same time achieves an order-of-magnitude increase in performance compared to previous “practical” methods.
This improvement comes from a combination of a novel, theoretically optimal perfect hashing scheme that greatly simplifies previous methods, and the fact that our algorithm is designed to make good use of the memory hierarchy.
We demonstrate the scalability of our algorithm by considering a set of over one billion URLs from the World Wide Web of average length 64, for which we construct a minimal perfect hash function on a commodity PC in a little more than 1 hour.
Our scheme produces minimal perfect hash functions using slightly more than 3 bits per key.
For perfect hash functions in the range $\{0,\ldots,2n-1\}$ the space usage drops to just over 2 bits per key (i.e., one bit more than optimal for representing the key).
This is significantly below what has been achieved previously for very large values of $n$.
Listening – Wilco (The Album)
This week I am listening to “Wilco (The Album)” by Wilco
Read – Coders at Work
Today I finished reading “Coders at Work: Reflections on the Craft of Programming” by Peter Seibel
Paper – Integration of navigation and action selection functionalities in a computational model of cortico-basal ganglia-thalamo-cortical loops
Today I read a paper titled “Integration of navigation and action selection functionalities in a computational model of cortico-basal ganglia-thalamo-cortical loops”
The abstract is:
This article describes a biomimetic control architecture affording an animat both action selection and navigation functionalities.
It satisfies the survival constraint of an artificial metabolism and supports several complementary navigation strategies.
It builds upon an action selection model based on the basal ganglia of the vertebrate brain, using two interconnected cortico-basal ganglia-thalamo-cortical loops: a ventral one concerned with appetitive actions and a dorsal one dedicated to consummatory actions.
The performances of the resulting model are evaluated in simulation.
The experiments assess the prolonged survival permitted by the use of high level navigation strategies and the complementarity of navigation strategies in dynamic environments.
The correctness of the behavioral choices in situations of antagonistic or synergetic internal states is also tested.
Finally, the modelling choices are discussed with regard to their biomimetic plausibility, while the experimental results are estimated in terms of animat adaptivity.
Read – What the Dog Saw and Other Adventures
Today I finished reading “What the Dog Saw and Other Adventures” by Malcolm Gladwell
Paper – Face Detection Using Adaboosted SVM-Based Component Classifier
Today I read a paper titled “Face Detection Using Adaboosted SVM-Based Component Classifier”
The abstract is:
Recently, Adaboost has been widely used to improve the accuracy of any given learning algorithm.
In this paper we focus on designing an algorithm to employ combination of Adaboost with Support Vector Machine as weak component classifiers to be used in Face Detection Task.
To obtain a set of effective SVM weak-learner classifiers, this algorithm adaptively adjusts the kernel parameter in SVM instead of using a fixed one.
The proposed combination outperforms SVM in generalization on imbalanced classification problems.
The method proposed here is compared, in terms of classification accuracy, to other commonly used Adaboost methods, such as Decision Trees and Neural Networks, on the CMU+MIT face database.
Results indicate that the performance of the proposed method is overall superior to previous Adaboost approaches.
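The boosting loop itself is compact; here is a sketch of discrete AdaBoost on 1-D data, with simple threshold stumps standing in for the paper's SVM component classifiers (my own simplification, so the weak learner differs from the paper's):

```python
import math

def adaboost(X, y, rounds=10):
    """Discrete AdaBoost with threshold stumps as weak learners.
    X is a list of floats, y a list of +1/-1 labels. Each round picks
    the stump minimizing weighted error, then reweights the samples."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []  # (alpha, threshold, polarity)
    for _ in range(rounds):
        best = None
        for t in sorted(set(X)):
            for pol in (1, -1):
                preds = [pol if x > t else -pol for x in X]
                err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                if best is None or err < best[0]:
                    best = (err, t, pol, preds)
        err, t, pol, preds = best
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        ensemble.append((alpha, t, pol))
        # Upweight misclassified samples, downweight correct ones.
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (pol if x > t else -pol) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

X = [0.1, 0.4, 0.35, 0.8, 0.9, 0.7]
y = [-1, -1, -1, 1, 1, 1]
ens = adaboost(X, y)
print([predict(ens, x) for x in X])  # -> [-1, -1, -1, 1, 1, 1]
```

The paper's twist is that each weak learner is an SVM whose kernel parameter is adapted per round rather than a fixed stump.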
Read – The Mating Season
Today I finished reading “The Mating Season” by P.G. Wodehouse
Listening – The Eternal
This week I am listening to “The Eternal” by Sonic Youth
Paper – Teaching Physics Using Virtual Reality
Today I read a paper titled “Teaching Physics Using Virtual Reality”
The abstract is:
We present an investigation of game-like simulations for physics teaching.
We report on the effectiveness of the interactive simulation “Real Time Relativity” for learning special relativity.
We argue that the simulation not only enhances traditional learning, but also enables new types of learning that challenge the traditional curriculum.
The lessons drawn from this work are being applied to the development of a simulation for enhancing the learning of quantum mechanics.
Studying – Autodesk Maya
This month I am studying “Autodesk Maya”
Refresher course in the latest version of Maya
Listening – Forget And Not Slow Down
This week I am listening to “Forget And Not Slow Down” by Relient K
Paper – Robust Global Localization Using Clustered Particle Filtering
Today I read a paper titled “Robust Global Localization Using Clustered Particle Filtering”
The abstract is:
Global mobile robot localization is the problem of determining a robot’s pose in an environment, using sensor data, when the starting position is unknown.
A family of probabilistic algorithms known as Monte Carlo Localization (MCL) is currently among the most popular methods for solving this problem.
MCL algorithms represent a robot’s belief by a set of weighted samples, which approximate the posterior probability of where the robot is located by using a Bayesian formulation of the localization problem.
This article presents an extension to the MCL algorithm, which addresses its problems when localizing in highly symmetrical environments; a situation where MCL is often unable to correctly track equally probable poses for the robot.
The problem arises from the fact that sample sets in MCL often become impoverished, when samples are generated according to their posterior likelihood.
Our approach incorporates the idea of clusters of samples and modifies the proposal distribution considering the probability mass of those clusters.
Experimental results are presented that show that this new extension to the MCL algorithm successfully localizes in symmetric environments where ordinary MCL often fails.
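The cluster bookkeeping the extension adds is easy to illustrate in one dimension (my own sketch, not the authors' implementation): group weighted particles into clusters of nearby poses and track each cluster's probability mass, so symmetric hypotheses survive resampling:

```python
def cluster_masses(particles, gap=1.0):
    """Group weighted (pose, weight) particles into clusters of nearby
    poses and return (lo, hi, normalized mass) per cluster. A 1-D
    sketch of the cluster bookkeeping added to MCL."""
    particles = sorted(particles)
    clusters, current, mass = [], [particles[0][0]], particles[0][1]
    for pose, w in particles[1:]:
        if pose - current[-1] <= gap:   # close enough: same cluster
            current.append(pose)
            mass += w
        else:                           # far away: start a new cluster
            clusters.append((current, mass))
            current, mass = [pose], w
    clusters.append((current, mass))
    total = sum(m for _, m in clusters)
    return [(c[0], c[-1], m / total) for c, m in clusters]

# A symmetric corridor: two equally likely pose hypotheses far apart.
parts = [(0.0, 0.2), (0.3, 0.3), (10.0, 0.3), (10.4, 0.2)]
for lo, hi, mass in cluster_masses(parts):
    print(f"cluster [{lo}, {hi}] mass {mass:.2f}")
```

Plain MCL resamples from the global posterior and can starve one of the two clusters; weighting the proposal by per-cluster mass keeps both hypotheses alive until the symmetry breaks.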