This week I am listening to “Return Of Saturn” by No Doubt
Paper – Syntactic-Head-Driven Generation
Today I read a paper titled “Syntactic-Head-Driven Generation”
The abstract is:
The previously proposed semantic-head-driven generation methods run into problems if none of the daughter constituents in the syntacto-semantic rule schemata of a grammar fits the definition of a semantic head given in Shieber et al. 1990.
This is the case for the semantic analysis rules of certain constraint-based semantic representations, e.g. Underspecified Discourse Representation Structures (UDRSs) (Frank/Reyle 1992).
Since head-driven generation in general has its merits, we simply return to a syntactic definition of `head’ and demonstrate the feasibility of syntactic-head-driven generation.
In addition to its generality, a syntactic-head-driven algorithm provides a basis for a logically well-defined treatment of the movement of (syntactic) heads, for which only ad-hoc solutions have existed so far.
Paper – Complexity limitations on quantum computation
Today I read a paper titled “Complexity limitations on quantum computation”
The abstract is:
We use the powerful tools of counting complexity and generic oracles to help understand the limitations of the complexity of quantum computation.
We show several results for the probabilistic quantum class BQP.
1. BQP is low for PP, i.e., PP^BQP = PP.
2. There exists a relativized world where P = BQP and the polynomial-time hierarchy is infinite.
3. There exists a relativized world where BQP does not have complete sets.
4. There exists a relativized world where P = BQP but P is not equal to UP intersect coUP and one-way functions exist. This gives a relativized answer to an open question of Simon.
Read – Travels with Charley: In Search of America
Today I finished reading “Travels with Charley: In Search of America” by John Steinbeck
Read – Winter in Eden
Today I finished reading “Winter in Eden” by Harry Harrison
Read – Antipatterns
Today I finished reading “Antipatterns: Refactoring Software, Architectures, and Projects in Crisis” by William J. Brown
Read – Extreme Programming Installed
Today I finished reading “Extreme Programming Installed” by Ron Jeffries
Listening – V: The New Mythology Suite
This week I am listening to “V: The New Mythology Suite” by Symphony X
Read – The Celebrated Jumping Frog of Calaveras County
Today I finished reading “The Celebrated Jumping Frog of Calaveras County” by Mark Twain
Read – Dungeons & Dragons: Player’s Handbook
Today I finished reading “Dungeons & Dragons: Player’s Handbook” by Monte Cook
Listening – Building Nothing Out Of Something
This week I am listening to “Building Nothing Out Of Something” by Modest Mouse
Read – The Fast Forward MBA In Business Planning For Growth
Today I finished reading “The Fast Forward MBA In Business Planning For Growth” by Philip Walcoff
Studying – Foundations of photography
This month I am studying “Foundations of photography”
Three month refresher course in photography because I haven’t done any serious photography since 1990. Just bought a new digital camera since they have started to rival film in some aspects.
I’ve got a Konica point-and-shoot with only a few controls on it. It doesn’t even do a half-decent job for snapshots.
I’ve also picked up a traditional Canon EOS 30 because I am just more comfortable with film. I suspect Canon will eventually release a digital version once the sensors get good enough.
Update: After the first class I just hired the teacher privately to take me through the full three months of study one-on-one. Much more fun.
Update #2: Lots of fieldwork. I am liking this.
Paper – Character design for soccer commentary
Today I read a paper titled “Character design for soccer commentary”
The abstract is:
In this paper we present early work on an animated talking head commentary system called Byrne (named after David Byrne, lead singer of the Talking Heads).
The goal of this project is to develop a system which can take the output from the RoboCup soccer simulator, and generate appropriate affective speech and facial expressions, based on the character’s personality, emotional state, and the state of play.
Here we describe a system which takes pre-analysed simulator output as input, and which generates text marked-up for use by a speech generator and a face animation system.
We make heavy use of inter-system standards, so that future versions of Byrne will be able to take advantage of advances in the technologies that it incorporates.
Listening – Renegades
This week I am listening to “Renegades” by Rage Against The Machine
Paper – Simple and Superlattice Turing Patterns in Reaction-Diffusion Systems: Bifurcation, Bistability, and Parameter Collapse
Today I read a paper titled “Simple and Superlattice Turing Patterns in Reaction-Diffusion Systems: Bifurcation, Bistability, and Parameter Collapse”
The abstract is:
This paper investigates the competition between both simple (e.g. stripes, hexagons) and “superlattice” (super squares, super hexagons) Turing patterns in two-component reaction-diffusion systems.
“Superlattice” patterns are formed from eight or twelve Fourier modes, and feature structure at two different length scales.
Using perturbation theory, we derive simple analytical expressions for the bifurcation equation coefficients on both rhombic and hexagonal lattices.
These expressions show that, no matter how complicated the reaction kinetics, the nonlinear reaction terms reduce to just four effective terms within the bifurcation equation coefficients.
Moreover, at the hexagonal degeneracy — when the quadratic term in the hexagonal bifurcation equation disappears — the number of effective system parameters drops to two, allowing a complete characterization of the possible bifurcation results at this degeneracy.
The general results are then applied to specific model equations, to investigate the stability of different patterns within those models.
Listening – Relationship Of Command
This week I am listening to “Relationship Of Command” by At The Drive-In
Read – Magic
Today I finished reading “Magic” by Isaac Asimov
Read – Every Living Thing
Today I finished reading “Every Living Thing” by James Herriot
Listening – Music
This week I am listening to “Music” by Madonna
Read – The October Country
Today I finished reading “The October Country” by Ray Bradbury
Listening – The Hour Of Bewilderbeast
This week I am listening to “The Hour Of Bewilderbeast” by Badly Drawn Boy
Paper – Cached Sufficient Statistics for Efficient Machine Learning with Large Datasets
Today I read a paper titled “Cached Sufficient Statistics for Efficient Machine Learning with Large Datasets”
The abstract is:
This paper introduces new algorithms and data structures for quick counting for machine learning datasets.
We focus on the counting task of constructing contingency tables, but our approach is also applicable to counting the number of records in a dataset that match conjunctive queries.
Subject to certain assumptions, the costs of these operations can be shown to be independent of the number of records in the dataset and loglinear in the number of non-zero entries in the contingency table.
We provide a very sparse data structure, the ADtree, to minimize memory use.
We provide analytical worst-case bounds for this structure for several models of data distribution.
We empirically demonstrate that tractably-sized data structures can be produced for large real-world datasets by (a) using a sparse tree structure that never allocates memory for counts of zero, (b) never allocating memory for counts that can be deduced from other counts, and (c) not bothering to expand the tree fully near its leaves.
We show how the ADtree can be used to accelerate Bayes net structure finding algorithms, rule learning algorithms, and feature selection algorithms, and we provide a number of empirical results comparing ADtree methods against traditional direct counting approaches.
We also discuss the possible uses of ADtrees in other machine learning methods, and discuss the merits of ADtrees in comparison with alternative representations such as kd-trees, R-trees and Frequent Sets.
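The abstract's core trick — never allocating memory for zero counts when caching conjunctive-query counts — can be sketched in a few lines. This is a flat dictionary stand-in for the paper's actual ADtree structure, with an illustrative record format of my own choosing:

```python
from itertools import combinations

def build_counts(records, attrs):
    """Precompute counts for every conjunction of (attribute, value)
    pairs that actually occurs in the data.  Zero-count conjunctions
    are simply never stored -- the sparseness idea behind the ADtree
    (this flat dictionary is a stand-in for the real tree structure)."""
    counts = {}
    for rec in records:
        items = tuple(sorted((a, rec[a]) for a in attrs))
        for r in range(1, len(items) + 1):
            for combo in combinations(items, r):
                counts[combo] = counts.get(combo, 0) + 1
    return counts

def query(counts, conjunction):
    """Count records matching a conjunctive query; a missing key
    means the count is zero, so zeros cost no memory at all."""
    return counts.get(tuple(sorted(conjunction)), 0)
```

A contingency table over any attribute subset is then just a set of these lookups, which is the counting primitive the Bayes-net and rule-learning experiments in the paper lean on.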
Read – Tarzan of the Apes
Today I finished reading “Tarzan of the Apes” by Edgar Rice Burroughs
Read – Little Lord Fauntleroy
Today I finished reading “Little Lord Fauntleroy” by Frances Hodgson Burnett
Studying – Developing websites with PHP
This month I am studying “Developing websites with PHP”
It’s an online class. Coursework looks stupid easy. I cannot imagine it will take me three months to get through this.
Update: Yep, done inside of two weeks. Handed in my final coursework at the end of week three. Teacher seemed surprised.
This is why I love self-paced, on-line classes where you just have to interact with the teacher if you get stuck, or need a clarification.
Listening – Tourist
This week I am listening to “Tourist” by St. Germain
Roll up your sleeves
E-ink tattoos will eventually happen.
That’s a given.
With e-ink you will be able to have a full-sleeve tattoo, or a facial tattoo, change the design whenever you feel like, and switch it off when you have to go to work at an office that has a dress-code of “no visible tattoos.”
Read – Small Memory Software
Today I finished reading “Small Memory Software: Patterns for Systems with Limited Memory” by James Noble
eInk Tattoos
Why aren’t tattoos made out of eInk?
Then you wouldn’t have to pick any one design.
You could just download any design you wanted.
So tattoo parlours would then become just a place where you go to get your skin impregnated with eInk and perhaps download new designs.
Paper – Finite-resolution hidden surface removal
Today I read a paper titled “Finite-resolution hidden surface removal”
The abstract is:
We propose a hybrid image-space/object-space solution to the classical hidden surface removal problem: Given n disjoint triangles in R^3 and p sample points (“pixels”) in the xy-plane, determine the first triangle directly behind each pixel.
Our algorithm constructs the sampled visibility map of the triangles with respect to the pixels, which is the subset of the trapezoids in a trapezoidal decomposition of the analytic visibility map that contain at least one pixel.
The sampled visibility map adapts to local changes in image complexity, and its complexity is bounded both by the number of pixels and by the complexity of the analytic visibility map.
Our algorithm runs in time O(n^{1+e} + n^{2/3+e}t^{2/3} + p), where t is the output size and e is any positive constant.
This is nearly optimal in the worst case and compares favorably with the best output-sensitive algorithms for both ray casting and analytic hidden surface removal.
In the special case where the pixels form a regular grid, a sweepline variant of our algorithm runs in time O(n^{1+e} + n^{2/3+e}t^{2/3} + t log p), which is usually sublinear in the number of pixels.
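To appreciate the output-sensitive bound, it helps to see the naive O(n·p) baseline the paper is improving on: test every pixel against every triangle's xy-projection and keep the nearest hit. This sketch compares depths by each triangle's mean z, a simplification that is only safe when triangles don't interleave in depth:

```python
def _sign(p, a, b):
    return (p[0] - b[0]) * (a[1] - b[1]) - (a[0] - b[0]) * (p[1] - b[1])

def _contains_xy(tri, p):
    # Point-in-triangle test on the xy-projection via edge signs.
    a, b, c = [(v[0], v[1]) for v in tri]
    d1, d2, d3 = _sign(p, a, b), _sign(p, b, c), _sign(p, c, a)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

def first_triangle_per_pixel(pixels, triangles):
    """Naive O(n * p) ray-casting baseline: for each sample point,
    return the index of the nearest triangle whose projection
    contains it, or None if nothing covers it.  Depth comparison
    uses each triangle's mean z -- a simplification, not the
    paper's trapezoidal sampled visibility map."""
    out = []
    for p in pixels:
        best, best_z = None, float("inf")
        for i, tri in enumerate(triangles):
            if _contains_xy(tri, p):
                z = sum(v[2] for v in tri) / 3.0
                if z < best_z:
                    best, best_z = i, z
        out.append(best)
    return out
```

The paper's algorithm gets the same answers while paying only for the trapezoids of the visibility map that actually contain a pixel.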
Paper – Practical algorithms for on-line sampling
Today I read a paper titled “Practical algorithms for on-line sampling”
The abstract is:
One of the core applications of machine learning to knowledge discovery consists of building a function (a hypothesis) from a given amount of data (for instance a decision tree or a neural network) such that we can use it afterwards to predict new instances of the data.
In this paper, we focus on a particular situation where we assume that the hypothesis we want to use for prediction is very simple, and thus, the hypotheses class is of feasible size.
We study the problem of how to determine which of the hypotheses in the class is almost the best one.
We present two on-line sampling algorithms for selecting hypotheses, give theoretical bounds on the number of necessary examples, and analyze them experimentally.
We compare them with the simple batch sampling approach commonly used and show that in most situations our algorithms use far fewer examples.
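The batch-sampling baseline they compare against is easy to sketch for a feasible-size hypothesis class: score every hypothesis on one fixed sample and bound the estimation error with Hoeffding's inequality plus a union bound. This is the generic baseline, not the paper's two on-line algorithms, and the callable-hypothesis interface is my own:

```python
import math

def select_hypothesis(hypotheses, sample, delta=0.05):
    """Batch sampling: pick the empirically best hypothesis from a
    small class and return a Hoeffding confidence radius on its
    error estimate.  `hypotheses` is a list of callables and
    `sample` a list of (x, label) pairs."""
    n = len(sample)
    errors = [sum(1 for x, y in sample if h(x) != y) / n for h in hypotheses]
    # With probability >= 1 - delta, every empirical error is within
    # eps of its true error (union bound over the whole class).
    eps = math.sqrt(math.log(2 * len(hypotheses) / delta) / (2 * n))
    best = min(range(len(hypotheses)), key=lambda i: errors[i])
    return best, errors[best], eps
```

The on-line algorithms win by stopping early: they watch the error estimates as examples stream in and discard hypotheses as soon as the bounds separate, instead of committing to a worst-case sample size up front.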
Read – Work is a Contact Sport
Today I finished reading “Work is a Contact Sport” by Scott Adams
Listening – Hours…
This week I am listening to “Hours…” by David Bowie
Read – Giant Steps
Today I finished reading “Giant Steps: Small Changes to Make a Big Difference” by Anthony Robbins
Read – Fugitive from the Cubicle Police
Today I finished reading “Fugitive from the Cubicle Police” by Scott Adams
Read – Dig Your Well before You’re Thirsty
Today I finished reading “Dig Your Well before You’re Thirsty: The only networking book you’ll ever need” by Harvey MacKay
Listening – Mule Variations
This week I am listening to “Mule Variations” by Tom Waits
Read – MUTTS
Today I finished reading “MUTTS” by Patrick McDonnell
Read – Variable Star
Today I finished reading “Variable Star” by Robert A. Heinlein
Read – Software Project Management
Today I finished reading “Software Project Management” by Bob Hughes
Read – The Anti-Christ
Today I finished reading “The Anti-Christ” by Friedrich Nietzsche
Paper – KnightCap: A chess program that learns by combining TD(lambda) with game-tree search
Today I read a paper titled “KnightCap: A chess program that learns by combining TD(lambda) with game-tree search”
The abstract is:
In this paper we present TDLeaf(lambda), a variation on the TD(lambda) algorithm that enables it to be used in conjunction with game-tree search.
We present some experiments in which our chess program “KnightCap” used TDLeaf(lambda) to learn its evaluation function while playing on the Free Internet Chess Server (FICS, fics.onenet.net).
The main success we report is that KnightCap improved from a 1650 rating to a 2150 rating in just 308 games and 3 days of play.
As a reference, a rating of 1650 corresponds to about level B human play (on a scale from E (1000) to A (1800)), while 2150 is human master level.
We discuss some of the reasons for this success, principal among them being the use of on-line play rather than self-play.
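The TDLeaf(lambda) idea — apply temporal-difference updates to the leaf of each search's principal variation rather than to the root position — can be sketched for a linear evaluation function. The feature-vector interface here is illustrative; KnightCap's real evaluation is far richer than a dot product:

```python
def tdleaf_update(weights, leaves, alpha=0.01, lam=0.7):
    """One TDLeaf(lambda)-style weight update for a linear evaluation
    V(s) = w . features(s), applied to the sequence of principal-
    variation leaf feature vectors observed over a game."""
    values = [sum(w * f for w, f in zip(weights, feats)) for feats in leaves]
    # Temporal differences between successive principal-leaf values.
    diffs = [values[t + 1] - values[t] for t in range(len(values) - 1)]
    new_w = list(weights)
    for t in range(len(diffs)):
        # lambda-discounted sum of future temporal differences.
        td = sum((lam ** (j - t)) * diffs[j] for j in range(t, len(diffs)))
        # For a linear V, grad_w V(s_t) is just the feature vector.
        for k in range(len(new_w)):
            new_w[k] += alpha * leaves[t][k] * td
    return new_w
```

Using the search's own principal-variation leaf as the training target is what lets the deep tactical lookahead flow back into the static evaluation.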
Listening – Supergrass
This week I am listening to “Supergrass” by Supergrass
Read – 3001: The Final Odyssey
Today I finished reading “3001: The Final Odyssey” by Arthur C. Clarke
Studying – Capturing the essence of caricatures
This month I am studying “Capturing the essence of caricatures”
Listening – Judgement
This week I am listening to “Judgement” by Anathema
Paper – The importance of quantum decoherence in brain processes
Today I read a paper titled “The importance of quantum decoherence in brain processes”
The abstract is:
Based on a calculation of neural decoherence rates, we argue that the degrees of freedom of the human brain that relate to cognitive processes should be thought of as a classical rather than quantum system, i.e., that there is nothing fundamentally wrong with the current classical approach to neural network simulations.
We find that the decoherence timescales ~10^{-13}-10^{-20} seconds are typically much shorter than the relevant dynamical timescales (~0.001-0.1 seconds), both for regular neuron firing and for kink-like polarization excitations in microtubules.
This conclusion disagrees with suggestions by Penrose and others that the brain acts as a quantum computer, and that quantum coherence is related to consciousness in a fundamental way.
Paper – Relating Complexity to Practical Performance in Parsing with Wide-Coverage Unification Grammars
Today I read a paper titled “Relating Complexity to Practical Performance in Parsing with Wide-Coverage Unification Grammars”
The abstract is:
The paper demonstrates that exponential complexities with respect to grammar size and input length have little impact on the performance of three unification-based parsing algorithms, using a wide-coverage grammar.
The results imply that the study and optimisation of unification-based parsing must rely on empirical data until complexity theory can more accurately predict the practical behaviour of such parsers.
Paper – Image Compression with Iterated Function Systems, Finite Automata and Zerotrees: Grand Unification
Today I read a paper titled “Image Compression with Iterated Function Systems, Finite Automata and Zerotrees: Grand Unification”
The abstract is:
Fractal image compression, Culik’s image compression and zerotree prediction coding of wavelet image decomposition coefficients succeed only because typical images being compressed possess a significant degree of self-similarity.
Besides the common concept, these methods turn out to be even more tightly related, to the point of algorithmic reducibility of one technique to another.
The goal of the present paper is to demonstrate these relations.
The paper offers a plain-term interpretation of Culik’s image compression, in regular image processing terms, without resorting to finite state machines and similar lofty language.
The interpretation is shown to be algorithmically related to an IFS fractal image compression method: an IFS can be exactly transformed into Culik’s image code.
Using this transformation, we will prove that in a self-similar (part of an) image any zero wavelet coefficient is the root of a zerotree, or its branch.
The paper discusses the zerotree coding of (wavelet/projection) coefficients as a common predictor/corrector, applied vertically through different layers of a multiresolutional decomposition, rather than within the same view.
This interpretation leads to an insight into the evolution of image compression techniques: from a causal single-layer prediction, to non-causal same-view predictions (wavelet decomposition among others) and to a causal cross-layer prediction (zero-trees, Culik’s method).