This month I am studying “Logo design techniques”
Somebody needs to think about this stuff...
by justin
Ah, clients: the bane of every contractor’s life.
I sat in a meeting earlier this week where the client was trying to browbeat me into lowering my prices, and he said to me, “Well, if I had your experience and your computer I could just do it myself for free, so I don’t see why you need to charge so much.”
by justin
This week I am listening to “Velocity : Design : Comfort” by Sweet Trip
by justin
Today I finished reading “Ready for Anything: 52 Productivity Principles for Getting Things Done” by David Allen
by justin
This week I am listening to “The Decline Of British Sea Power” by British Sea Power
by justin
Today I read a paper titled “Algorithmic Clustering of Music”
The abstract is:
We present a fully automatic method for music classification, based only on compression of strings that represent the music pieces.
The method uses no background knowledge about music whatsoever: it is completely general and can, without change, be used in different areas like linguistic classification and genomics.
It is based on an ideal theory of the information content in individual objects (Kolmogorov complexity), information distance, and a universal similarity metric.
Experiments show that the method distinguishes reasonably well between various musical genres and can even cluster pieces by composer.
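The universal similarity metric the abstract mentions is approximated in practice with a real compressor. Here is a minimal sketch of that idea, assuming the normalized compression distance, with zlib standing in for the ideal (uncomputable) compressor; the names are mine:

    import zlib

    def compressed_size(data: bytes) -> int:
        # The compressed length approximates the string's Kolmogorov complexity.
        return len(zlib.compress(data, 9))

    def ncd(x: bytes, y: bytes) -> float:
        # Normalized compression distance: near 0 for similar strings,
        # near 1 for unrelated ones.
        cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Strings representing pieces by the same composer should compress well
    # together, giving a smaller pairwise distance for clustering.
    print(ncd(b"do re mi do re mi do re mi", b"do re mi fa sol do re mi"))
    print(ncd(b"do re mi do re mi do re mi", b"qzx wvk pjh bnm tyu"))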
by justin
This week I am listening to “Dear Catastrophe Waitress” by Belle & Sebastian
by justin
Today I read a paper titled “The one-round Voronoi game replayed”
The abstract is:
We consider the one-round Voronoi game, where player one (“White”, called “Wilma”) places a set of n points in a rectangular area of aspect ratio r <= 1, followed by the second player (“Black”, called “Barney”), who places the same number of points.
Each player wins the fraction of the board closest to one of his points, and the goal is to win more than half of the total area.
This problem has been studied by Cheong et al., who showed that for large enough n and r = 1, Barney has a strategy that guarantees a fraction of 1/2 + a, for some small fixed a.
We resolve a number of open problems raised by that paper.
In particular, we give a precise characterization of the outcome of the game for optimal play: we show that Barney has a winning strategy for n >= 3 and r > sqrt(2)/n, and for n = 2 and r > sqrt(3)/2.
Wilma wins in all remaining cases, i.e., for n >= 3 and r <= sqrt(2)/n, for n = 2 and r <= sqrt(3)/2, and for n = 1.
We also discuss complexity aspects of the game on more general boards, by proving that for a polygon with holes, it is NP-hard to maximize the area Barney can win against a given set of points by Wilma.
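The characterization is concrete enough to state as a decision procedure; a small sketch restating the thresholds above (the naming is mine):

    import math

    def voronoi_game_winner(n: int, r: float) -> str:
        # Outcome of the one-round Voronoi game under optimal play,
        # on a board of aspect ratio 0 < r <= 1 with n points per player.
        if n >= 3:
            return "Barney" if r > math.sqrt(2) / n else "Wilma"
        if n == 2:
            return "Barney" if r > math.sqrt(3) / 2 else "Wilma"
        return "Wilma"  # n == 1: Wilma always wins

    print(voronoi_game_winner(4, 1.0))  # Barney: 1 > sqrt(2)/4
    print(voronoi_game_winner(2, 0.5))  # Wilma: 0.5 <= sqrt(3)/2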
by justin
Today I finished reading “Chips Challenging Champions: Games, Computers and Artificial Intelligence” by J. Schaeffer
by justin
Today I finished reading “J2ME Game Programming” by Martin Wells
by justin
This week I am listening to “On And On” by Jack Johnson
by justin
Today I read a paper titled “Cyborg Systems as Platforms for Computer-Vision Algorithm-Development for Astrobiology”
The abstract is:
Employing the allegorical imagery from the film “The Matrix”, we motivate and discuss our “Cyborg Astrobiologist” research program.
In this research program, we are using a wearable computer and video camcorder in order to test and train a computer-vision system to be a field-geologist and field-astrobiologist.
by justin
Today I finished reading “AI Game Programming Wisdom 2” by Steve Rabin
by justin
Today I finished reading “The Theory of Everything: The Origin and Fate of the Universe” by Stephen Hawking
by justin
Today I finished reading “Differential Equations Demystified” by Steven Krantz
by justin
This month I am studying “Landscape painting”
by justin
This week I am listening to “Portrait Of A Legend 1951-1964” by Sam Cooke
by justin
This week I am listening to “Sleeping With Ghosts” by Placebo
by justin
Today I finished reading “The Cricket on the Hearth” by Charles Dickens
by justin
Today I read a paper titled “The Graphics Card as a Streaming Computer”
The abstract is:
Massive data sets have radically changed our understanding of how to design efficient algorithms; the streaming paradigm, whether in terms of the number of passes of an external memory algorithm or the single pass and limited memory of a stream algorithm, appears to be the dominant method for coping with large data.
A very different kind of massive computation has had the same effect at the level of the CPU.
The most prominent example is that of the computations performed by a graphics card.
The operations themselves are very simple, and require very little memory, but require the ability to perform many computations extremely fast and in parallel to whatever degree possible.
What has resulted is a stream processor that is highly optimized for stream computations.
An intriguing side effect of this is the growing use of a graphics card as a general purpose stream processing engine.
In an ever-increasing array of applications, researchers are discovering that performing a computation on a graphics card is far faster than performing it on a CPU, and so are using the GPU as a stream co-processor.
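For a flavor of what a stream computation looks like (an illustration of the programming model, not GPU code; a NumPy sketch with names of my choosing): a kernel is a simple, memory-light operation applied independently to every element, which is exactly what makes it trivially parallel.

    import numpy as np

    def saxpy(a: float, x: np.ndarray, y: np.ndarray) -> np.ndarray:
        # The classic stream kernel: one multiply-add per element,
        # with no dependencies between elements.
        return a * x + y

    stream_x = np.random.rand(1_000_000).astype(np.float32)
    stream_y = np.random.rand(1_000_000).astype(np.float32)
    result = saxpy(2.0, stream_x, stream_y)  # a GPU evaluates this per element, in parallel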
by justin
Today I read a paper titled “Local Community Identification through User Access Patterns”
The abstract is:
Community identification algorithms have been used to enhance the quality of the services as perceived by their users.
Although community identification algorithms are in widespread use on the Web, their application to portals or specific subsets of the Web has not been much studied.
In this paper, we propose a technique for local community identification that takes into account user access behavior derived from the access logs of Web servers.
The technique departs from existing community algorithms in that it changes the focus of interest, moving from authors to users.
Our approach does not use relations imposed by authors (e.g. hyperlinks in the case of Web pages); it uses information derived from user accesses to a service in order to infer relationships.
The communities identified are of great interest to content providers, since they can be used to improve the quality of their services.
We also propose an evaluation methodology for analyzing the results obtained by the algorithm.
We present two case studies based on actual data from two services: an online bookstore and an online radio.
The case of the online radio is particularly relevant because it emphasizes the contribution of the proposed algorithm to finding communities in an environment (i.e., a streaming media service) without links representing the relations imposed by authors (e.g. hyperlinks in the case of Web pages).
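The abstract does not spell out the algorithm, but the shift from author-imposed links to user-derived relations can be illustrated like this (entirely my sketch, with a hypothetical log format): items accessed by the same users become related, no hyperlinks required.

    from collections import defaultdict
    from itertools import combinations

    # Hypothetical (user, item) pairs parsed from a server access log.
    accesses = [("u1", "bookA"), ("u1", "bookB"),
                ("u2", "bookA"), ("u2", "bookB"),
                ("u3", "bookC")]

    users_of = defaultdict(set)
    for user, item in accesses:
        users_of[item].add(user)

    def jaccard(a: set, b: set) -> float:
        # Similarity of two items = overlap of the user sets that access them.
        return len(a & b) / len(a | b)

    # Item pairs above a similarity threshold form the community graph.
    edges = [(i, j) for i, j in combinations(sorted(users_of), 2)
             if jaccard(users_of[i], users_of[j]) > 0.5]
    print(edges)  # [('bookA', 'bookB')]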
by justin
Today I read a paper titled “On multiple connectedness of regions visible due to multiple diffuse reflections”
The abstract is:
It is known that the region V(s) of a simple polygon P, directly visible (illuminable) from an internal point s, is simply connected.
Aronov et al. established that the region V1(s) of a simple polygon visible from an internal point s due to at most one diffuse reflection on the boundary of the polygon P is also simply connected.
In this paper we establish that the region V2(s), visible from s due to at most two diffuse reflections, may be multiply connected; we demonstrate the construction of an n-sided simple polygon with a point s inside it such that the region of P visible from s after at most two diffuse reflections is multiply connected.
by justin
This week I am listening to “100th Window” by Massive Attack
by justin
Today I finished reading “The Secret: What Great Leaders Know – And Do” by Kenneth Blanchard
by justin
Today I finished reading “From the Dust Returned” by Ray Bradbury
by justin
Today I finished reading “Advanced Java Game Programming” by David Wallace Croft
by justin
Today I finished reading “Groo: Mightier Than the Sword” by Sergio Aragones
by justin
This week I am listening to “Life For Rent” by Dido
by justin
We don’t need no indirection,
We don’t need no flow control,
No strong typing or declarations,
Hey! You! Leave those ducks alone.
— Title is an historical reference
by justin
This week I am listening to “Thirteenth Step” by A Perfect Circle
by justin
Today I read a paper titled “Towards a Model-Based Framework for Integrating Usability and Software Engineering Life Cycles”
The abstract is:
In this position paper we propose a process model that provides a development infrastructure in which the usability engineering and software engineering life cycles co-exist in complementary roles.
We describe the motivation, hurdles, rationale, arguments, and implementation plan for the need, specification, and usefulness of such a model.
by justin
This month I am studying “Motion blur, depth of field and atmospheric perspective”
by justin
This week I am listening to “Lesser Matters” by The Radio Dept.
by justin
Today I read a paper titled “Embedded Reflection Mapping”
The abstract is:
Environment maps are used to simulate reflections off curved objects.
We present a technique to reflect a user, or a group of users, in a real environment, onto a virtual object, in a virtual reality application, using the live video feeds from a set of cameras, in real-time.
Our setup can be used in a variety of environments, ranging from outdoor to indoor scenes.
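For context, the heart of any environment-mapping technique is the reflection-vector lookup; a minimal sketch of that step (my own illustration, not the paper’s pipeline; with live video, the map texture would be rebuilt from the camera feeds each frame):

    import numpy as np

    def reflect(incident: np.ndarray, normal: np.ndarray) -> np.ndarray:
        # Mirror the view direction about the surface normal: R = I - 2(N.I)N
        n = normal / np.linalg.norm(normal)
        return incident - 2.0 * np.dot(n, incident) * n

    def latlong_uv(r: np.ndarray) -> tuple:
        # Map a reflection direction to (u, v) texture coordinates in a
        # latitude-longitude environment map.
        r = r / np.linalg.norm(r)
        u = 0.5 + np.arctan2(r[0], -r[2]) / (2.0 * np.pi)
        v = 0.5 - np.arcsin(r[1]) / np.pi
        return float(u), float(v)

    view = np.array([0.0, -1.0, -1.0])       # looking forward and down
    up_facing = np.array([0.0, 1.0, 0.0])    # upward-facing surface
    print(latlong_uv(reflect(view, up_facing)))  # samples above the horizon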
by justin
Today I read a paper titled “Communities of Practice: Going Virtual”
The abstract is:
With the current trends towards downsizing, outsourcing and globalisation, modern organisations are reducing the numbers of people they employ.
In addition, organisations now have to cope with the increasing internationalisation of business forcing collaboration and knowledge sharing across time and distance simultaneously.
There is a need for new ways of thinking about how knowledge is shared in distributed groups.
In this paper we explore a relatively new approach to knowledge sharing using Lave and Wenger’s (1991) theory of Communities of Practice (CoPs).
We investigate whether CoPs might translate to a geographically distributed international environment through a case study that explores the functioning of a CoP across national boundaries.
by justin
Today I finished reading “Beginning OpenGL Game Programming” by Dave Astle
by justin
Today I finished reading “The Bootstrapper’s Bible: How to Start and Build a Business with a Great Idea and (Almost) No Money” by Seth Godin
by justin
Today I finished reading “The Case of the Toxic Spell Dump” by Harry Turtledove
by justin
Today I finished reading “Rammer” by Larry Niven
by justin
Today I finished reading “Redgauntlet” by Walter Scott
by justin
This week I am listening to “War All The Time” by Thursday
by justin
Today I read a paper titled “Practical and Robust Stenciled Shadow Volumes for Hardware-Accelerated Rendering”
The abstract is:
Twenty-five years ago, Crow published the shadow volume approach for determining shadowed regions in a scene.
A decade ago, Heidmann described a hardware-accelerated stencil buffer-based shadow volume algorithm.
Unfortunately hardware-accelerated stenciled shadow volume techniques have not been widely adopted by 3D games and applications due in large part to the lack of robustness of described techniques.
This situation persists despite widely available hardware support.
Specifically what has been lacking is a technique that robustly handles various “hard” situations created by near or far plane clipping of shadow volumes.
We describe a robust, artifact-free technique for hardware-accelerated rendering of stenciled shadow volumes.
Assuming existing hardware, we resolve the issues otherwise caused by shadow volume near and far plane clipping through a combination of (1) placing the conventional far clip plane “at infinity”, (2) rasterization with infinite shadow volume polygons via homogeneous coordinates, and (3) adopting a zfail stencil-testing scheme.
Depth clamping, a new rasterization feature provided by NVIDIA’s GeForce3, preserves existing depth precision by not requiring the far plane to be placed at infinity.
We also propose two-sided stencil testing to improve the efficiency of rendering stenciled shadow volumes.
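The zfail scheme the abstract names can be sketched in a few stencil calls (PyOpenGL syntax; the draw functions are placeholders, and the infinite-far-plane projection setup and two-sided stencil extension are omitted):

    from OpenGL.GL import *

    def stencil_shadow_pass(draw_shadow_volume, draw_lit_scene):
        # Assumes the depth buffer was already filled by an ambient pass.
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE)
        glDepthMask(GL_FALSE)                   # no color or depth writes
        glEnable(GL_STENCIL_TEST)
        glStencilFunc(GL_ALWAYS, 0, ~0)
        glEnable(GL_CULL_FACE)

        glCullFace(GL_FRONT)                    # back faces of the volume:
        glStencilOp(GL_KEEP, GL_INCR, GL_KEEP)  # increment where the depth test fails
        draw_shadow_volume()

        glCullFace(GL_BACK)                     # front faces of the volume:
        glStencilOp(GL_KEEP, GL_DECR, GL_KEEP)  # decrement where the depth test fails
        draw_shadow_volume()

        # Lit pass: fragments with stencil == 0 are outside the shadow.
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE)
        glStencilFunc(GL_EQUAL, 0, ~0)
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP)
        draw_lit_scene()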
by justin
Today I read a paper titled “An evaluation of Naive Bayesian anti-spam filtering”
The abstract is:
It has recently been argued that a Naive Bayesian classifier can be used to filter unsolicited bulk e-mail (“spam”).
We conduct a thorough evaluation of this proposal on a corpus that we make publicly available, contributing towards standard benchmarks.
At the same time we investigate the effect of attribute-set size, training-corpus size, lemmatization, and stop-lists on the filter’s performance, issues that had not been previously explored.
After introducing appropriate cost-sensitive evaluation measures, we reach the conclusion that additional safety nets are needed for the Naive Bayesian anti-spam filter to be viable in practice.
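As a rough sketch of the kind of filter being evaluated (scikit-learn stands in for their implementation, and the four-message corpus is a toy of my own, not theirs):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    mails = ["cheap pills buy now", "meeting agenda attached",
             "win money now", "lunch tomorrow?"]
    labels = [1, 0, 1, 0]  # 1 = spam, 0 = legitimate

    # max_features plays the role of the paper's attribute-set size;
    # stop-lists and lemmatization would also plug in at this stage.
    spam_filter = make_pipeline(
        CountVectorizer(max_features=1000, stop_words="english"),
        MultinomialNB())
    spam_filter.fit(mails, labels)
    print(spam_filter.predict(["buy cheap pills", "agenda for lunch"]))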
by justin
Today I finished reading “Buck Godot” by Phil Foglio
by justin
This week I am listening to “Ocean Avenue” by Yellowcard
by justin
Today I read a paper titled “Enhancing a Search Algorithm to Perform Intelligent Backtracking”
The abstract is:
This paper illustrates how a Prolog program, using chronological backtracking to find a solution in some search space, can be enhanced to perform intelligent backtracking.
The enhancement crucially relies on the impurity of Prolog that allows a program to store information when a dead end is reached.
To illustrate the technique, a simple search program is enhanced.
To appear in Theory and Practice of Logic Programming.
Keywords: intelligent backtracking, dependency-directed backtracking, backjumping, conflict-directed backjumping, nogood sets, look-back.
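A Python analogue of the enhancement (much simplified: here the stored information is just the failed partial assignment, where the paper derives more precise conflict information from the dead end):

    # Tiny graph-coloring search: chronological backtracking, enhanced to
    # record "nogoods" at dead ends so equivalent branches are pruned later.
    neighbors = {"a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b"]}
    colors = ["red", "green", "blue"]
    nogoods = set()  # failed partial assignments, stored at dead ends

    def consistent(var, val, assignment):
        return all(assignment.get(n) != val for n in neighbors[var])

    def search(variables, assignment):
        if not variables:
            return assignment
        var, rest = variables[0], variables[1:]
        for val in colors:
            trial = dict(assignment, **{var: val})
            key = frozenset(trial.items())
            if key in nogoods or not consistent(var, val, trial):
                continue               # known dead end or direct conflict
            solution = search(rest, trial)
            if solution:
                return solution
            nogoods.add(key)           # remember: this branch cannot succeed
        return None

    print(search(list(neighbors), {}))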
by justin
Today I finished reading “Fruits Basket #1” by Natsuki Takaya
by justin
Today I read a paper titled “Sequence Prediction based on Monotone Complexity”
The abstract is:
This paper studies sequence prediction based on the monotone Kolmogorov complexity Km = -log m, i.e. based on universal deterministic/one-part MDL.
m is extremely close to Solomonoff’s prior M, the latter being an excellent predictor in deterministic as well as probabilistic environments, where performance is measured in terms of convergence of posteriors or losses.
Despite this closeness to M, it is difficult to assess the prediction quality of m, since little is known about the closeness of their posteriors, which are the important quantities for prediction.
We show that for deterministic computable environments, the “posterior” and losses of m converge, but rapid convergence could only be shown on-sequence; the off-sequence behavior is unclear.
In probabilistic environments, neither the posterior nor the losses converge, in general.
by justin
Today I read a paper titled “A Game Theoretic Framework for Incentives in P2P Systems”
The abstract is:
Peer-To-Peer (P2P) networks are self-organizing, distributed systems, with no centralized authority or infrastructure.
Because of the voluntary participation, the availability of resources in a P2P system can be highly variable and unpredictable.
In this paper, we use ideas from Game Theory to study the interaction of strategic and rational peers, and propose a differential service-based incentive scheme to improve the system’s performance.
by justin
Today I finished reading “The Wee Free Men” by Terry Pratchett