Read – Rich Dad’s Retire Young, Retire Rich
Today I finished reading “Rich Dad’s Retire Young, Retire Rich: How to Get Rich Quickly and Stay Rich Forever!” by Robert T. Kiyosaki
Archives for 2004
Read – Dungeons & Dragons: Monster Manual
Today I finished reading “Dungeons & Dragons: Monster Manual” by Skip Williams
Listening – Feast Of Wire
This week I am listening to “Feast Of Wire” by Calexico
Read – The Surgeon’s Daughter
Today I finished reading “The Surgeon’s Daughter” by Walter Scott
Listening – Room On Fire
This week I am listening to “Room On Fire” by The Strokes
Paper – Neuro Fuzzy Systems: State-of-the-Art Modeling Techniques
Today I read a paper titled “Neuro Fuzzy Systems: State-of-the-Art Modeling Techniques”
The abstract is:
Fusion of Artificial Neural Networks (ANN) and Fuzzy Inference Systems (FIS) has attracted growing interest from researchers in various scientific and engineering areas due to the growing need for adaptive intelligent systems to solve real-world problems.
ANN learns from scratch by adjusting the interconnections between layers.
FIS is a popular computing framework based on the concept of fuzzy set theory, fuzzy if-then rules, and fuzzy reasoning.
The advantages of a combination of ANN and FIS are obvious.
There are several approaches to integrating ANN and FIS, and the choice very often depends on the application.
We broadly classify the integration of ANN and FIS into three categories, namely the concurrent model, the cooperative model, and the fully fused model.
This paper starts with a discussion of the features of each model and generalizes the advantages and deficiencies of each.
We then focus the review on the different types of fused neuro-fuzzy systems, citing the advantages and disadvantages of each model.
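To keep the “fused” category concrete in my head, here is a minimal sketch (my own illustration, not the paper’s system): a Sugeno-style rule base whose centers, widths, and consequents are exactly the parameters a fused neuro-fuzzy system would tune like network weights.

```python
import math

# Minimal sketch (my illustration, not from the paper): a tiny Sugeno-style
# rule base. In a fused neuro-fuzzy system the membership parameters
# (center c, width s) and consequents w are adjusted from data, the way a
# neural network adjusts its weights.

def gaussian_mf(x, c, s):
    """Membership degree of x in a fuzzy set with center c and width s."""
    return math.exp(-((x - c) ** 2) / (2 * s ** 2))

def fire(x, rules):
    """Weighted-average output of rules of the form (center, width, consequent)."""
    num = den = 0.0
    for c, s, w in rules:
        mu = gaussian_mf(x, c, s)
        num += mu * w
        den += mu
    return num / den if den else 0.0

rules = [(0.0, 1.0, -1.0), (5.0, 1.0, 1.0)]  # "IF x is Low THEN -1", "IF x is High THEN 1"
print(fire(2.5, rules))  # halfway between the two sets, so output near 0
```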
Read – The Trigger
Today I finished reading “The Trigger” by Arthur C. Clarke
Paper – Expected Qualitative Utility Maximization
Today I read a paper titled “Expected Qualitative Utility Maximization”
The abstract is:
A model for decision making that generalizes Expected Utility Maximization is presented.
This model, Expected Qualitative Utility Maximization, encompasses the Maximin criterion.
It relaxes both the Independence and the Continuity postulates.
Its main ingredient is the definition of a qualitative order on nonstandard models of the real numbers and the consideration of nonstandard utilities.
Expected Qualitative Utility Maximization is characterized by an original weakening of von Neumann-Morgenstern’s postulates.
Subjective probabilities may be defined from those weakened postulates, as Anscombe and Aumann did from the original postulates.
Subjective probabilities are numbers, not matrices as in the Subjective Expected Lexicographic Utility approach.
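For reference, the two criteria being bridged, in their textbook forms (my notation, not the paper’s): expected utility maximization picks the act with the best probability-weighted average payoff, while maximin picks the act with the best worst case.

```latex
\[
\text{EU:}\quad \arg\max_{a}\; \sum_{s} p(s)\, u(a, s)
\qquad
\text{Maximin:}\quad \arg\max_{a}\; \min_{s}\, u(a, s)
\]
```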
Listening – Guitar Romantic
This week I am listening to “Guitar Romantic” by The Exploding Hearts
Paper – Improved Heterogeneous Distance Functions
Today I read a paper titled “Improved Heterogeneous Distance Functions”
The abstract is:
Instance-based learning techniques typically handle continuous and linear input values well, but often do not handle nominal input attributes appropriately.
The Value Difference Metric (VDM) was designed to find reasonable distance values between nominal attribute values, but it largely ignores continuous attributes, requiring discretization to map continuous values into nominal values.
This paper proposes three new heterogeneous distance functions, called the Heterogeneous Value Difference Metric (HVDM), the Interpolated Value Difference Metric (IVDM), and the Windowed Value Difference Metric (WVDM).
These new distance functions are designed to handle applications with nominal attributes, continuous attributes, or both.
In experiments on 48 applications the new distance metrics achieve higher classification accuracy on average than three previous distance functions on those datasets that have both nominal and continuous attributes.
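The HVDM in particular is easy to sketch. Here is a minimal version as I understand it (my simplification; the 4-sigma normalization and the nominal-attribute term should be checked against the paper’s exact definition):

```python
import math

# Sketch of the HVDM idea (simplified): continuous attributes contribute a
# range-normalized difference |x - y| / (4 * sigma_a); nominal attributes
# contribute a value-difference term built from per-class frequencies.

def vdm_term(a, x, y, cond_prob, classes):
    """Value-difference between nominal values x, y of attribute a.
    cond_prob[(a, v, c)] = P(class = c | attribute a has value v)."""
    return math.sqrt(sum(
        (cond_prob.get((a, x, c), 0.0) - cond_prob.get((a, y, c), 0.0)) ** 2
        for c in classes))

def hvdm(xs, ys, kinds, sigma, cond_prob, classes):
    """kinds[a] is 'continuous' or 'nominal'; sigma[a] is the standard
    deviation of continuous attribute a over the training data."""
    total = 0.0
    for a, (x, y) in enumerate(zip(xs, ys)):
        if kinds[a] == 'continuous':
            d = abs(x - y) / (4 * sigma[a])
        else:
            d = vdm_term(a, x, y, cond_prob, classes)
        total += d * d
    return math.sqrt(total)
```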
Paper – Fast Approximation of Centrality
Today I read a paper titled “Fast Approximation of Centrality”
The abstract is:
Social studies researchers use graphs to model group activities in social networks.
An important property in this context is the centrality of a vertex: the inverse of the average distance to each other vertex.
We describe a randomized approximation algorithm for centrality in weighted graphs.
For graphs exhibiting the small world phenomenon, our method estimates the centrality of all vertices with high probability within a (1+epsilon) factor in near-linear time.
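The sampling idea is simple enough to sketch (my own simplification: unweighted BFS on a connected graph, whereas the paper handles weighted graphs): estimate each vertex’s average distance from k random pivots instead of from all n vertices.

```python
import random
from collections import deque

# Sketch of the sampling scheme: run a single-source shortest-path search
# from k random pivots, then estimate each vertex's centrality as the
# inverse of its estimated average distance. Assumes a connected graph.

def bfs_distances(graph, src):
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def approx_centrality(graph, k):
    nodes = list(graph)
    n = len(nodes)
    total = {v: 0 for v in nodes}
    for pivot in random.sample(nodes, k):
        d = bfs_distances(graph, pivot)
        for v in nodes:
            total[v] += d[v]
    # estimated sum of distances is (n/k) * total[v]; centrality is the
    # inverse of the average distance, i.e. k*(n-1) / (n * total[v])
    return {v: (k * (n - 1)) / (n * total[v]) if total[v] else 0.0
            for v in nodes}
```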
Studying – Photo restoration techniques
This month I am studying “Photo restoration techniques”
Paper – Min-Max Fine Heaps
Today I read a paper titled “Min-Max Fine Heaps”
The abstract is:
In this paper we present a new data structure for double-ended priority queues, called the min-max fine heap, which combines the techniques used in the fine heap and the traditional min-max heap.
The standard operations on this proposed structure are also presented, and their analysis indicates that the new structure outperforms the traditional one.
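For my notes, the underlying min-max heap layout is easy to sketch (the “fine heap” refinement itself is not shown here): even levels are min levels and odd levels are max levels, so both extremes sit near the root.

```python
import math

# Sketch of the classic min-max heap layout the paper builds on: the root
# level (level 0) and every even level obey the min-heap property; odd
# levels obey the max-heap property.

def is_min_level(i):
    """True if array index i (root = 0) sits on a min level."""
    return int(math.log2(i + 1)) % 2 == 0

def find_min(heap):
    return heap[0]                        # the root is the minimum

def find_max(heap):
    # the maximum is the root (single element) or one of its two children
    return heap[0] if len(heap) == 1 else max(heap[1:3])

heap = [2, 80, 70, 10, 40, 30, 5]         # a valid min-max heap, level order
print(find_min(heap), find_max(heap))     # 2 80
```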
Read – Don’t Make Me Think, Revisited
Today I finished reading “Don’t Make Me Think, Revisited: A Common Sense Approach to Web Usability” by Steve Krug
Listening – Transatlanticism
This week I am listening to “Transatlanticism” by Death Cab For Cutie
Read – Founders at Work
Today I finished reading “Founders at Work: Stories of Startups’ Early Days” by Jessica Livingston
Listening – Fever To Tell
This week I am listening to “Fever To Tell” by Yeah Yeah Yeahs
Read – Game Development Essentials
Today I finished reading “Game Development Essentials: An Introduction” by Jeannie Novak
Read – Sons and Lovers
Today I finished reading “Sons and Lovers” by D.H. Lawrence
Listening – Happy Songs For Happy People
This week I am listening to “Happy Songs For Happy People” by Mogwai
Read – The Universe in a Nutshell
Today I finished reading “The Universe in a Nutshell” by Stephen Hawking
Listening – Fire
This week I am listening to “Fire” by Electric Six
Studying – Brochure graphic design
This month I am studying “Brochure graphic design”
Listening – Everything Goes Numb
This week I am listening to “Everything Goes Numb” by Streetlight Manifesto
Read – The Sandman: Endless Nights
Today I finished reading “The Sandman: Endless Nights” by Neil Gaiman
Read – The Automatic Millionaire
Today I finished reading “The Automatic Millionaire: A Powerful One-Step Plan to Live and Finish Rich” by David Bach
Listening – Elephant
This week I am listening to “Elephant” by The White Stripes
Paper – A Neuro-Fuzzy Approach for Modelling Electricity Demand in Victoria
Today I read a paper titled “A Neuro-Fuzzy Approach for Modelling Electricity Demand in Victoria”
The abstract is:
Neuro-fuzzy systems have attracted growing interest of researchers in various scientific and engineering areas due to the increasing need of intelligent systems.
This paper evaluates the use of two popular soft computing techniques and a conventional statistical approach based on the Box–Jenkins autoregressive integrated moving average (ARIMA) model to predict electricity demand in the State of Victoria, Australia.
The soft computing methods considered are an evolving fuzzy neural network (EFuNN) and an artificial neural network (ANN) trained using scaled conjugate gradient algorithm (CGA) and backpropagation (BP) algorithm.
The forecast accuracy is compared with the forecasts used by Victorian Power Exchange (VPX) and the actual energy demand.
To evaluate the models, we considered load demand patterns for 10 consecutive months, sampled every 30 minutes, for training the different prediction models.
Test results show that the neuro-fuzzy system performed better than the neural networks, the ARIMA model, and the VPX forecasts.
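For my own notes, a minimal sketch of the ARIMA-baseline side of such a comparison (illustrative model order and synthetic data; nothing here reproduces the paper’s actual specification):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Illustrative only: fit an ARIMA model to synthetic half-hourly load data
# with a daily cycle, then forecast the next day (48 half-hour steps).
rng = np.random.default_rng(0)
t = np.arange(48 * 60)  # two months of half-hourly samples, for speed
load = 100 + 20 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 2, t.size)

model = ARIMA(load, order=(2, 1, 2))   # (p, d, q) chosen for illustration
fitted = model.fit()
forecast = fitted.forecast(steps=48)
print(forecast[:5])
```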
Paper – Near Rationality and Competitive Equilibria in Networked Systems
Today I read a paper titled “Near Rationality and Competitive Equilibria in Networked Systems”
The abstract is:
A growing body of literature in networked systems research relies on game theory and mechanism design to model and address the potential lack of cooperation between self-interested users.
Most game-theoretic models applied to system research only describe competitive equilibria in terms of pure Nash equilibria, that is, a situation where the strategy of each user is deterministic, and is her best response to the strategies of all the other users.
However, the assumptions necessary for a pure Nash equilibrium to hold may be too stringent for practical systems.
Using three case studies on computer security, TCP congestion control, and network formation, we outline the limits of game-theoretic models relying on Nash equilibria, and we argue that considering competitive equilibria of a more general form may help reconcile predictions from game-theoretic models with empirically observed behavior.
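For reference, the pure Nash condition the abstract alludes to, in its textbook form (my notation): no player i can improve her utility u_i by unilaterally deviating from her strategy s_i*, given everyone else’s strategies s_{-i}*.

```latex
\[
u_i(s_i^{*}, s_{-i}^{*}) \;\ge\; u_i(s_i, s_{-i}^{*})
\qquad \text{for every player } i \text{ and every strategy } s_i
\]
```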
Paper – Generative Programming of Graphical User Interfaces
Today I read a paper titled “Generative Programming of Graphical User Interfaces”
The abstract is:
Generative Programming (GP) is a computing paradigm allowing automatic creation of entire software families utilizing the configuration of elementary and reusable components.
GP can be applied to different technologies, e.g. C++ templates, JavaBeans, Aspect-Oriented Programming (AOP), or Frame technology.
This paper focuses on Frame technology, which supports the implementation and completion of software components.
The purpose of this paper is to introduce the GP paradigm in the area of GUI application generation.
It demonstrates how automatically customized executable applications with GUI parts can be generated from an abstract specification.
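To make that concrete for myself, here is a toy sketch (my illustration, not the paper’s Frame-technology processor) in which an abstract specification is expanded into concrete widget-construction code by filling template slots; the emitted `create_window`/`add_widget` calls are hypothetical.

```python
# Toy sketch of generative GUI construction: an abstract spec is expanded
# into concrete code by filling slots in a template. The generated calls
# (create_window, add_widget) are hypothetical target-API names.

SPEC = {
    "window": "Settings",
    "fields": [("Name", "text"), ("Volume", "slider")],
}

TEMPLATE = 'add_widget(kind="{kind}", label="{label}", parent="{window}")'

def generate(spec):
    lines = [f'window = create_window(title="{spec["window"]}")']
    for label, kind in spec["fields"]:
        lines.append(TEMPLATE.format(kind=kind, label=label,
                                     window=spec["window"]))
    return "\n".join(lines)

print(generate(SPEC))  # emits customized GUI code from the abstract spec
```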
Listening – Meteora
This week I am listening to “Meteora” by Linkin Park
Equality does not mean equal
I am flabbergasted by the kerfuffle that people raise when they do not achieve an equal result when given an equal opportunity.
Equality means equal opportunity, equal access, equal treatment.
Equality does not mean “equal result.”
Paper – Optimally cutting a surface into a disk
Today I read a paper titled “Optimally cutting a surface into a disk”
The abstract is:
We consider the problem of cutting a set of edges on a polyhedral manifold surface, possibly with boundary, to obtain a single topological disk, minimizing either the total number of cut edges or their total length.
We show that this problem is NP-hard, even for manifolds without boundary and for punctured spheres.
We also describe an algorithm with running time n^{O(g+k)}, where n is the combinatorial complexity, g is the genus, and k is the number of boundary components of the input surface.
Finally, we describe a greedy algorithm that outputs a O(log^2 g)-approximation of the minimum cut graph in O(g^2 n log n) time.
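A standard topological fact (not from the paper) that helps see why genus and boundary components drive the difficulty: the cut graph must absorb all of the surface’s topology, and that topology is measured by the Euler characteristic of an orientable surface Σ with genus g and k boundary components,

```latex
\[
\chi(\Sigma) \;=\; 2 - 2g - k,
\]
```

while a disk has χ = 1.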
Paper – Fast Universalization of Investment Strategies with Provably Good Relative Returns
Today I read a paper titled “Fast Universalization of Investment Strategies with Provably Good Relative Returns”
The abstract is:
A universalization of a parameterized investment strategy is an online algorithm whose average daily performance approaches that of the strategy operating with the optimal parameters determined offline in hindsight.
We present a general framework for universalizing investment strategies and discuss conditions under which investment strategies are universalizable.
We present examples of common investment strategies that fit into our framework.
The examples include both trading strategies that decide positions in individual stocks, and portfolio strategies that allocate wealth among multiple stocks.
This work extends Cover’s universal portfolio work.
We also discuss the runtime efficiency of universalization algorithms.
While a straightforward implementation of our algorithms runs in time exponential in the number of parameters, we show that the efficient universal portfolio computation technique of Kalai and Vempala involving the sampling of log-concave functions can be generalized to other classes of investment strategies.
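For context, Cover’s universal portfolio (the standard formulation, stated from memory rather than from this paper) chooses tomorrow’s allocation as the wealth-weighted average over all constant-rebalanced portfolios b in the simplex Δ, where S_t(b) is the wealth b would have earned through day t:

```latex
\[
\hat{b}_{t+1} \;=\; \frac{\int_{\Delta} b \, S_t(b) \, db}{\int_{\Delta} S_t(b) \, db}
\]
```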
Paper – Artificial Neural Networks for Beginners
Today I read a paper titled “Artificial Neural Networks for Beginners”
The abstract is:
The scope of this teaching package is to provide a brief introduction to Artificial Neural Networks (ANNs) for people who have no previous knowledge of them.
We first give a brief introduction to models of networks, and then describe ANNs in general terms.
As an application, we explain the backpropagation algorithm, since it is widely used and many other algorithms are derived from it.
The user should know algebra and the handling of functions and vectors.
Differential calculus is recommendable, but not necessary.
The contents of this package should be understood by people with high school education.
It would be useful for people who are just curious about what ANNs are, or for people who want to become familiar with them, so that when they study them more fully, they will already have clear notions of ANNs.
Also, people who only want to apply the backpropagation algorithm without a detailed and formal explanation of it will find this material useful.
This work should not be seen as “Nets for dummies”, but of course it is not a treatise.
Much of the formality is skipped for the sake of simplicity.
Detailed explanations and demonstrations can be found in the referred readings.
The included exercises complement the understanding of the theory.
The on-line resources are highly recommended for extending this brief introduction.
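In that spirit, here is my own minimal backpropagation example (not from the package): a two-layer network learning XOR with plain gradient descent.

```python
import numpy as np

# Minimal backpropagation sketch: a 2-4-1 sigmoid network trained on XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 4))
b1 = np.zeros(4)
W2 = rng.normal(0, 1, (4, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # backward pass: chain rule through the squared error
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # gradient descent updates (learning rate 0.5)
    W2 -= 0.5 * h.T @ d_out
    b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h
    b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2))  # approaches [[0], [1], [1], [0]]
```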
Listening – Train Of Thought
This week I am listening to “Train Of Thought” by Dream Theater
Read – Don’t Send a Resume
Today I finished reading “Don’t Send a Resume: And Other Contrarian Rules to Help Land a Great Job” by Jeffrey Fox
Listening – Long Gone Before Daylight
This week I am listening to “Long Gone Before Daylight” by The Cardigans
Read – Harry Potter and the Order of the Phoenix
Today I finished reading “Harry Potter and the Order of the Phoenix” by J.K. Rowling
Read – Beginner’s Guide to DarkBASIC Game Programming
Today I finished reading “Beginner’s Guide to DarkBASIC Game Programming” by Jonathan S. Harbour
Read – The League of Extraordinary Gentlemen #2
Today I finished reading “The League of Extraordinary Gentlemen #2” by Alan Moore
Ice Boots
Let’s make boots either from a substance whose surface friction changes based on an electrical charge, or from two substances, one with very low friction and the other with very high friction.
The high-friction surface could be disabled remotely, allowing the low-friction surface to come into contact with the ground.
Why?
Because kids wear boots with little wheels in them that act like roller skates.
So how about a pair of “ice skates” on their feet instead?
These ice skates would work on almost any flat surface.
It would be like sliding around in your socks on a highly polished floor.
You could also create a set of gloves out of this.
It could well turn into the new extreme sport.
Studying – Victorian advertising – recreating the ads of the past
This month I am studying “Victorian advertising – recreating the ads of the past”
Read – Are We Spiritual Machines?
Today I finished reading “Are We Spiritual Machines?: Ray Kurzweil vs. the Critics of Strong AI” by Ray Kurzweil
Listening – Blackout
This week I am listening to “Blackout” by Dropkick Murphys
Planned obsolescence
I whipped up a quick plugin to inject random posting dates for future updates into my blog posts.
I needed a plugin that would pick a random date between today’s date and a distant date in the future (about 2 years from now).
The plugin then needed to verify that the date is not already used on another post, and then schedule the new post.
I was showing off the functionality to a friend when they pointed out: “This plugin only has a range that goes up to the year 2200. You figure you’ll be dead by then?”
I thought for a moment and then replied: “No, I just figured some other technology will have replaced WordPress by then.”
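For the curious, the date-picking logic looks roughly like this when sketched in Python (the real thing is a WordPress plugin in PHP; `already_scheduled` is a hypothetical stand-in for the lookup against existing posts):

```python
import random
from datetime import date, timedelta

# Sketch of the plugin's scheduling logic: pick a random future date
# within the horizon and retry until it isn't already taken by a post.

def pick_publish_date(already_scheduled, horizon_days=730):
    """Pick an unused random date between tomorrow and ~2 years out."""
    while True:
        candidate = date.today() + timedelta(days=random.randint(1, horizon_days))
        if candidate not in already_scheduled:
            return candidate

print(pick_publish_date(set()))
```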
Read – Game Audio Programming
Today I finished reading “Game Audio Programming” by James Boer
Paper – The Geometric Maximum Traveling Salesman Problem
Today I read a paper titled “The Geometric Maximum Traveling Salesman Problem”
The abstract is:
We consider the traveling salesman problem when the cities are points in R^d for some fixed d and distances are computed according to geometric distances, determined by some norm.
We show that for any polyhedral norm, the problem of finding a tour of maximum length can be solved in polynomial time.
If arithmetic operations are assumed to take unit time, our algorithms run in time O(n^{f-2} log n), where f is the number of facets of the polyhedron determining the polyhedral norm.
Thus for example we have O(n^2 log n) algorithms for the cases of points in the plane under the Rectilinear and Sup norms.
This is in contrast to the fact that finding a minimum length tour in each case is NP-hard.
Our approach can be extended to the more general case of quasi-norms with not necessarily symmetric unit ball, where we get a complexity of O(n^{2f-2} log n).
For the special case of two-dimensional metrics with f=4 (which includes the Rectilinear and Sup norms), we present a simple algorithm with O(n) running time.
The algorithm does not use any indirect addressing, so its running time remains valid even in comparison based models in which sorting requires Omega(n \log n) time.
The basic mechanism of the algorithm provides some intuition on why polyhedral norms allow fast algorithms.
Complementing the results on simplicity for polyhedral norms, we prove that for the case of Euclidean distances in R^d for d>2, the Maximum TSP is NP-hard.
This sheds new light on the well-studied difficulties of Euclidean distances.
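A note to myself on the terminology (standard definition, not from the paper): a polyhedral norm is one whose unit ball is a polyhedron with f facets, so the norm is a maximum of finitely many linear functionals, one per facet normal a_i:

```latex
\[
\|x\| \;=\; \max_{1 \le i \le f} \langle a_i, x \rangle
\]
```

The Rectilinear and Sup norms in the plane each have f = 4, which is where the O(n) special case applies.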
Paper – Neural Network Methods for Boundary Value Problems Defined in Arbitrarily Shaped Domains
Today I read a paper titled “Neural Network Methods for Boundary Value Problems Defined in Arbitrarily Shaped Domains”
The abstract is:
Partial differential equations (PDEs) with Dirichlet boundary conditions defined on boundaries with simple geometry have been successfully treated using sigmoidal multilayer perceptrons in previous works.
This article deals with the case of complex boundary geometry, where the boundary is determined by a number of points that belong to it and are closely located, so as to offer a reasonable representation.
Two networks are employed: a multilayer perceptron and a radial basis function network.
The latter is used to account for the satisfaction of the boundary conditions.
The method has been successfully tested on two-dimensional and three-dimensional PDEs and has yielded accurate solutions.
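As I read the abstract, the construction looks roughly like this (my notation, so treat it as a guess at the paper’s formulation): a multilayer perceptron N(x; θ) handles the PDE in the interior, while a radial basis function term built on the sampled boundary points x_1, …, x_m corrects the boundary values.

```latex
\[
u(\mathbf{x}) \;\approx\; N(\mathbf{x}; \theta) \;+\; \sum_{i=1}^{m} \lambda_i \, \varphi\!\left(\lVert \mathbf{x} - \mathbf{x}_i \rVert\right)
\]
```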
Read – Pattern Recognition
Today I finished reading “Pattern Recognition” by William Gibson