This week I am listening to “Futures” by Jimmy Eat World
Read – Once More* *with footnotes
Today I finished reading “Once More* *with footnotes” by Terry Pratchett
Read – The Man in the Iron Mask
Today I finished reading “The Man in the Iron Mask” by Alexandre Dumas
Paper – Business Intelligence from Web Usage Mining
Today I read a paper titled “Business Intelligence from Web Usage Mining”
The abstract is:
The rapid growth of e-commerce has made both the business community and customers face a new situation.
Due to intense competition on one hand and the customer’s option to choose from several alternatives on the other, the business community has realized the necessity of intelligent marketing strategies and relationship management.
Web usage mining attempts to discover useful knowledge from the secondary data obtained from the interactions of the users with the Web.
Web usage mining has become very critical for effective Web site management, creating adaptive Web sites, business and support services, personalization, network traffic flow analysis and so on.
In this paper, we present the important concepts of Web usage mining and its various practical applications.
We further present a novel approach ‘intelligent-miner’ (i-Miner) to optimize the concurrent architecture of a fuzzy clustering algorithm (to discover web data clusters) and a fuzzy inference system to analyze the Web site visitor trends.
A hybrid evolutionary fuzzy clustering algorithm is proposed in this paper to optimally segregate similar user interests.
The clustered data is then used to analyze the trends using a Takagi-Sugeno fuzzy inference system learned using a combination of evolutionary algorithm and neural network learning.
The proposed approach is compared with self-organizing maps (to discover patterns) and several function approximation techniques like neural networks, linear genetic programming and Takagi-Sugeno fuzzy inference systems (to analyze the clusters).
The results are graphically illustrated and the practical significance is discussed in detail.
Empirical results clearly show that the proposed Web usage-mining framework is efficient.
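The clustering stage of i-Miner is an evolutionary hybrid that I won’t try to reproduce here, but for orientation, below is a minimal sketch of plain fuzzy c-means, the standard baseline that such hybrid fuzzy clustering schemes typically start from. The function name, parameters and example data are mine, not the paper’s.

import numpy as np

def fuzzy_c_means(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means sketch: X is (n, d), c clusters, fuzzifier m."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)                   # random fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]  # membership-weighted means
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / dist ** (2.0 / (m - 1.0))         # closer centre -> higher membership
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
print(centers)   # roughly (0, 0) and (3, 3)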
Paper – How many candidates are needed to make elections hard to manipulate?
Today I read a paper titled “How many candidates are needed to make elections hard to manipulate?”
The abstract is:
In multiagent settings where the agents have different preferences, preference aggregation is a central issue.
Voting is a general method for preference aggregation, but seminal results have shown that all general voting protocols are manipulable.
One could try to avoid manipulation by using voting protocols where determining a beneficial manipulation is hard computationally.
The complexity of manipulating realistic elections where the number of candidates is a small constant was recently studied (Conitzer 2002), but the emphasis was on the question of whether or not a protocol becomes hard to manipulate for some constant number of candidates.
That work, in many cases, left open the question: How many candidates are needed to make elections hard to manipulate? This is a crucial question when comparing the relative manipulability of different voting protocols.
In this paper we answer that question for the voting protocols of the earlier study: plurality, Borda, STV, Copeland, maximin, regular cup, and randomized cup.
We also answer that question for two voting protocols for which no results on the complexity of manipulation have been derived before: veto and plurality with runoff.
It turns out that the voting protocols under study become hard to manipulate at 3 candidates, 4 candidates, 7 candidates, or never.
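As a concrete example of one of the protocols in this list, here is a minimal sketch of the Borda rule; manipulation in this setting means a voter reporting a ranking other than their true one in order to change the winner in their favour. The function name and the example ballots are mine.

from collections import defaultdict

def borda_winner(ballots):
    """Borda rule: each ballot ranks all candidates best-to-worst, and a
    candidate in position i of an m-candidate ballot scores m - 1 - i points."""
    scores = defaultdict(int)
    for ranking in ballots:
        m = len(ranking)
        for position, candidate in enumerate(ranking):
            scores[candidate] += m - 1 - position
    return max(scores, key=scores.get)

# Three voters over candidates a, b, c: b collects the most Borda points.
print(borda_winner([["a", "b", "c"], ["b", "a", "c"], ["b", "c", "a"]]))   # -> b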
Read – AI for Game Developers
Today I finished reading “AI for Game Developers” by David M. Bourg
Listening – Eye To The Telescope
This week I am listening to “Eye To The Telescope” by KT Tunstall
Read – The Catcher in the Rye
Today I finished reading “The Catcher in the Rye” by J.D. Salinger
Read – Pensées
Today I finished reading “Pensées” by Blaise Pascal
Listening – Talkie Walkie
This week I am listening to “Talkie Walkie” by Air
Bad games
In video games, just like in all other products of human toil, 90% of everything is crap.
90% of the games today are crap.
90% of the games in the 1980s were crap.
The reason the games of the 1980s seem so much better is that so many games were being made (they were cheaper and quicker to produce, but by no means easier) and therefore, in absolute terms, more good games were being made.
But 90% of them were still crap.
Studying – Project management
This month I am studying “Project management”
Second month, in which I work through months 5 and 6 of the courseware. Not anticipating any problems. Read a few books. Read a few studies. Take a few multiple choice tests.
Update: All done. Easy enough course. Glad I took it though. Hadn’t realised I’d forgotten some of the fundamentals over the years.
Listening – Chuck
This week I am listening to “Chuck” by Sum 41
Read – C++ Coding Standards
Today I finished reading “C++ Coding Standards: 101 Rules, Guidelines, and Best Practices” by Herb Sutter
Cringe induction
Any creative output you have has a similar trait.
Your programming, your writing, your art, your opinions on life and your old social media status updates.
Your earliest work is nothing short of cringeworthy, and you will find it intensely embarrassing when viewed through the lens of time.
Fortunately for me, the last time I visited my parents’ house I destroyed all the floppy discs with my angst-ridden creative writing so that it would never haunt me in my future career.
Listening – Uh Huh Her
This week I am listening to “Uh Huh Her” by P.J. Harvey
Paper – Benchmarking and Implementation of Probability-Based Simulations on Programmable Graphics Cards
Today I read a paper titled “Benchmarking and Implementation of Probability-Based Simulations on Programmable Graphics Cards”
The abstract is:
The latest Graphics Processing Units (GPUs) are reported to reach up to 200 billion floating point operations per second (200 Gflops) and to have price performance of 0.1 cents per Mflop.
These facts raise great interest in the plausibility of extending the GPUs’ use to non-graphics applications, in particular numerical simulations on structured grids (lattice).
We review previous work on using GPUs for non-graphics applications, implement probability-based simulations on the GPU, namely the Ising and percolation models, implement vector operation benchmarks for the GPU, and finally compare the CPU’s and GPU’s performance.
A general conclusion from the results obtained is that moving computations from the CPU to the GPU is feasible, yielding good time and price performance, for certain lattice computations.
Preliminary results also show that it is feasible to use GPUs in parallel.
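The paper’s contribution is mapping these lattice simulations onto the GPU; as a plain CPU reference point (not the paper’s implementation), here is a minimal sketch of one Metropolis sweep of the 2D Ising model.

import numpy as np

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep of the 2D Ising model on a periodic LxL lattice."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum the four nearest neighbours with periodic boundaries.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
              spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb                     # energy change if this spin flips
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(1)
spins = rng.choice([-1, 1], size=(32, 32))
for _ in range(100):
    metropolis_sweep(spins, beta=0.44, rng=rng)       # near the critical coupling
print(spins.mean())                                   # magnetisation per spin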
Listening – Bows + Arrows
This week I am listening to “Bows + Arrows” by The Walkmen
Listening – Madvillainy
This week I am listening to “Madvillainy” by Madvillain
Read – The Richest Man in Babylon
Today I finished reading “The Richest Man in Babylon” by George S. Clason
Read – Pre-Algebra Demystified
Today I finished reading “Pre-Algebra Demystified” by Allan Bluman
Studying – Project management
This month I am studying “Project management”
Taking a refresher course in project management. Six months of work, including reading material and coursework. Thankfully I don’t have to participate in “class discussions” so I won’t waste my time there.
Update: Got through the first four months of material in the first month. Handed in my essays, did the two tests, read the required books. Ho hum. Should be able to wrap it up easily next month.
Listening – Soviet Kitsch
This week I am listening to “Soviet Kitsch” by Regina Spektor
Paper – Surface Triangulation — The Metric Approach
Today I read a paper titled “Surface Triangulation — The Metric Approach”
The abstract is:
We embark on a program of studying the problem of better approximating surfaces by triangulations (triangular meshes) by considering the approximating triangulations as finite metric spaces and the target smooth surface as their Hausdorff-Gromov limit.
This allows us to define in a more natural way the relevant elements, constants and invariants, such as principal directions and principal values, Gaussian and Mean curvature, etc.
By a “natural way” we mean intrinsic, discrete, metric definitions as opposed to approximating or paraphrasing the differentiable notions.
In this way we hope to circumvent computational errors and, indeed, conceptual ones, that are often inherent to the classical, “numerical” approach.
In this first study we consider the problem of determining the Gaussian curvature of a polyhedral surface, by using the “embedding curvature” in the sense of Wald (and Menger).
We present two modalities of employing these definitions for the computation of Gaussian curvature.
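For orientation, and not taken from the paper itself (which develops Wald’s purely distance-based embedding curvature instead): the best-known discrete notion of Gaussian curvature at a vertex $v$ of a polyhedral surface is the angular defect,

K(v) = \frac{2\pi - \sum_i \theta_i}{A(v)},

where the $\theta_i$ are the angles at $v$ of the triangles incident to $v$ and $A(v)$ is an area share assigned to $v$, commonly one third of the total area of those triangles.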
Read – AI Game Engine Programming
Today I finished reading “AI Game Engine Programming” by Brian Schwab
Read – Going Postal
Today I finished reading “Going Postal” by Terry Pratchett
Listening – Greatest Hits
This week I am listening to “Greatest Hits” by Neil Young
Read – Game Programming Golden Rules
Today I finished reading “Game Programming Golden Rules” by Martin Brownlow
Read – Electronics Demystified
Today I finished reading “Electronics Demystified” by Stan Gibilisco
Listening – Funeral
This week I am listening to “Funeral” by Arcade Fire
Read – Game Programming Gems 4
Today I finished reading “Game Programming Gems 4” by Andrew Kirmse
Paper – Classes of service under perfect competition and technological change: a model for the dynamics of the Internet?
Today I read a paper titled “Classes of service under perfect competition and technological change: a model for the dynamics of the Internet?”
The abstract is:
Certain services may be provided in a continuous, one-dimensional, ordered range of different qualities and a customer requiring a service of quality q can only be offered a quality superior or equal to q.
Only a discrete set of different qualities will be offered, and a service provider will provide the same service (of fixed quality b) to all customers requesting qualities of service inferior or equal to b.
Assuming all services (of quality b) are priced identically, a monopolist will choose the qualities of service and the prices that maximize profit but, under perfect competition, a service provider will choose the (inferior) quality of service that can be priced at the lowest price.
Assuming significant economies of scale, two fundamentally different regimes are possible: either a number of different classes of service are offered (DC regime), or a unique class of service offers an unbounded quality of service (UC regime).
The DC regime appears in one of two sub-regimes: BDC, in which a finite number of classes is offered, the qualities of service offered are bounded and requests for high-quality services are not met; or UDC, in which an infinite number of classes of service are offered and every request is met.
The types of the demand curve and of the economies of scale, not the pace of technological change, determine the regime and the class boundaries.
The price structure in the DC regime obeys very general laws.
Read – A Brief History of Everything
Today I finished reading “A Brief History of Everything” by Ken Wilber
Listening – Love. Angel. Music. Baby.
This week I am listening to “Love. Angel. Music. Baby.” by Gwen Stefani
Paper – Least squares fitting of circles and lines
Today I read a paper titled “Least squares fitting of circles and lines”
The abstract is:
We study theoretical and computational aspects of the least squares fit (LSF) of circles and circular arcs.
First we discuss the existence and uniqueness of LSF and various parametrization schemes.
Then we evaluate several popular circle fitting algorithms and propose a new one that surpasses the existing methods in reliability.
We also discuss and compare direct (algebraic) circle fits.
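The simplest of the direct (algebraic) fits the abstract refers to is the Kåsa fit, which rewrites the circle as x^2 + y^2 + ax + by + c = 0 and solves for (a, b, c) by ordinary linear least squares. A minimal sketch, with variable names and example data of my own:

import numpy as np

def kasa_circle_fit(x, y):
    """Kåsa algebraic circle fit: solve x^2 + y^2 + a*x + b*y + c = 0 in the
    least-squares sense, then recover the centre and radius."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x**2 + y**2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2, -b / 2
    r = np.sqrt(cx**2 + cy**2 - c)
    return cx, cy, r

# Noisy points on a circle of radius 2 centred at (1, -1).
t = np.linspace(0, 2 * np.pi, 50)
x = 1 + 2 * np.cos(t) + 0.01 * np.random.randn(50)
y = -1 + 2 * np.sin(t) + 0.01 * np.random.randn(50)
print(kasa_circle_fit(x, y))   # approximately (1, -1, 2)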
Read – Oliver Twist
Today I finished reading “Oliver Twist” by Charles Dickens
Paper – An effective Procedure for Speeding up Algorithms
Today I read a paper titled “An effective Procedure for Speeding up Algorithms”
The abstract is:
The provably asymptotically fastest algorithm within a factor of 5 for formally described problems will be constructed.
The main idea is to enumerate all programs provably equivalent to the original problem by enumerating all proofs.
The algorithm could be interpreted as a generalization and improvement of Levin search, which is, within a multiplicative constant, the fastest algorithm for inverting functions.
Blum’s speed-up theorem is avoided by taking into account only programs for which a correctness proof exists.
Furthermore, it is shown that the fastest program that computes a certain function is also one of the shortest programs provably computing this function.
To quantify this statement, the definition of Kolmogorov complexity is extended, and two new natural measures for the complexity of a function are defined.
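For orientation, the property of Levin search the abstract alludes to is usually stated like this (a commonly quoted bound, not a quote from the paper): if p is any program that inverts the function on input x in time t_p(x), then Levin search finds a solution in time roughly

t_{\mathrm{LS}}(x) \le 2^{\ell(p)} \cdot t_p(x)

up to a small constant factor, where \ell(p) is the length of p in bits. The factor 2^{\ell(p)} can be astronomical, but it does not depend on x, which is the sense in which Levin search is the fastest inverter “within a multiplicative constant”.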
Read – Cannery Row
Today I finished reading “Cannery Row” by John Steinbeck
Listening – Aha Shake Heartbreak
This week I am listening to “Aha Shake Heartbreak” by Kings Of Leon
Paper – The Guidebook, the Friend, and the Room: Visitor Experience in a Historic House
Today I read a paper titled “The Guidebook, the Friend, and the Room: Visitor Experience in a Historic House”
The abstract is:
In this paper, we describe an electronic guidebook prototype and report on a study of its use in a historic house.
Supported by mechanisms in the guidebook, visitors constructed experiences that had a high degree of interaction with three entities: the guidebook, their companions, and the house and its contents.
For example, we found that most visitors played audio descriptions through speakers (rather than using headphones or reading textual descriptions) to facilitate communication with their companions.
Read – Usagi Yojimbo #18: Travels with Jotaro
Today I finished reading “Usagi Yojimbo #18: Travels with Jotaro” by Stan Sakai
Studying – Design aesthetics for web design
This month I am studying “Design aesthetics for web design”
Listening – American Idiot
This week I am listening to “American Idiot” by Green Day
Read – Rich Dad’s Retire Young, Retire Rich
Today I finished reading “Rich Dad’s Retire Young, Retire Rich: How to Get Rich Quickly and Stay Rich Forever!” by Robert T. Kiyosaki
Read – Dungeons & Dragons: Monster Manual
Today I finished reading “Dungeons & Dragons: Monster Manual” by Skip Williams
Listening – Feast Of Wire
This week I am listening to “Feast Of Wire” by Calexico
Read – The Surgeon’s Daughter
Today I finished reading “The Surgeon’s Daughter” by Walter Scott
Listening – Room On Fire
This week I am listening to “Room On Fire” by The Strokes
Paper – Neuro Fuzzy Systems: State-of-the-Art Modeling Techniques
Today I read a paper titled “Neuro Fuzzy Systems: State-of-the-Art Modeling Techniques”
The abstract is:
Fusion of Artificial Neural Networks (ANN) and Fuzzy Inference Systems (FIS) has attracted the growing interest of researchers in various scientific and engineering areas due to the growing need for adaptive intelligent systems to solve real world problems.
ANN learns from scratch by adjusting the interconnections between layers.
FIS is a popular computing framework based on the concept of fuzzy set theory, fuzzy if-then rules, and fuzzy reasoning.
The advantages of a combination of ANN and FIS are obvious.
There are several approaches to integrating ANN and FIS, and very often the choice depends on the application.
We broadly classify the integration of ANN and FIS into three categories, namely the concurrent model, the cooperative model and the fully fused model.
This paper starts with a discussion of the features of each model and generalizes the advantages and deficiencies of each model.
We further focus the review on the different types of fused neuro-fuzzy systems, citing the advantages and disadvantages of each model.
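Both this survey and the i-Miner paper above lean on Takagi-Sugeno inference, so here is a minimal sketch of a two-rule first-order Takagi-Sugeno system; the membership functions and rule coefficients are made up purely for illustration.

import numpy as np

def gaussian_mf(x, centre, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - centre) / sigma) ** 2)

def takagi_sugeno(x):
    """Two first-order Takagi-Sugeno rules:
       IF x is LOW  THEN y = 0.5*x + 1
       IF x is HIGH THEN y = 2.0*x - 3
    The output is the firing-strength-weighted average of the rule consequents."""
    w_low = gaussian_mf(x, centre=0.0, sigma=2.0)
    w_high = gaussian_mf(x, centre=10.0, sigma=2.0)
    y_low = 0.5 * x + 1
    y_high = 2.0 * x - 3
    return (w_low * y_low + w_high * y_high) / (w_low + w_high)

print(takagi_sugeno(2.0))   # the LOW rule dominates, so the output is close to 2.0

In a fused neuro-fuzzy system, the membership centres, widths and consequent coefficients are the parameters that the neural (or, in i-Miner, evolutionary) learning adjusts.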
Read – The Trigger
Today I finished reading “The Trigger” by Arthur C. Clarke