Today I finished reading “The 4-Hour Chef: The Simple Path to Cooking Like a Pro, Learning Anything, and Living the Good Life” by Timothy Ferriss
Archives for 2013
Read – UX for Lean Startups
Today I finished reading “UX for Lean Startups” by Laura Klein
Listening – Light Up Gold
This week I am listening to “Light Up Gold” by Parquet Courts
Read – The Long Earth
Today I finished reading “The Long Earth” by Terry Pratchett
Paper – Social Networks and Spin Glasses
Today I read a paper titled “Social Networks and Spin Glasses”
The abstract is:
The networks formed from the links between telephones observed in a month’s call detail records (CDRs) in the UK are analyzed, looking for the characteristics thought to identify a communications network or a social network.
Some novel methods are employed.
We find similarities to both types of network.
We conclude that, just as analogies to spin glasses have proved fruitful for optimization of large scale practical problems, there will be opportunities to exploit a statistical mechanics of the formation and dynamics of social networks in today’s electronically connected world.
Paper – Entropy-based Tuning of Musical Instruments
Today I read a paper titled “Entropy-based Tuning of Musical Instruments”
The abstract is:
The human sense of hearing perceives a combination of sounds ‘in tune’ if the corresponding harmonic spectra are correlated, meaning that the neuronal excitation pattern in the inner ear exhibits some kind of order.
Based on this observation it is suggested that musical instruments such as pianos can be tuned by minimizing the Shannon entropy of suitably preprocessed Fourier spectra.
This method reproduces not only the correct stretch curve but also pitch fluctuations similar to those found in high-quality aural tuning.
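Out of curiosity I sketched the objective for myself. This is my own toy illustration in Python, not the authors' code: treat the normalized power spectrum as a probability distribution and compute its Shannon entropy, the quantity the paper proposes to minimize over candidate tunings.

```python
import numpy as np

def spectral_entropy(samples: np.ndarray) -> float:
    """Shannon entropy of the normalized power spectrum of an audio buffer."""
    power = np.abs(np.fft.rfft(samples)) ** 2
    p = power / power.sum()          # treat the spectrum as a probability distribution
    p = p[p > 0]                     # drop empty bins to avoid log(0)
    return float(-np.sum(p * np.log2(p)))

def harmonic_tone(f0: float, t: np.ndarray, partials: int = 10) -> np.ndarray:
    """A simple harmonic-rich test tone (no inharmonicity modeled)."""
    return sum(np.sin(2 * np.pi * k * f0 * t) / k for k in range(1, partials + 1))

t = np.linspace(0, 1, 44100, endpoint=False)
octave  = harmonic_tone(220, t) + harmonic_tone(440, t)   # partials coincide
detuned = harmonic_tone(220, t) + harmonic_tone(447, t)   # partials do not coincide
print(spectral_entropy(octave), spectral_entropy(detuned))  # the in-tune pair is lower
```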
Paper – Multi-command Tactile and Auditory Brain Computer Interface based on Head Position Stimulation
Today I read a paper titled “Multi-command Tactile and Auditory Brain Computer Interface based on Head Position Stimulation”
The abstract is:
We study the extent to which vibrotactile stimuli delivered to the head of a subject can serve as a platform for a brain computer interface (BCI) paradigm.
Six head positions are used to evoke combined somatosensory and auditory (via the bone conduction effect) brain responses, in order to define a multimodal tactile and auditory brain computer interface (taBCI).
Experimental results of subjects performing online taBCI, using stimuli with a moderately fast inter-stimulus interval (ISI), validate the taBCI paradigm, while the feasibility of the concept is illuminated through information transfer rate case studies.
Paper – Acoustic Communication for Medical Nanorobots
Today I read a paper titled “Acoustic Communication for Medical Nanorobots”
The abstract is:
Communication among microscopic robots (nanorobots) can coordinate their activities for biomedical tasks.
The feasibility of in vivo ultrasonic communication is evaluated for micron-size robots broadcasting into various types of tissues.
Frequencies between 10MHz and 300MHz give the best tradeoff between efficient acoustic generation and attenuation for communication over distances of about 100 microns.
Based on these results, we find power available from ambient oxygen and glucose in the bloodstream can readily support communication rates of about 10,000 bits/second between micron-sized robots.
We discuss techniques, such as directional acoustic beams, that can increase this rate.
The acoustic pressure fields enabling this communication are unlikely to damage nearby tissue, and short bursts at considerably higher power could be of therapeutic use.
Read – The Referral Engine
Today I finished reading “The Referral Engine: Teaching Your Business to Market Itself” by John Jantsch
Listening – Some Nights
This week I am listening to “Some Nights” by fun.
Paper – An Innovative Scheme For Effectual Fingerprint Data Compression Using Bezier Curve Representations
Today I read a paper titled “An Innovative Scheme For Effectual Fingerprint Data Compression Using Bezier Curve Representations”
The abstract is:
Naturally, with the mounting application of biometric systems, difficulties arise in storing and handling the acquired biometric data.
Fingerprint recognition is recognized as one of the most mature and established techniques among all biometric systems.
In recent times, with fingerprint recognition receiving increasingly more attention, the volume of fingerprints collected has been creating enormous problems in storage and transmission.
Hence, the compression of fingerprints has emerged as an indispensable step in automated fingerprint recognition systems.
Several researchers have presented approaches for fingerprint image compression.
In this paper, we propose a novel and efficient scheme for fingerprint image compression.
The presented scheme utilizes the Bezier curve representations for effective compression of fingerprint images.
Initially, the ridges present in the fingerprint image are extracted along with their coordinate values using the approach presented.
Subsequently, the control points are determined for all the ridges by visualizing each ridge as a Bezier curve.
The control points of all the ridges determined are stored and are used to represent the fingerprint image.
When needed, the fingerprint image is reconstructed from the stored control points using Bezier curves.
The quality of the reconstructed fingerprint is determined by a formal evaluation.
The proposed scheme achieves considerable memory reduction in storing the fingerprint.
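Just to make the representation concrete for myself, here is a minimal Python sketch of the reconstruction step. It is my own illustration, not the authors' implementation: the control points are made up and the ridge-extraction step is not shown; a stored ridge is simply regenerated by evaluating its Bezier curve with de Casteljau's algorithm.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def de_casteljau(control_points: List[Point], t: float) -> Point:
    """Evaluate a Bezier curve of any degree at parameter t in [0, 1]."""
    pts = list(control_points)
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def reconstruct_ridge(control_points: List[Point], samples: int = 50) -> List[Point]:
    """Regenerate ridge coordinates from the compact control-point representation."""
    return [de_casteljau(control_points, i / (samples - 1)) for i in range(samples)]

# Hypothetical control points for a single ridge segment.
ridge = reconstruct_ridge([(10, 12), (40, 30), (70, 25), (95, 60)])
print(ridge[:3])
```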
Read – Your Accomplishments Are Suspiciously Hard to Verify
Today I finished reading “Your Accomplishments Are Suspiciously Hard to Verify” by Scott Adams
Read – The Founder’s Dilemmas
Today I finished reading “The Founder’s Dilemmas: Anticipating and Avoiding the Pitfalls That Can Sink a Startup” by Noam Wasserman
Listening – Tramp
This week I am listening to “Tramp” by Sharon Van Etten
Paper – Delays Induce an Exponential Memory Gap for Rendezvous in Trees
Today I read a paper titled “Delays Induce an Exponential Memory Gap for Rendezvous in Trees”
The abstract is:
The aim of rendezvous in a graph is the meeting of two mobile agents at some node of an unknown anonymous connected graph.
In this paper, we focus on rendezvous in trees and, analogously to the efforts that have been made for solving the exploration problem with compact automata, we study the size of memory of mobile agents that permits solving the rendezvous problem deterministically.
We assume that the agents are identical, and move in synchronous rounds.
We first show that if the delay between the starting times of the agents is arbitrary, then the lower bound on memory required for rendezvous is Omega(log n) bits, even for the line of length n.
This lower bound meets a previously known upper bound of O(log n) bits for rendezvous in arbitrary graphs of size at most n.
Our main result is a proof that the amount of memory needed for rendezvous with simultaneous start depends essentially on the number L of leaves of the tree, and is exponentially less impacted by the number n of nodes.
Indeed, we present two identical agents with O(log L + loglog n) bits of memory that solve the rendezvous problem in all trees with at most n nodes and at most L leaves.
Hence, for the class of trees with polylogarithmically many leaves, there is an exponential gap in minimum memory size needed for rendezvous between the scenario with arbitrary delay and the scenario with delay zero.
Moreover, we show that our upper bound is optimal by proving that Omega(log L + loglog n) bits of memory are required for rendezvous, even in the class of trees with degrees bounded by 3.
Read – Introduction to 3D Game Programming with DirectX 11
Today I finished reading “Introduction to 3D Game Programming with DirectX 11” by Frank Luna
Read – The Fourth Dimension
Today I finished reading “The Fourth Dimension: A Guided Tour of the Higher Universes” by Rudy Rucker
Watching – Lara Croft Tomb Raider: The Cradle of Life
Today I watched “Lara Croft Tomb Raider: The Cradle of Life”
Paper – A Theory of Consciousness Founded on Neurons That Behave Like Qubits
Today I read a paper titled “A Theory of Consciousness Founded on Neurons That Behave Like Qubits”
The abstract is:
This paper presents a hypothesis that consciousness is a natural result of neurons that become connected recursively, and work synchronously between short and long term memories.
Such neurons demonstrate qubit-like properties, each supporting a probabilistic combination of true and false at a given phase.
Advantages of qubits include probabilistic modifications of cues for searching associations in long term memory, and controlled toggling for parallel, reversible computations to prioritize multiple recalls and to facilitate mathematical abilities.
Paper – Nearest Neighbor Value Interpolation
Today I read a paper titled “Nearest Neighbor Value Interpolation”
The abstract is:
This paper presents the nearest neighbor value (NNV) algorithm for high resolution (H.R.) image interpolation.
The difference between the proposed algorithm and conventional nearest neighbor algorithm is that the concept applied, to estimate the missing pixel value, is guided by the nearest value rather than the distance.
In other words, the proposed concept selects one pixel, among four directly surrounding the empty location, whose value is almost equal to the value generated by the conventional bilinear interpolation algorithm.
The proposed method demonstrates higher performance in terms of high resolution (H.R.) quality when compared to the conventional interpolation algorithms mentioned.
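The selection rule is simple enough to sketch. This is my own toy version (the pixel values are made up), not the authors' code: compute the bilinear estimate for the missing location, then output the value of whichever of the four surrounding pixels lies closest to that estimate.

```python
def nnv_pixel(top_left: float, top_right: float,
              bottom_left: float, bottom_right: float,
              dx: float = 0.5, dy: float = 0.5) -> float:
    """Nearest-neighbor-value choice for the missing pixel at offset (dx, dy)."""
    # Conventional bilinear estimate, used only as a guide value.
    bilinear = (top_left * (1 - dx) * (1 - dy) + top_right * dx * (1 - dy) +
                bottom_left * (1 - dx) * dy + bottom_right * dx * dy)
    neighbors = [top_left, top_right, bottom_left, bottom_right]
    # Output the existing neighbor value nearest to the bilinear guide,
    # so no new (blurred) intensity is introduced into the image.
    return min(neighbors, key=lambda v: abs(v - bilinear))

print(nnv_pixel(100, 110, 90, 200))   # -> 110, the neighbor closest to the bilinear estimate of 125
```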
Listening – Put Your Back N 2 It
This week I am listening to “Put Your Back N 2 It” by Perfume Genius
Paper – Shopping Uncertainties in a Mobile and Social Context
Today I read a paper titled “Shopping Uncertainties in a Mobile and Social Context”
The abstract is:
We conducted a qualitative user study with 77 consumers to investigate what social aspects are relevant when supporting customers during their shopping activities and particularly in situations when they are undecided.
Twenty-five respondents (32%) reported seeking extra information on web pages and forums, in addition to asking their peers for advice (related to the nature of the item to be bought).
Moreover, from the remaining 52 subjects, only 6 (8%) were confident enough to make prompt comparisons between items and an immediate purchasing choice, while 17 respondents (22%) expressed the need for being away from persuasive elements.
The remaining 29 respondents (38%) reported having a suboptimal strategy for making their shopping decisions (i.e., buying all items, not buying, or choosing randomly).
Therefore, the majority of our participants (70% = 32% + 38%) had social and information needs when making purchasing decisions.
This result motivates the development of applications that would allow consumers to ask shopping questions to their social network while on-the-go.
Read – The Startup Owner’s Manual
Today I finished reading “The Startup Owner’s Manual: The Step-By-Step Guide for Building a Great Company” by Steven Gary Blank
Paper – Q#, a quantum computation package for the .NET platform
Today I read a paper titled “Q#, a quantum computation package for the .NET platform”
The abstract is:
Quantum computing is a promising approach to computation based on equations from quantum mechanics.
A simulator for quantum algorithms must be capable of performing heavy mathematical matrix transforms.
The design of a simulator takes one of three forms: a Quantum Turing Machine, a network model (a circuit model of connected gates), or a quantum programming language; some simulators are hybrids.
We studied previous simulators and adopted features from three simulators with different implementation languages, different paradigms, and different target platforms.
They are Quantum Computing Language (QCL), QUASI, and the Quantum Optics Toolbox for Matlab 5.
Our simulator for quantum algorithms takes the form of a package, or programming library, for quantum computing, with a case study showing its use in the circuit model.
.NET is a promising platform for computing.
VB.NET is an easy, highly productive programming language with the full power and functionality provided by the .NET framework.
It is a highly readable, writable, and flexible language compared to other languages such as C#.NET in many respects.
We adopted VB.NET despite its lack of built-in complex-number and matrix operations compared to Matlab.
For the implementation, we first built a mathematical core of matrix operations.
Then we built a quantum core containing basic qubit and register operations; basic 1D, 2D, and 3D quantum gates; multi-view visualization of the quantum state; and a demo window showing how to use and get the most out of the package.
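The package itself is written in VB.NET, which I don't have handy, so here is a rough Python sketch of what such a "quantum core" amounts to: a state vector for a register, and gate application built from Kronecker products of matrices. This is purely my own illustration, not the paper's API.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
X = np.array([[0, 1], [1, 0]])                    # Pauli-X (NOT) gate

def apply_gate(state: np.ndarray, gate: np.ndarray, target: int, n_qubits: int) -> np.ndarray:
    """Apply a single-qubit gate to `target` in an n-qubit register (qubit 0 is most significant)."""
    op = np.array([[1.0]])
    for q in range(n_qubits):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

# Two-qubit register initialized to |00>, Hadamard on qubit 0.
state = np.zeros(4)
state[0] = 1.0
state = apply_gate(state, H, target=0, n_qubits=2)
print(np.abs(state) ** 2)   # measurement probabilities: [0.5, 0, 0.5, 0]
```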
Studying – Adobe InDesign tips & tricks
This month I am studying “Adobe InDesign tips & tricks”
Good time to figure out some shortcuts and hidden tricks that I maybe didn’t pick up in the previous months.
This is a short course, just 6 hours of video, with no real exercises to speak of, so I will just fill in the rest of the month with my own self-directed work.
Update: Was not as “tips & tricks” laden as I had hoped. After I got done with the 6 hours of video I went ahead and logged another 14 hours of InDesign study just going through specific design exercises.
Listening – Devotion
This week I am listening to “Devotion” by Jessie Ware
Paper – Topological Considerations for Tuning and Fingering Stringed Instruments
Today I read a paper titled “Topological Considerations for Tuning and Fingering Stringed Instruments”
The abstract is:
We present a formal language for assigning pitches to strings for fingered multi-string instruments, particularly the six-string guitar.
Given the instrument’s tuning (the strings’ open pitches) and the compass of the fingers of the hand stopping the strings, the formalism yields a framework for simultaneously optimizing three things: the mapping of pitches to strings, the choice of instrument tuning, and the key of the composition.
Final optimization relies on heuristics idiomatic to the tuning, the particular musical style, and the performer’s proficiency.
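To get a feel for the kind of constraint being formalized, I knocked together a toy Python check. It is my own sketch with made-up parameters (the "compass" is just a maximum fret span), not the paper's formalism: can a set of pitches be assigned to distinct strings of a given tuning so that every stopped fret falls within the hand's reach?

```python
from itertools import permutations

STANDARD_TUNING = [40, 45, 50, 55, 59, 64]   # E2 A2 D3 G3 B3 E4 as MIDI note numbers

def playable(pitches, tuning=STANDARD_TUNING, compass=3, max_fret=15):
    """Return one feasible pitch-to-string assignment as {string_index: fret}, or None."""
    for strings in permutations(range(len(tuning)), len(pitches)):
        frets = [p - tuning[s] for p, s in zip(pitches, strings)]
        if any(f < 0 or f > max_fret for f in frets):
            continue
        stopped = [f for f in frets if f > 0]          # open strings need no finger
        if not stopped or max(stopped) - min(stopped) <= compass:
            return dict(zip(strings, frets))
    return None

print(playable([48, 52, 55, 60, 64]))   # a C major voicing; prints one feasible fingering
```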
Paper – The blogosphere as an excitable social medium: Richter’s and Omori’s Law in media coverage
Today I read a paper titled “The blogosphere as an excitable social medium: Richter’s and Omori’s Law in media coverage”
The abstract is:
We study the dynamics of public media attention by monitoring the content of online blogs.
Social and media events can be traced by the propagation of word frequencies of related keywords.
Media events are classified as exogenous – where blogging activity is triggered by an external news item – or endogenous where word frequencies build up within a blogging community without external influences.
We show that word occurrences show statistical similarities to earthquakes.
The size distribution of media events follows a Gutenberg-Richter law, the dynamics of media attention before and after the media event follows Omori’s law.
We present further empirical evidence that for media events of endogenous origin the overall public reception of the event is correlated with the behavior of word frequencies at the beginning of the event, and is to a certain degree predictable.
These results may imply that the process of opinion formation in a human society might be related to effects known from excitable media.
Read – The Lean Startup
Today I finished reading “The Lean Startup: How Today’s Entrepreneurs Use Continuous Innovation to Create Radically Successful Businesses” by Eric Ries
Paper – Applying Evolutionary Optimisation to Robot Obstacle Avoidance
Today I read a paper titled “Applying Evolutionary Optimisation to Robot Obstacle Avoidance”
The abstract is:
This paper presents an artificial evolution-based method for stereo image analysis and its application to real-time obstacle detection and avoidance for a mobile robot.
It uses the Parisian approach, which here consists of splitting the representation of the robot’s environment into a large number of simple primitives, the “flies”, which are evolved following a biologically inspired scheme and give a fast, low-cost solution to the obstacle detection problem in mobile robotics.
Paper – Exploration of Recent Advances in the Field of Brain Computer Interfaces
Today I read a paper titled “Exploration of Recent Advances in the Field of Brain Computer Interfaces”
The abstract is:
A new approach has been proposed for implementing a number of expressions, emotions, and actions to operate objects through the thoughts of the brain using a non-invasive Brain Computer Interface (BCI) technique.
In this paper a survey on brain and its operations are presented.
The steps involved in the brain signal processing are discussed.
Current systems are able to present only a few expressions and emotions on a single device.
The proposed system provides an extended number of expressions on multiple objects.
Paper – Haar Wavelet Based Approach for Image Compression and Quality Assessment of Compressed Image
Today I read a paper titled “Haar Wavelet Based Approach for Image Compression and Quality Assessment of Compressed Image”
The abstract is:
With the increasing growth of technology and the entrance into the digital age, we have to handle vast amounts of information all the time, which often presents difficulties.
So, the digital information must be stored and retrieved in an efficient and effective manner, in order for it to be put to practical use.
Wavelets provide a mathematical way of encoding information in such a way that it is layered according to level of detail.
This layering facilitates approximations at various intermediate stages.
These approximations can be stored using a lot less space than the original data.
Here a low complex 2D image compression method using wavelets as the basis functions and the approach to measure the quality of the compressed image are presented.
The particular wavelet chosen and used here is the simplest wavelet form namely the Haar Wavelet.
The 2D discrete wavelet transform (DWT) has been applied and the detail matrices from the information matrix of the image have been estimated.
The reconstructed image is synthesized using the estimated detail matrices and information matrix provided by the Wavelet transform.
The quality of the compressed images has been evaluated using factors such as Compression Ratio (CR), Peak Signal to Noise Ratio (PSNR), Mean Opinion Score (MOS), and Picture Quality Scale (PQS).
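Since the Haar wavelet is about the simplest transform there is, here is my own minimal sketch of one analysis level and the PSNR measure. It is not the paper's code, and the "compression" shown (keeping only the approximation sub-band) is deliberately crude.

```python
import numpy as np

def haar2d_level(img: np.ndarray):
    """One level of the 2D Haar DWT: approximation plus three detail matrices."""
    a = (img[:, 0::2] + img[:, 1::2]) / 2.0        # row averages
    d = (img[:, 0::2] - img[:, 1::2]) / 2.0        # row differences
    ll = (a[0::2, :] + a[1::2, :]) / 2.0           # approximation
    lh = (a[0::2, :] - a[1::2, :]) / 2.0           # horizontal detail
    hl = (d[0::2, :] + d[1::2, :]) / 2.0           # vertical detail
    hh = (d[0::2, :] - d[1::2, :]) / 2.0           # diagonal detail
    return ll, lh, hl, hh

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    mse = np.mean((original - reconstructed) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)

img = np.random.randint(0, 256, (8, 8)).astype(float)
ll, lh, hl, hh = haar2d_level(img)
# Crude "compression": keep only the approximation sub-band and upsample it back.
approx = np.repeat(np.repeat(ll, 2, axis=0), 2, axis=1)
print(psnr(img, approx))
```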
Read – Creativity, Inc.
Today I finished reading “Creativity, Inc.: Overcoming the Unseen Forces That Stand in the Way of True Inspiration” by Ed Catmull
Listening – Blunderbuss
This week I am listening to “Blunderbuss” by Jack White
Read – The One Thing You Need to Know
Today I finished reading “The One Thing You Need to Know: … About Great Managing, Great Leading, and Sustained Individual Success” by Marcus Buckingham
Read – Conan the Warrior
Today I finished reading “Conan the Warrior” by Robert Howard
LinkedIn becomes Facebook
“These people shot a lion.”
“My son was in a bad accident and was airlifted to hospital.”
“My daughter survived cancer.”
“My sister graduated.”
“Look at what I had for lunch.”
I am a compassionate soul, but if you visit a curry restaurant and expect to have a hamburger, you’re probably in the wrong place.
And this is also how I feel about Facebook-like status updates on LinkedIn.
Paper – Rendering of 3D Dynamic Virtual Environments
Today I read a paper titled “Rendering of 3D Dynamic Virtual Environments”
The abstract is:
In this paper we present a framework for the rendering of dynamic 3D virtual environments which can be integrated in the development of videogames.
It includes methods to manage sounds and particle effects, paged static geometries, the support of a physics engine and various input systems.
It has been designed with a modular structure to allow future expansions.
We exploited some open-source state-of-the-art components such as OGRE, PhysX, ParticleUniverse, etc.; all of them have been properly integrated to obtain particular physical and environmental effects.
The stand-alone version of the application is fully compatible with Direct3D and OpenGL APIs and adopts OpenAL APIs to manage audio cards.
In conclusion, we devised a showcase demo which reproduces a dynamic 3D environment, including some particular effects: the alternation of day and night influencing the lighting of the scene, the rendering of terrain, water and vegetation, and the reproduction of sounds and atmospheric agents.
Read – Venture Deals
Today I finished reading “Venture Deals: Be Smarter Than Your Lawyer and Venture Capitalist” by Brad Feld
Listening – ƒIN
This week I am listening to “ƒIN” by John Talabot
Read – Mind Tools
Today I finished reading “Mind Tools: The Five Levels of Mathematical Reality” by Rudy Rucker
Read – Cognitive Agents for Virtual Environments
Today I finished reading “Cognitive Agents for Virtual Environments: First International Workshop, 2012, Revised Selected Papers” by Frank Dignum
Read – Maximum Ride #4
Today I finished reading “Maximum Ride #4” by James Patterson
Paper – Social Structure of Facebook Networks
Today I read a paper titled “Social Structure of Facebook Networks”
The abstract is:
We study the social structure of Facebook “friendship” networks at one hundred American colleges and universities at a single point in time, and we examine the roles of user attributes – gender, class year, major, high school, and residence – at these institutions.
We investigate the influence of common attributes at the dyad level in terms of assortativity coefficients and regression models.
We then examine larger-scale groupings by detecting communities algorithmically and comparing them to network partitions based on the user characteristics.
We thereby compare the relative importances of different characteristics at different institutions, finding for example that common high school is more important to the social organization of large institutions and that the importance of common major varies significantly between institutions.
Our calculations illustrate how microscopic and macroscopic perspectives give complementary insights on the social organization at universities and suggest future studies to investigate such phenomena further.
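The dyad-level measure is easy to reproduce on a toy example. This sketch is mine, not the authors'; the graph and the "major" attribute are made up, and it simply calls networkx's attribute assortativity coefficient.

```python
import networkx as nx

G = nx.Graph()
G.add_nodes_from([
    (1, {"major": "math"}), (2, {"major": "math"}),
    (3, {"major": "history"}), (4, {"major": "history"}),
    (5, {"major": "math"}),
])
G.add_edges_from([(1, 2), (1, 5), (3, 4), (2, 3)])

# Positive values mean friends tend to share the attribute, negative the opposite.
print(nx.attribute_assortativity_coefficient(G, "major"))
```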
Paper – Robust Multi-Robot Optimal Path Planning with Temporal Logic Constraints
Today I read a paper titled “Robust Multi-Robot Optimal Path Planning with Temporal Logic Constraints”
The abstract is:
In this paper we present a method for automatically planning robust optimal paths for a group of robots that satisfy a common high level mission specification.
Each robot’s motion in the environment is modeled as a weighted transition system, and the mission is given as a Linear Temporal Logic (LTL) formula over a set of propositions satisfied by the regions of the environment.
In addition, an optimizing proposition must repeatedly be satisfied.
The goal is to minimize the maximum time between satisfying instances of the optimizing proposition while ensuring that the LTL formula is satisfied even with uncertainty in the robots’ traveling times.
We characterize a class of LTL formulas that are robust to robot timing errors, for which we generate optimal paths if no timing errors are present, and we present bounds on the deviation from the optimal values in the presence of errors.
We implement and experimentally evaluate our method considering a persistent monitoring task in a road network environment.
Read – Usagi Yojimbo #26: Traitors of the Earth
Today I finished reading “Usagi Yojimbo #26: Traitors of the Earth” by Stan Sakai
Listening – Storm Corrosion
This week I am listening to “Storm Corrosion” by Storm Corrosion
Paper – Network Archaeology: Uncovering Ancient Networks from Present-day Interactions
Today I read a paper titled “Network Archaeology: Uncovering Ancient Networks from Present-day Interactions”
The abstract is:
Often questions arise about old or extinct networks.
What proteins interacted in a long-extinct ancestor species of yeast? Who were the central players in the Last.fm social network 3 years ago? Our ability to answer such questions has been limited by the unavailability of past versions of networks.
To overcome these limitations, we propose several algorithms for reconstructing a network’s history of growth given only the network as it exists today and a generative model by which the network is believed to have evolved.
Our likelihood-based method finds a probable previous state of the network by reversing the forward growth model.
This approach retains node identities so that the history of individual nodes can be tracked.
We apply these algorithms to uncover older, non-extant biological and social networks believed to have grown via several models, including duplication-mutation with complementarity, forest fire, and preferential attachment.
Through experiments on both synthetic and real-world data, we find that our algorithms can estimate node arrival times, identify anchor nodes from which new nodes copy links, and can reveal significant features of networks that have long since disappeared.
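The authors' method is likelihood-based and model-specific; as a much cruder illustration of what "reversing a growth model" means, here is my own sketch that peels back a Barabasi-Albert (preferential attachment) graph by repeatedly removing a minimum-degree node as a stand-in for the most recently arrived one. It is a heuristic of my own choosing, not the paper's algorithm.

```python
import networkx as nx

def crude_arrival_order(G: nx.Graph):
    """Guess node arrival times, oldest first, by iterative minimum-degree peeling."""
    H = G.copy()
    reverse_order = []
    while H.number_of_nodes() > 0:
        v = min(H.nodes(), key=H.degree)     # heuristic proxy for the newest node
        reverse_order.append(v)
        H.remove_node(v)
    return list(reversed(reverse_order))

G = nx.barabasi_albert_graph(200, 2, seed=1)
guessed = crude_arrival_order(G)
# In a Barabasi-Albert graph the node labels 0..199 are the true arrival order,
# so we can eyeball how well the earliest nodes are recovered.
print(guessed[:10])
```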
Read – Six Sigma Demystified
Today I finished reading “Six Sigma Demystified, Second Edition” by Paul Keller
Paper – Towards Social Profile Based Overlays
Today I read a paper titled “Towards Social Profile Based Overlays”
The abstract is:
Online social networking has quickly become one of the most common activities of Internet users.
As social networks evolve, they encourage users to share more information, requiring the users, in turn, to place more trust into social networks.
Peer-to-peer (P2P) overlays provide an environment that can return ownership of information, trust, and control to the users, away from centralized third-party social networks.
In this paper, we present a novel concept, social profile overlays, which enable users to share their profile only with trusted peers in a scalable, reliable, and private manner.
Each user’s profile consists of a unique private, secure overlay, where members of that overlay have a friendship with the overlay owner.
Profile data is made available without regard to the online state of the profile owner through the use of the profile overlay’s distributed data store.
Privacy and security are enforced through the use of a public key infrastructure (PKI), where the role of certificate authority (CA) is handled by the overlay owner and each member of the overlay has a CA-signed certificate.
All members of the social network join a common public or directory overlay facilitating friend discovery and bootstrap connections into profile overlays.
We define interfaces and present tools that can be used to implement this system, as well as explore some of the challenges related to it.