This week I am listening to “Sir Lucious Left Foot: The Son Of Chico Dusty” by Big Boi
Archives for 2011
Paper – Neural networks in 3D medical scan visualization
Today I read a paper titled “Neural networks in 3D medical scan visualization”
The abstract is:
For medical volume visualization, one of the most important tasks is to reveal clinically relevant details from the 3D scan (CT, MRI …), e.g. the coronary arteries, without obscuring them with less significant parts.
These volume datasets contain different materials which are difficult to extract and visualize with 1D transfer functions based solely on the attenuation coefficient.
Multi-dimensional transfer functions allow a much more precise classification of data which makes it easier to separate different surfaces from each other.
Unfortunately, setting up multi-dimensional transfer functions can become a fairly complex task, generally accomplished by trial and error.
This paper explains neural networks, and then presents an efficient way to speed up the visualization process by semi-automatic transfer function generation.
We describe how to use neural networks to detect distinctive features shown in the 2D histogram of the volume data and how to use this information for data classification.
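The 2D histogram the abstract builds on is simple to sketch. Here is a toy version of that input (my own illustration, not code from the paper): intensity on one axis, gradient magnitude on the other, over a synthetic volume standing in for a CT or MRI scan. The neural-network classification itself is not reproduced.

```python
import numpy as np

# Synthetic stand-in for a CT/MRI volume; the paper's method reads
# features off the joint histogram of intensity vs. gradient magnitude.
rng = np.random.default_rng(0)
volume = rng.normal(size=(32, 32, 32))        # synthetic scalar volume

gx, gy, gz = np.gradient(volume)              # per-axis finite differences
grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)     # gradient magnitude per voxel

# The 2D histogram over (intensity, gradient magnitude) on which the
# paper detects distinctive features.
hist, val_edges, grad_edges = np.histogram2d(
    volume.ravel(), grad_mag.ravel(), bins=64)
```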
Read – How Successful People Think
Today I finished reading “How Successful People Think: Change Your Thinking, Change Your Life” by John Maxwell
Paper – Being Rational or Aggressive? A Revisit to Dunbar’s Number in Online Social Networks
Today I read a paper titled “Being Rational or Aggressive? A Revisit to Dunbar’s Number in Online Social Networks”
The abstract is:
Recent years have witnessed the explosion of online social networks (OSNs).
They provide powerful IT innovations for online social activities such as organizing contacts, publishing content, and sharing interests between friends who may never have met before.
As more and more people become the active users of online social networks, one may ponder questions such as: (1) Do OSNs indeed improve our sociability? (2) To what extent can we expand our offline social spectrum in OSNs? (3) Can we identify some interesting user behaviors in OSNs? Our work in this paper just aims to answer these interesting questions.
To this end, we revisit the well-known Dunbar’s number in online social networks.
Our main research contributions are as follows.
First, to the best of our knowledge, our work is the first to systematically validate the existence of the online Dunbar’s number in the range of [200, 300].
To reach this, we combine local-structure analysis and user-interaction analysis over extensive real-world OSNs.
Second, we divide OSN users into two categories, rational and aggressive, and find that rational users tend to develop close and reciprocated relationships, whereas aggressive users show no consistent behavior.
Third, we build a simple model to capture the constraints of time and cognition that affect the evolution of online social networks.
Finally, we show the potential use of our findings in viral marketing and privacy management in online social networks.
Read – No Excuses!
Today I finished reading “No Excuses!: The Power of Self-Discipline” by Brian Tracy
Paper – Detecting the Most Unusual Part of a Digital Image
Today I read a paper titled “Detecting the Most Unusual Part of a Digital Image”
The abstract is:
The purpose of this paper is to introduce an algorithm that can detect the most unusual part of a digital image.
The most unusual part of a given shape is defined as the part of the image that has the maximal distance to all non-intersecting shapes of the same form.
The method can be used to scan image databases for which there is no clear model of the interesting part, or large image databases, such as medical databases.
Listening – There Is Love In You
This week I am listening to “There Is Love In You” by Four Tet
Paper – Intrusion Detection Using Cost-Sensitive Classification
Today I read a paper titled “Intrusion Detection Using Cost-Sensitive Classification”
The abstract is:
Intrusion Detection is an invaluable part of computer networks defense.
An important consideration is the fact that raising false alarms carries a significantly lower cost than not detecting attacks.
For this reason, we examine how cost-sensitive classification methods can be used in Intrusion Detection systems.
The performance of the approach is evaluated under different experimental conditions, cost matrices and different classification models, in terms of expected cost, as well as detection and false alarm rates.
We find that even under unfavourable conditions, cost-sensitive classification can still improve performance, if only slightly.
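The asymmetry the abstract points out, false alarms being much cheaper than missed attacks, is exactly what a minimum-expected-cost decision rule captures. A minimal sketch (the cost values are my own illustrative choices, not from the paper):

```python
# Minimum-expected-cost decision rule, the core of cost-sensitive
# classification. Illustrative costs: a false alarm costs 1, a missed
# attack costs 10, correct decisions cost nothing.
COST = {                      # COST[decision][true_class]
    "alarm":  {"normal": 1.0, "attack": 0.0},
    "ignore": {"normal": 0.0, "attack": 10.0},
}

def decide(p_attack: float) -> str:
    """Pick the decision with the lowest expected cost."""
    probs = {"normal": 1.0 - p_attack, "attack": p_attack}

    def expected_cost(decision):
        return sum(probs[c] * COST[decision][c] for c in probs)

    return min(COST, key=expected_cost)
```

With these costs the detector alarms even when the attack probability is fairly low (above 1/11), which is the behavior the abstract argues for.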
Read – Wuthering Heights
Today I finished reading “Wuthering Heights” by Emily Brontë
Read – The Cathedral and the Bazaar
Today I finished reading “The Cathedral and the Bazaar” by Eric S. Raymond
Paper – Thermodynamics of Information Retrieval
Today I read a paper titled “Thermodynamics of Information Retrieval”
The abstract is:
In this work, we suggest a parameterized statistical model (the gamma distribution) for the frequency of word occurrences in long strings of English text and use this model to build a corresponding thermodynamic picture by constructing the partition function.
We then use our partition function to compute thermodynamic quantities such as the free energy and the specific heat.
In this approach, the parameters of the word frequency model vary from word to word so that each word has a different corresponding thermodynamics and we suggest that differences in the specific heat reflect differences in how the words are used in language, differentiating keywords from common and function words.
Finally, we apply our thermodynamic picture to the problem of retrieval of texts based on keywords and suggest some advantages over traditional information retrieval methods.
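The thermodynamic quantities named in the abstract follow from textbook statistical mechanics: given an energy spectrum, the partition function gives the free energy, and the specific heat is the variance of the energy over T squared. A small sketch of that machinery (the mapping from word frequencies to energies is the paper's contribution and is not reproduced here):

```python
import math

def thermodynamics(energies, T):
    """Free energy and specific heat for a discrete energy spectrum."""
    beta = 1.0 / T
    weights = [math.exp(-beta * e) for e in energies]
    Z = sum(weights)                       # partition function
    probs = [w / Z for w in weights]       # Boltzmann probabilities
    mean_e = sum(p * e for p, e in zip(probs, energies))
    var_e = sum(p * (e - mean_e) ** 2 for p, e in zip(probs, energies))
    free_energy = -T * math.log(Z)
    specific_heat = var_e / T ** 2         # fluctuation-dissipation form
    return free_energy, specific_heat
```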
Listening – The Promise
This week I am listening to “The Promise” by Bruce Springsteen
Read – Waverley
Today I finished reading “Waverley” by Walter Scott
Paper – The Accelerating Growth of Online Tagging Systems
Today I read a paper titled “The Accelerating Growth of Online Tagging Systems”
The abstract is:
Research on the growth of online tagging systems not only is interesting in its own right, but also yields insights for website management and semantic web analysis.
Traditional models describing the growth of online systems can be divided into linear and nonlinear versions.
Linear models, including the BA model (Barabási and Albert, 1999), assume that the average activity of users is a constant independent of population.
Hence the total activity is a linear function of population.
On the contrary, nonlinear models suggest that the average activity is affected by the size of the population and the total activity is a nonlinear function of population.
In the current study, supporting evidence for the nonlinear growth assumption is obtained from data on Internet users’ tagging behavior.
A power law relationship between the number of new tags (F) and the population (P), which can be expressed as F ~ P ^ gamma (gamma > 1), is found.
I call this pattern accelerating growth and find that it relates to time-invariant heterogeneity in individual activities.
I also show how a greater heterogeneity leads to a faster growth.
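The exponent in F ~ P^gamma is the kind of thing you recover by least squares in log-log space. A quick sketch on synthetic data (gamma = 1.3 by construction here; the paper fits the same form to real tagging data):

```python
import numpy as np

# Synthetic accelerating growth: number of new tags F vs. population P
# with gamma > 1 built in.
P = np.array([100, 500, 1000, 5000, 10000, 50000], dtype=float)
F = P ** 1.3

# Fit log F = gamma * log P + c; the slope is the growth exponent.
gamma, intercept = np.polyfit(np.log(P), np.log(F), 1)
```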
Read – The Feynman Lectures on Physics Vol 5
Today I finished reading “The Feynman Lectures on Physics Vol 5: On Fundamentals/Energy & Motion” by Richard Feynman
Read – Game Programming Gems 8
Today I finished reading “Game Programming Gems 8” by Adam Lake
Read – Problem Identified: And You’re Probably Not Part of the Solution
Today I finished reading “Problem Identified: And You’re Probably Not Part of the Solution” by Scott Adams
Paper – Interactive Hatching and Stippling by Example
Today I read a paper titled “Interactive Hatching and Stippling by Example”
The abstract is:
We describe a system that lets a designer interactively draw patterns of strokes in the picture plane, then guide the synthesis of similar patterns over new picture regions.
Synthesis is based on an initial user-assisted analysis phase in which the system recognizes distinct types of strokes (hatching and stippling) and organizes them according to perceptual grouping criteria.
The synthesized strokes are produced by combining properties (e.g. length, orientation, parallelism, proximity) of the stroke groups extracted from the input examples.
We illustrate our technique with a drawing application that allows the control of attributes and scale-dependent reproduction of the synthesized patterns.
Paper – Assessing Cognitive Load on Web Search Tasks
Today I read a paper titled “Assessing Cognitive Load on Web Search Tasks”
The abstract is:
Assessing cognitive load on web search is useful for characterizing search system features and search tasks with respect to their demands on the searcher’s mental effort.
It is also helpful for examining how individual differences among searchers (e.g. cognitive abilities) affect the search process.
We examined cognitive load from the perspective of primary and secondary task performance.
A controlled web search study was conducted with 48 participants.
The primary task performance components were found to be significantly related to both the objective and the subjective task difficulty.
However, the relationship between objective and subjective task difficulty and the secondary task performance measures was weaker than expected.
The results indicate that the dual-task approach needs to be used with caution.
Listening – Infinite Arms
This week I am listening to “Infinite Arms” by Band Of Horses
Paper – Simulating Spiking Neural P systems without delays using GPUs
Today I read a paper titled “Simulating Spiking Neural P systems without delays using GPUs”
The abstract is:
We present in this paper our work regarding simulating a type of P system known as a spiking neural P system (SNP system) using graphics processing units (GPUs).
GPUs, because of their architectural optimization for parallel computations, are well-suited for highly parallelizable problems.
Due to the advent of general purpose GPU computing in recent years, GPUs are not limited to graphics and video processing alone, but include computationally intensive scientific and mathematical applications as well.
Moreover P systems, including SNP systems, are inherently and maximally parallel computing models whose inspirations are taken from the functioning and dynamics of a living cell.
In particular, SNP systems try to give a modest but formal representation of a special type of cell known as the neuron and their interactions with one another.
The nature of SNP systems allowed their representation as matrices, which is a crucial step in simulating them on highly parallel devices such as GPUs.
The highly parallel nature of SNP systems necessitates the use of hardware intended for parallel computations.
The simulation algorithms, design considerations, and implementation are presented.
Finally, simulation results, observations, and analyses using an SNP system that generates all numbers in $\mathbb{N} \setminus \{1\}$ are discussed, as well as recommendations for future work.
Paper – Analytic treatment of the network synchronization problem with time delays
Today I read a paper titled “Analytic treatment of the network synchronization problem with time delays”
The abstract is:
Motivated by novel results in the theory of network synchronization, we analyze the effects of nonzero time delays in stochastic synchronization problems with linear couplings in an arbitrary network.
We determine {\it analytically} the fundamental limit of synchronization efficiency in a noisy environment with uniform time delays.
We show that the optimal efficiency of the network is achieved for $\lambda\tau=\frac{\pi^{3/2}}{2\sqrt{\pi}+4}\approx0.738$, where $\lambda$ is the coupling strength (relaxation coefficient) and $\tau$ is the characteristic time delay in the communication between pairs of nodes.
Our analysis reveals the underlying mechanism responsible for the trade-off phenomena observed in recent numerical simulations of network synchronization problems.
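Out of curiosity, the closed-form constant from the abstract checks out numerically:

```python
import math

# The abstract's optimal synchronization point:
# lambda * tau = pi^(3/2) / (2 * sqrt(pi) + 4)
optimal = math.pi ** 1.5 / (2.0 * math.sqrt(math.pi) + 4.0)
```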
Listening – Heligoland
This week I am listening to “Heligoland” by Massive Attack
Paper – Subjective Collaborative Filtering
Today I read a paper titled “Subjective Collaborative Filtering”
The abstract is:
We present an item-based approach for collaborative filtering.
We determine a list of recommended items for a user by considering their previous purchases.
Additionally, other features of the users could be considered, such as page views, search queries, etc.
In particular we address the problem of efficiently comparing items.
Our algorithm can efficiently approximate an estimate of the similarity between two items.
As a measure of similarity we use an approximation of the Jaccard similarity that can be computed with constant-time operations and one bitwise OR.
Moreover we improve the accuracy of the similarity by introducing the concept of user preference for a given product, which both takes into account multiple purchases and purchases of related items.
The product of the user preference and the Jaccard measure (or its approximation) is used as a score for deciding whether a given product has to be recommended.
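The "one bitwise OR" trick is easy to see when item sets are encoded as integer bitmasks: union is a single OR, and intersection size falls out of popcounts. The abstract doesn't spell out the paper's exact approximation scheme, so this is only a sketch of the flavor of the idea:

```python
# Jaccard similarity over sets encoded as integer bitmasks
# (bit i set means element i is present). On fixed-width machine words
# this costs one OR plus constant-time popcounts.
def popcount(x: int) -> int:
    return bin(x).count("1")

def jaccard(a: int, b: int) -> float:
    union = a | b                          # the single bitwise OR
    if union == 0:
        return 1.0                         # convention: two empty sets
    inter = popcount(a) + popcount(b) - popcount(union)
    return inter / popcount(union)
```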
Read – Programming Windows® Phone 7
Today I finished reading “Programming Windows® Phone 7” by Charles Petzold
Studying – Digital inking in Photoshop
This month I am studying “Digital inking in Photoshop”
Listening – Learning
This week I am listening to “Learning” by Perfume Genius
Paper – Digital Restoration of Ancient Papyri
Today I read a paper titled “Digital Restoration of Ancient Papyri”
The abstract is:
Image processing can be used for digital restoration of ancient papyri, that is, for a restoration performed on their digital images.
The digital manipulation allows reducing the background signals and enhancing the readability of texts.
In the case of very old and damaged documents, this is fundamental for identification of the patterns of letters.
Some examples of restoration, obtained with an image processing which uses edges detection and Fourier filtering, are shown.
One of them concerns the 7Q5 fragment of the Dead Sea Scrolls.
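The Fourier filtering half of the pipeline is the standard frequency-domain operation. A minimal sketch of low-pass filtering a document image (the cutoff radius here is arbitrary and would be tuned per document; this is my own illustration, not the paper's code):

```python
import numpy as np

def fourier_lowpass(image: np.ndarray, cutoff: float) -> np.ndarray:
    """Zero out spatial frequencies beyond `cutoff` (in pixels from DC)."""
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    y, x = np.ogrid[:h, :w]
    dist = np.hypot(y - h / 2, x - w / 2)  # distance from the DC term
    spectrum[dist > cutoff] = 0            # drop high frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))

# Toy "scan": random texture standing in for a papyrus photograph.
restored = fourier_lowpass(np.random.default_rng(1).random((64, 64)), 10.0)
```

Suppressing the high frequencies flattens fine background texture, which is the kind of background-signal reduction the abstract describes.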
Paper – Extensive Games with Possibly Unaware Players
Today I read a paper titled “Extensive Games with Possibly Unaware Players”
The abstract is:
Standard game theory assumes that the structure of the game is common knowledge among players.
We relax this assumption by considering extensive games where agents may be unaware of the complete structure of the game.
In particular, they may not be aware of moves that they and other agents can make.
We show how such games can be represented; the key idea is to describe the game from the point of view of every agent at every node of the game tree.
We provide a generalization of Nash equilibrium and show that every game with awareness has a generalized Nash equilibrium.
Finally, we extend these results to games with awareness of unawareness, where a player i may be aware that a player j can make moves that i is not aware of, and to subjective games, where players may have no common knowledge regarding the actual game and their beliefs are incompatible with a common prior.
Paper – A 8 bits Pipeline Analog to Digital Converter Design for High Speed Camera Application
Today I read a paper titled “A 8 bits Pipeline Analog to Digital Converter Design for High Speed Camera Application”
The abstract is:
This paper describes a pipeline analog-to-digital converter implemented for a high-speed camera.
In the pipeline ADC design, the prime factor is designing an operational amplifier with high gain so that the ADC achieves high speed.
Other advantages of the pipeline architecture are that it is simple in concept, easy to implement in layout, and flexible enough to allow increased speed.
We made the design and simulation using Mentor Graphics software with 0.6 µm CMOS technology, with a total power dissipation of 75.47 mW.
Circuit techniques used include a precise comparator, operational amplifier and clock management.
A switched capacitor is used to sample and multiplying at each stage.
Simulation shows a worst-case DNL and INL of 0.75 LSB.
The design operates at 5 V dc.
The ADC achieves an SNDR of 44.86 dB.
Paper – Self-Assembly with Geometric Tiles
Today I read a paper titled “Self-Assembly with Geometric Tiles”
The abstract is:
In this work we propose a generalization of Winfree’s abstract Tile Assembly Model (aTAM) in which tile types are assigned rigid shapes, or geometries, along each tile face.
We examine the number of distinct tile types needed to assemble shapes within this model, the temperature required for efficient assembly, and the problem of designing compact geometric faces to meet given compatibility specifications.
Our results show a dramatic decrease in the number of tile types needed to assemble $n \times n$ squares to $\Theta(\sqrt{\log n})$ at temperature 1 for the most simple model which meets a lower bound from Kolmogorov complexity, and $O(\log\log n)$ in a model in which tile aggregates must move together through obstacle free paths within the plane.
This stands in contrast to the $\Theta(\log n / \log\log n)$ tile types at temperature 2 needed in the basic aTAM.
We also provide a general method for simulating a large and computationally universal class of temperature 2 aTAM systems with geometric tiles at temperature 1.
Finally, we consider the problem of computing a set of compact geometric faces for a tile system to implement a given set of compatibility specifications.
We show a number of bounds on the complexity of geometry size needed for various classes of compatibility specifications, many of which we directly apply to our tile assembly results to achieve non-trivial reductions in geometry size.
Paper – How happy is your web browsing? A model to quantify satisfaction of an Internet user, searching for desired information
Today I read a paper titled “How happy is your web browsing? A model to quantify satisfaction of an Internet user, searching for desired information”
The abstract is:
We feel happy when web-browsing operations provide us with necessary information; otherwise, we feel bitter.
How to measure this happiness (or bitterness)? How does the profile of happiness grow and decay during the course of web-browsing? We propose a probabilistic framework that models evolution of user satisfaction, on top of his/her continuous frustration at not finding the required information.
It is found that the cumulative satisfaction profile of a web-searching individual can be modeled effectively as the sum of a random number of random terms, where each term is a mutually independent random variable originating from a ‘memoryless’ Poisson flow.
Evolution of satisfaction over the entire time interval of user’s browsing was modeled with auto-correlation analysis.
A utilitarian marker, a magnitude greater than unity of which describes happy web-searching operations, and an empirical limit that connects the user’s satisfaction with his frustration level are also proposed.
Presence of pertinent information in the very first page of a web-site and magnitude of the decay parameter of user satisfaction (frustration, irritation etc.), are found to be two key aspects that dominate web-browser’s psychology.
The proposed model employed different combinations of the decay parameter, searching time, and number of helpful web-sites.
Obtained results are found to match the results from three real-life case-studies.
Listening – Transference
This week I am listening to “Transference” by Spoon
Paper – The Digital Restoration of Da Vinci’s Sketches
Today I read a paper titled “The Digital Restoration of Da Vinci’s Sketches”
The abstract is:
A sketch, found in one of Leonardo da Vinci’s notebooks and covered by the written notes of this genius, has been recently restored.
The restoration reveals a possible self-portrait of the artist, drawn when he was young.
Here, we discuss the discovery of this self-portrait and the procedure used for restoration.
Actually, this is a restoration performed on the digital image of the sketch, a procedure that can easily be extended and applied to ancient documents for studies of art and palaeography.
Paper – On the effect of the path length and transitivity of small-world networks on epidemic dynamics
Today I read a paper titled “On the effect of the path length and transitivity of small-world networks on epidemic dynamics”
The abstract is:
We show how one can trace in a systematic way the coarse-grained solutions of individual-based stochastic epidemic models evolving on heterogeneous complex networks with respect to their topological characteristics.
In particular, we have developed algorithms that allow the tuning of the transitivity (clustering coefficient) and the average path length, allowing the investigation of the “pure” impacts of the two characteristics on the emergent behavior of detailed epidemic models.
The framework could be used to shed more light into the influence of weak and strong social ties on epidemic spread within small-world network structures, and ultimately to provide novel systematic computational modeling and exploration of better contagion control strategies.
Read – Perfect Phrases for Leadership Development
Today I finished reading “Perfect Phrases for Leadership Development” by Meryl Runion
Read – The Complete Book of Perfect Phrases for High-Performing Sales Professionals
Today I finished reading “The Complete Book of Perfect Phrases for High-Performing Sales Professionals” by William T. Brooks
Read – Physically Based Rendering
Today I finished reading “Physically Based Rendering: From Theory to Implementation” by Matt Pharr
Paper – On Endogenous Reconfiguration in Mobile Robotic Networks
Today I read a paper titled “On Endogenous Reconfiguration in Mobile Robotic Networks”
The abstract is:
In this paper, our focus is on certain applications for mobile robotic networks, where reconfiguration is driven by factors intrinsic to the network rather than changes in the external environment.
In particular, we study a version of the coverage problem useful for surveillance applications, where the objective is to position the robots in order to minimize the average distance from a random point in a given environment to the closest robot.
This problem has been well studied for omni-directional robots, for which it is shown that the optimal configuration of the network is a centroidal Voronoi configuration and that the coverage cost belongs to $\Theta(m^{-1/2})$, where $m$ is the number of robots in the network.
In this paper, we study this problem for more realistic models of robots, namely the double integrator (DI) model and the differential drive (DD) model.
We observe that the introduction of these motion constraints in the algorithm design problem gives rise to an interesting behavior.
For a \emph{sparser} network, the optimal algorithm for these models of robots mimics that for omni-directional robots.
We propose novel algorithms whose performances are within a constant factor of the optimal asymptotically (i.e., as $m \to +\infty$).
In particular, we prove that the coverage cost for the DI and DD models of robots is of order $m^{-1/3}$.
Additionally, we show that, as the network grows, these novel algorithms outperform the conventional algorithm; hence necessitating a reconfiguration in the network in order to maintain optimal quality of service.
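The $\Theta(m^{-1/2})$ scaling for omni-directional robots is easy to see empirically: quadruple the robots and the mean distance to the nearest one roughly halves. A Monte Carlo sketch with robots on a regular grid (my own illustration; the paper's DI/DD analysis, which gives order $m^{-1/3}$, is not reproduced):

```python
import numpy as np

def coverage_cost(m_side: int, samples: int = 20000, seed: int = 0) -> float:
    """Mean distance from a random point in the unit square to the
    nearest of m = m_side**2 robots placed on a regular grid."""
    rng = np.random.default_rng(seed)
    g = (np.arange(m_side) + 0.5) / m_side             # grid coordinates
    robots = np.array([(x, y) for x in g for y in g])  # m robots
    points = rng.random((samples, 2))
    dists = np.linalg.norm(points[:, None, :] - robots[None, :, :], axis=2)
    return dists.min(axis=1).mean()

# m goes 16 -> 64 (x4), so the cost ratio should be close to 2.
ratio = coverage_cost(4) / coverage_cost(8)
```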
Listening – Gemini
This week I am listening to “Gemini” by Wild Nothing
Read – Maximum Ride #3
Today I finished reading “Maximum Ride #3” by James Patterson
Paper – An Image-Based Sensor System for Autonomous Rendez-Vous with Uncooperative Satellites
Today I read a paper titled “An Image-Based Sensor System for Autonomous Rendez-Vous with Uncooperative Satellites”
The abstract is:
This paper describes the image processing algorithms developed by SENER, Ingenieria y Sistemas to cope with the problem of image-based, autonomous rendez-vous (RV) with an orbiting satellite.
The methods developed have a direct application in the OLEV (Orbital Life Extension Vehicle) mission.
OLEV is a commercial mission under development by a consortium formed by Swedish Space Corporation, Kayser-Threde and SENER, aimed to extend the operational life of geostationary telecommunication satellites by supplying them control, navigation and guidance services.
OLEV is planned to use a set of cameras to determine the angular position and distance to the client satellite during the complete phases of rendez-vous and docking, thus enabling the operation with satellites not equipped with any specific navigational aid to provide support during the approach.
The ability to operate with unequipped client satellites significantly expands the range of applicability of the system under development, compared to other competing video technologies already tested in previous space missions, such as the ones described below.
Listening – Halcyon Digest
This week I am listening to “Halcyon Digest” by Deerhunter
Paper – Geographic constraints on social network groups
Today I read a paper titled “Geographic constraints on social network groups”
The abstract is:
Social groups are fundamental building blocks of human societies.
While our social interactions have always been constrained by geography, it has been impossible, due to practical difficulties, to evaluate the nature of this restriction on social group structure.
We construct a social network of individuals whose most frequent geographical locations are also known.
We also classify the individuals into groups according to a community detection algorithm.
We study the variation of geographical span for social groups of varying sizes, and explore the relationship between topological positions and geographic positions of their members.
We find that small social groups are geographically very tight, but become much more clumped when the group size exceeds about 30 members.
Also, we find no correlation between the topological positions and geographic positions of individuals within network communities.
These results suggest that spreading processes face distinct structural and spatial constraints.
Paper – Computing Good Nash Equilibria in Graphical Games
Today I read a paper titled “Computing Good Nash Equilibria in Graphical Games”
The abstract is:
This paper addresses the problem of fair equilibrium selection in graphical games.
Our approach is based on the data structure called the {\em best response policy}, which was proposed by Kearns et al.
\cite{kls} as a way to represent all Nash equilibria of a graphical game.
In \cite{egg}, it was shown that the best response policy has polynomial size as long as the underlying graph is a path.
In this paper, we show that if the underlying graph is a bounded-degree tree and the best response policy has polynomial size then there is an efficient algorithm which constructs a Nash equilibrium that guarantees certain payoffs to all participants.
Another attractive solution concept is a Nash equilibrium that maximizes the social welfare.
We show that, while exactly computing the latter is infeasible (we prove that solving this problem may involve algebraic numbers of an arbitrarily high degree), there exists an FPTAS for finding such an equilibrium as long as the best response policy has polynomial size.
These two algorithms can be combined to produce Nash equilibria that satisfy various fairness criteria.
Studying – Social Media Marketing: Marketing with a social media presence
This month I am studying “Social Media Marketing: Marketing with a social media presence”
Two full Saturdays. Starts at 8AM. Ugh. Not gonna enjoy waking up.
Paper – Across Browsers SVG Implementation
Today I read a paper titled “Across Browsers SVG Implementation”
The abstract is:
In this work, SVG will be translated into VML or HTML using Javascript based on the Backbase Client Framework.
The target of this project is to implement SVG to be viewed in Internet Explorer without any plug-in and work together with other Backbase Client Framework languages.
The result of this project will be added as an extension to the current Backbase Client Framework.
Listening – Teen Dream
This week I am listening to “Teen Dream” by Beach House
Read – It’s Not Funny if I Have to Explain It
Today I finished reading “It’s Not Funny if I Have to Explain It” by Scott Adams
Paper – Free and Open-Source Software is not an Emerging Property but Rather the Result of Studied Design
Today I read a paper titled “Free and Open-Source Software is not an Emerging Property but Rather the Result of Studied Design”
The abstract is:
Free and open source software (FOSS) is considered by many, along with Wikipedia, to be proof of an ongoing paradigm shift from hierarchically-managed and market-driven production of knowledge to heterarchical, collaborative and commons-based production styles.
In such a perspective, it has become commonplace to refer to FOSS as a manifestation of collective intelligence where deliverables and artefacts emerge by virtue of mere cooperation, with no need for supervising leadership.
The paper argues that this assumption is based on limited understanding of the software development process, and may lead to wrong conclusions as to the potential of peer production.
The development of a less than trivial piece of software, irrespective of whether it be FOSS or proprietary, is a complex cooperative effort requiring the participation of many (often thousands of) individuals.
A subset of the participants always play the role of leading system and subsystem designers, determining architecture and functionality; the rest of the people work “underneath” them in a logical, functional sense.
While new and powerful forces, including FOSS, are clearly at work in the post-industrial, networked economy, the currently ingenuous stage of research in the field of collective intelligence and networked cooperation must give way to a deeper level of consciousness, which requires an understanding of the software development process.