This week I am listening to “Gorilla Manor” by Local Natives
Paper – Node discovery in a networked organization
Today I read a paper titled “Node discovery in a networked organization”
The abstract is:
In this paper, I present a method to solve a node discovery problem in a networked organization.
Covert nodes are nodes that are not directly observable.
They affect social interactions, but do not appear in the surveillance logs which record the participants of the social interactions.
Discovering the covert nodes is defined as identifying the suspicious logs where the covert nodes would appear if the covert nodes became overt.
A mathematical model is developed for the maximal likelihood estimation of the network behind the social interactions and for the identification of the suspicious logs.
Precision, recall, and F measure characteristics are demonstrated with the dataset generated from a real organization and the computationally synthesized datasets.
The performance is close to the theoretical limit for any covert nodes in networks of any topology and size, provided the ratio of the number of observations to the number of possible communication patterns is large.
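The precision, recall, and F-measure characteristics the abstract mentions are easy to make concrete. A minimal sketch, assuming hypothetical log IDs rather than anything from the paper's dataset:

```python
def precision_recall_f(predicted, actual):
    """Precision, recall, and F measure for a set of flagged suspicious
    logs versus the set of logs that truly involve covert nodes."""
    predicted, actual = set(predicted), set(actual)
    tp = len(predicted & actual)                     # true positives
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(actual) if actual else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# Hypothetical example: logs 1-3 flagged, logs 2-4 truly suspicious.
p, r, f = precision_recall_f({1, 2, 3}, {2, 3, 4})
print(p, r, f)  # 2/3 on all three measures here
```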
Listening – Big Whiskey And The GrooGrux King
This week I am listening to “Big Whiskey And The GrooGrux King” by Dave Matthews Band
Paper – An Iterative Fingerprint Enhancement Algorithm Based on Accurate Determination of Orientation Flow
Today I read a paper titled “An Iterative Fingerprint Enhancement Algorithm Based on Accurate Determination of Orientation Flow”
The abstract is:
We describe an algorithm to enhance and binarize a fingerprint image.
The algorithm is based on accurate determination of orientation flow of the ridges of the fingerprint image by computing variance of the neighborhood pixels around a pixel in different directions.
We show that an iterative algorithm which captures the mutual interdependence of orientation flow computation, enhancement and binarization gives very good results on poor quality images.
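The variance-based orientation estimate at the heart of the algorithm can be sketched as follows: along a ridge, gray values vary least, so the orientation at a pixel is the direction of minimal neighborhood variance. The toy image, window length, and angle discretization are my own assumptions, not the paper's parameters:

```python
import math

def directional_variance(img, x, y, angle, length=4):
    """Variance of pixels sampled along a line through (x, y) at `angle`."""
    dx, dy = math.cos(angle), math.sin(angle)
    samples = []
    for t in range(-length, length + 1):
        px, py = int(round(x + t * dx)), int(round(y + t * dy))
        if 0 <= py < len(img) and 0 <= px < len(img[0]):
            samples.append(img[py][px])
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def ridge_orientation(img, x, y, n_angles=8):
    """Orientation flow: the direction with minimal gray-level variance."""
    angles = [i * math.pi / n_angles for i in range(n_angles)]
    return min(angles, key=lambda a: directional_variance(img, x, y, a))

# Synthetic vertical ridges: columns alternate dark/light, so variance is
# lowest along the vertical (pi/2) direction.
img = [[255 if (x // 2) % 2 else 0 for x in range(16)] for y in range(16)]
print(ridge_orientation(img, 8, 8))
```

The iteration the paper describes would then alternate this estimate with enhancement and binarization; only the orientation step is sketched here.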
Paper – Review of Robust Video Watermarking Algorithms
Today I read a paper titled “Review of Robust Video Watermarking Algorithms”
The abstract is:
There has been a remarkable increase in the data exchange over web and the widespread use of digital media.
As a result, multimedia data transfers have also surged.
The mounting interest with reference to digital watermarking throughout the last decade is certainly due to the increase in the need of copyright protection of digital content.
Commercial prospects have further fueled this interest.
Applications of video watermarking in copy control, broadcast monitoring, fingerprinting, video authentication, copyright protection, etc. are growing immensely.
The main aspects of information hiding are capacity, security and robustness.
Capacity deals with the amount of information that can be hidden.
Security is the difficulty of anyone detecting the hidden information, and robustness refers to the resistance to modification of the cover content before the concealed information is destroyed.
Video watermarking algorithms normally favor robustness.
In a robust algorithm it is not possible to eliminate the watermark without rigorous degradation of the cover content.
In this paper, we introduce the notion of Video Watermarking and the features required to design a robust watermarked video for a valuable application.
We review several algorithms, and introduce frequently used key techniques.
The aim of this paper is to focus on the various domains of video watermarking techniques.
The majority of the reviewed video watermarking methods emphasize the robustness of the algorithm.
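Since robustness is the property the review centers on, a toy additive spread-spectrum watermark with a correlation detector illustrates the idea: the mark survives mild distortion of the cover. The signal length, embedding strength, noise level, and detection threshold here are illustrative assumptions, not taken from any reviewed algorithm:

```python
import random

rng = random.Random(7)
n = 16384
host = [rng.uniform(-128, 128) for _ in range(n)]   # stand-in for pixel data
pattern = [rng.choice([-1, 1]) for _ in range(n)]   # secret keyed pattern
alpha = 10.0                                        # embedding strength

# Embed: add the keyed pattern, scaled by alpha, to the host signal.
watermarked = [h + alpha * w for h, w in zip(host, pattern)]

def detect(signal, pattern):
    """Correlation detector: a value near alpha means the mark is present,
    a value near zero means it is absent."""
    return sum(s * w for s, w in zip(signal, pattern)) / len(signal)

# A mild "attack": additive noise. The correlation survives it, while the
# unmarked host correlates with the pattern only near zero.
noisy = [s + rng.uniform(-5, 5) for s in watermarked]
print(detect(noisy, pattern), detect(host, pattern))
```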
Automated automobile
My future Mother-in-law loves to drive.
Long road trips all over the state.
She pays attention to every road sign.
The stopping distance to the car in front.
The current speed limit.
Children and small animals that may dart out into the road.
The person opening their car door just ahead.
The bicyclist swerving in and out of cars.
I do all the operating of the vehicle.
But my future Mother-in-law does all the driving.
Paper – The Many Faces of Rationalizability
Today I read a paper titled “The Many Faces of Rationalizability”
The abstract is:
The rationalizability concept was introduced in \cite{Ber84} and \cite{Pea84} to assess what can be inferred by rational players in a non-cooperative game in the presence of common knowledge.
However, this notion can be defined in a number of ways that differ in seemingly unimportant minor details.
We shed light on these differences, explain their impact, and clarify for which games these definitions coincide.
Then we apply the same analysis to explain the differences and similarities between various ways the iterated elimination of strictly dominated strategies was defined in the literature.
This allows us to clarify the results of \cite{DS02} and \cite{CLL05} and improve upon them.
We also consider the extension of these results to strict dominance by a mixed strategy.
Our approach is based on a general study of the operators on complete lattices.
We allow transfinite iterations of the considered operators and clarify the need for them.
The advantage of such a general approach is that a number of results, including order independence for some of the notions of rationalizability and strict dominance, come for free.
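Since the abstract contrasts definitions of iterated elimination of strictly dominated strategies, here is a minimal pure-strategy version of the operator (omitting dominance by mixed strategies and transfinite iteration, both of which the paper treats). The payoff matrices are a hypothetical Prisoner's-Dilemma-style game, not from the paper:

```python
def dominated_row(p1, s, rows, cols):
    """Row s is strictly dominated if some other row does strictly better
    against every remaining column."""
    return any(all(p1[(t, c)] > p1[(s, c)] for c in cols)
               for t in rows if t != s)

def dominated_col(p2, s, rows, cols):
    """Same check for player 2's (column) strategies."""
    return any(all(p2[(r, t)] > p2[(r, s)] for r in rows)
               for t in cols if t != s)

def iterated_elimination(p1, p2, rows, cols):
    """Remove strictly dominated strategies until a fixpoint is reached."""
    rows, cols = list(rows), list(cols)
    changed = True
    while changed:
        changed = False
        for s in [r for r in rows if dominated_row(p1, r, rows, cols)]:
            rows.remove(s); changed = True
        for s in [c for c in cols if dominated_col(p2, c, rows, cols)]:
            cols.remove(s); changed = True
    return rows, cols

p1 = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}
p2 = {('C', 'C'): 3, ('C', 'D'): 5, ('D', 'C'): 0, ('D', 'D'): 1}
print(iterated_elimination(p1, p2, ['C', 'D'], ['C', 'D']))  # (['D'], ['D'])
```

For strict dominance, the order in which dominated strategies are removed does not change the outcome, which is one of the order-independence results the abstract says "come for free" from the lattice-operator view.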
Read – SuperFreakonomics
Today I finished reading “SuperFreakonomics: Global Cooling, Patriotic Prostitutes And Why Suicide Bombers Should Buy Life Insurance” by Steven Levitt
Listening – Fever Ray
This week I am listening to “Fever Ray” by Fever Ray
Read – Death of a Spaceman
Today I finished reading “Death of a Spaceman” by Walter M. Miller Jr.
Paper – How to realize “a sense of humour” in computers?
Today I read a paper titled “How to realize ‘a sense of humour’ in computers?”
The abstract is:
The computer model of a “sense of humour” suggested previously [arXiv:0711.2058, 0711.2061, 0711.2270] is raised to the level of a realistic algorithm.
Listening – Merriweather Post Pavilion
This week I am listening to “Merriweather Post Pavilion” by Animal Collective
Read – Ender in Exile
Today I finished reading “Ender in Exile” by Orson Scott Card
Read – Continuous Delivery
Today I finished reading “Continuous Delivery: Reliable Software Releases Through Build, Test, and Deployment Automation” by Jez Humble
Paper – A Standalone Markerless 3D Tracker for Handheld Augmented Reality
Today I read a paper titled “A Standalone Markerless 3D Tracker for Handheld Augmented Reality”
The abstract is:
This paper presents an implementation of a markerless tracking technique targeted to the Windows Mobile Pocket PC platform.
The primary aim of this work is to allow the development of standalone augmented reality applications for handheld devices based on natural feature tracking.
In order to achieve this goal, a subset of two computer vision libraries was ported to the Pocket PC platform.
They were also adapted to use fixed point math, with the purpose of improving the overall performance of the routines.
The port of these libraries opens up the possibility of having other computer vision tasks being executed on mobile platforms.
A model based tracking approach that relies on edge information was adopted.
Since it does not require a high processing power, it is suitable for constrained devices such as handhelds.
The OpenGL ES graphics library was used to perform computer vision tasks, taking advantage of existing graphics hardware acceleration.
An augmented reality application was created using the implemented technique, and evaluations were done regarding tracking performance and accuracy.
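The fixed-point conversion the authors describe is the classic trick for devices without floating-point hardware. A minimal sketch in the Q16.16 format (16 integer bits, 16 fractional bits); the paper does not state which fixed-point layout was actually used, so the format and values here are assumptions:

```python
FRAC_BITS = 16
ONE = 1 << FRAC_BITS          # 1.0 in Q16.16

def to_fixed(x):
    """float -> Q16.16 integer."""
    return int(round(x * ONE))

def to_float(x):
    """Q16.16 integer -> float."""
    return x / ONE

def fx_mul(a, b):
    # The raw product carries 32 fractional bits; shift back down to 16.
    return (a * b) >> FRAC_BITS

def fx_div(a, b):
    # Pre-shift the numerator so the quotient keeps 16 fractional bits.
    return (a << FRAC_BITS) // b

a, b = to_fixed(1.5), to_fixed(2.25)
print(to_float(fx_mul(a, b)))   # -> 3.375
print(to_float(fx_div(a, b)))   # -> ~0.6667
```

On hardware without an FPU, these integer shifts and multiplies are far cheaper than software-emulated floating point, which is the performance gain the paper is after.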
Listening – 21st Century Breakdown
This week I am listening to “21st Century Breakdown” by Green Day
Read – The Presentation Secrets of Steve Jobs
Today I finished reading “The Presentation Secrets of Steve Jobs” by Carmine Gallo
Read – Perfect Phrases for Dealing with Difficult People
Today I finished reading “Perfect Phrases for Dealing with Difficult People: Hundreds of Ready-To-Use Phrases for Handling Conflict, Confrontations, and Challenging Personalities” by Susan Benjamin
Studying – Git essential training
This month I am studying “Git essential training”
Of course I’ve used version control systems since the early 1980s.
The first methods were more process than tooling — swapping C-15 tapes and later 100KB 5 1/4″ floppy discs — and then came more sophisticated tools such as RCS and CVS, and on to SVN and Visual SourceSafe.
DVCS (Distributed Version Control) has been around a while, and I am very familiar with Mercurial, but I have only a smattering of experience with Git — I’ve used it, but don’t really know some of the more esoteric commands.
This Git class should be a good way to acquaint myself with the arcane side of the utility.
Listening – Sainthood
This week I am listening to “Sainthood” by Tegan And Sara
Read – Project Management Techniques
Today I finished reading “Project Management Techniques” by Rory Burke
Read – You Can Choose to Be Rich
Today I finished reading “You Can Choose to Be Rich” by Robert T. Kiyosaki
Paper – Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation
Today I read a paper titled “Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation”
The abstract is:
This paper presents a robotics vision-based heuristic reasoning system for underwater target tracking and navigation.
This system is introduced to improve the level of automation of underwater Remotely Operated Vehicle (ROV) operations.
A prototype which combines computer vision with an underwater robotics system is successfully designed and developed to perform target tracking and intelligent navigation.
…
Read – The Feynman Lectures on Physics Vol 4 : Electrical and Magnetic Behavior
Today I finished reading “The Feynman Lectures on Physics Vol 4 : Electrical and Magnetic Behavior” by Richard Feynman
Paper – Improving Local Search for Fuzzy Scheduling Problems
Today I read a paper titled “Improving Local Search for Fuzzy Scheduling Problems”
The abstract is:
The integration of fuzzy set theory and fuzzy logic into scheduling is a rather new aspect with growing importance for manufacturing applications, and many of its aspects remain unsolved.
In the current paper, we investigate an improved local search technique for fuzzy scheduling problems with fitness plateaus, using a multi-criteria formulation of the problem.
We especially address the problem of job priorities changing over time, as studied at Sherwood Press Ltd, a Nottingham-based printing company that is a collaborator on the project.
Listening – Manners
This week I am listening to “Manners” by Passion Pit
Read – Usagi Yojimbo #23: Bridge of Tears
Today I finished reading “Usagi Yojimbo #23: Bridge of Tears” by Stan Sakai
Paper – An Independent Evaluation of Subspace Face Recognition Algorithms
Today I read a paper titled “An Independent Evaluation of Subspace Face Recognition Algorithms”
The abstract is:
This paper explores a comparative study of both the linear and kernel implementations of three of the most popular Appearance-based Face Recognition projection classes, these being the methodologies of Principal Component Analysis, Linear Discriminant Analysis and Independent Component Analysis.
The experimental procedure provides a platform of equal working conditions and examines the ten algorithms in the categories of expression, illumination, occlusion and temporal delay.
The results are then evaluated based on a sequential combination of assessment tools that facilitate both intuitive and statistical decisiveness among the intra- and inter-class comparisons.
The best categorical algorithms are then incorporated into a hybrid methodology, where the advantageous effects of fusion strategies are considered.
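Of the three projection classes compared, PCA ("eigenfaces") is the simplest to sketch: flatten each image to a vector, then project onto the top eigenvectors of the data covariance. The toy 3-D data below stands in for face vectors and is my own illustration, not the paper's experimental setup:

```python
import numpy as np

def pca_fit(X, n_components):
    """Rows of X are flattened images. Returns the mean and the top
    eigenvectors of the data covariance (the 'eigenfaces')."""
    mean = X.mean(axis=0)
    cov = np.cov(X - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)           # ascending eigenvalues
    return mean, eigvecs[:, ::-1][:, :n_components]  # largest first

def pca_project(X, mean, components):
    return (X - mean) @ components

# Toy data near a 1-D subspace of R^3: one component captures almost all
# the variance, so reconstruction from that single component is near-perfect.
rng = np.random.default_rng(0)
t = rng.normal(size=(50, 1))
X = t @ np.array([[1.0, 2.0, 3.0]]) + 0.01 * rng.normal(size=(50, 3))
mean, comps = pca_fit(X, 1)
Z = pca_project(X, mean, comps)
X_rec = Z @ comps.T + mean                           # back-projection
print(float(np.abs(X - X_rec).max()))
```

LDA and ICA replace the eigendecomposition with class-separating and statistically independent directions respectively; the projection step itself looks the same.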
Listening – Beggars
This week I am listening to “Beggars” by Thrice
Cats!
Cats are highly efficient machines for converting food into sleep.
Paper – Double Sided Watermark Embedding and Detection with Perceptual Analysis
Today I read a paper titled “Double Sided Watermark Embedding and Detection with Perceptual Analysis”
The abstract is:
In our previous work, we introduced a double-sided technique that utilizes, rather than rejects, the host interference.
Because it utilizes the host interference instead of rejecting it, it has a big advantage over host-interference-rejection schemes: perceptual analysis can be easily implemented in our scheme to achieve a locally bounded maximum embedding strength.
Thus, in this work, we detail how to implement perceptual analysis in our double-sided schemes, since perceptual analysis is very important for improving the fidelity of watermarked content.
Through extensive performance comparisons, we further validate the performance advantage of our double-sided schemes.
Read – The Art of Happiness
Today I finished reading “The Art of Happiness” by Dalai Lama XIV
Listening – Billy Talent III
This week I am listening to “Billy Talent III” by Billy Talent
Paper – Establishing A Minimum Generic Skill Set For Risk Management Teaching In A Spreadsheet Training Course
Today I read a paper titled “Establishing A Minimum Generic Skill Set For Risk Management Teaching In A Spreadsheet Training Course”
The abstract is:
Past research shows that spreadsheet models are prone to such a high frequency of errors, and to such data-security implications, that risk management of spreadsheet development and spreadsheet use is of great importance to both industry and academia.
The underlying rationale for this paper is that spreadsheet training courses should specifically address risk management in the development process both from a generic and a domain-specific viewpoint.
This research focuses specifically on one of these, namely the generic issues of risk management that should be present in a training course that aims to meet good practice within industry.
A pilot questionnaire was constructed showing a possible minimum set of risk management issues and sent to academics and industry practitioners for feedback.
The findings from this pilot survey will be used to refine the questionnaire for sending to a larger body of possible respondents.
It is expected these findings will form the basis of a risk management teaching approach to be trialled in a number of selected ongoing spreadsheet training courses.
Read – The Meaning of It All
Today I finished reading “The Meaning of It All: Thoughts of a Citizen-Scientist” by Richard Feynman
Paper – A Walk in Facebook: Uniform Sampling of Users in Online Social Networks
Today I read a paper titled “A Walk in Facebook: Uniform Sampling of Users in Online Social Networks”
The abstract is:
Our goal in this paper is to develop a practical framework for obtaining a uniform sample of users in an online social network (OSN) by crawling its social graph.
Such a sample allows one to estimate any user property, and some topological properties as well.
To this end, first, we consider and compare several candidate crawling techniques.
Two approaches that can produce approximately uniform samples are the Metropolis-Hasting random walk (MHRW) and a re-weighted random walk (RWRW).
Both have pros and cons, which we demonstrate through a comparison to each other as well as to the “ground truth.” In contrast, using Breadth-First-Search (BFS) or an unadjusted Random Walk (RW) leads to substantially biased results.
Second, and in addition to offline performance assessment, we introduce online formal convergence diagnostics to assess sample quality during the data collection process.
We show how these diagnostics can be used to effectively determine when a random walk sample is of adequate size and quality.
Third, as a case study, we apply the above methods to Facebook and we collect the first, to the best of our knowledge, representative sample of Facebook users.
We make it publicly available and employ it to characterize several key properties of Facebook.
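The MHRW at the core of the paper is short enough to sketch: from the current node, propose a uniformly chosen neighbor and accept with probability min(1, deg(v)/deg(w)), which cancels the degree bias of a plain random walk. The small graph below is my own illustration; a real crawl would fetch neighbor lists from the OSN:

```python
import random
from collections import Counter

def mhrw_sample(graph, start, steps, seed=42):
    """Metropolis-Hastings random walk over an adjacency-list graph."""
    rng = random.Random(seed)
    samples, v = [], start
    for _ in range(steps):
        w = rng.choice(graph[v])             # propose a uniform neighbor
        # Accepting with prob min(1, deg(v)/deg(w)) makes the stationary
        # distribution uniform over nodes instead of degree-proportional.
        if rng.random() < min(1.0, len(graph[v]) / len(graph[w])):
            v = w                            # move
        samples.append(v)                    # on rejection: stay put
    return samples

# A toy graph (a hub with leaves plus a triangle): a plain random walk
# would oversample the high-degree hub, MHRW visits all nodes equally.
graph = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0, 5, 6],
         5: [4, 6], 6: [4, 5]}
counts = Counter(mhrw_sample(graph, 0, 200000))
print({v: round(c / 200000, 3) for v, c in sorted(counts.items())})
```

RWRW instead keeps the plain walk and re-weights each sample by 1/deg(v) afterwards; BFS has no such correction, which is why it comes out biased in the paper's comparison.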
Studying – Building data driven iOS apps
This month I am studying “Building data driven iOS apps”
Listening – Hymn To The Immortal Wind
This week I am listening to “Hymn To The Immortal Wind” by Japan Mono
Read – The Feynman Lectures on Physics Vol 9
Today I finished reading “The Feynman Lectures on Physics Vol 9” by Richard Feynman
Paper – New Intelligent Transmission Concept for Hybrid Mobile Robot Speed Control
Today I read a paper titled “New Intelligent Transmission Concept for Hybrid Mobile Robot Speed Control”
The abstract is:
This paper presents a new concept for mobile robot speed control using a two-degree-of-freedom gear transmission.
The developed intelligent speed controller utilizes a gearbox comprising an epicyclic gear train with two inputs, one coupled to the engine shaft and the other to the shaft of a variable-speed DC motor.
The net output speed is a combination of the two input speeds and is governed by the transmission ratio of the planetary gear train.
This new approach eliminates the torque converter, otherwise an indispensable part of all available automatic transmissions, thereby avoiding the power loss that occurs in its fluid coupling.
By gradually varying the speed of the DC motor, a stepless transmission is achieved.
Other advantages of the developed controller are pulling over and reversing the vehicle, implemented by intelligent mixing of the DC motor and engine speeds.
This approach eliminates the traditional braking system from the vehicle design.
The use of two power sources, an IC engine and a battery-driven DC motor, follows the modern idea of hybrid vehicles.
The new mobile robot speed controller is capable of driving the vehicle even in the extreme case of IC engine failure, for example due to gas depletion.
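The speed-mixing the abstract describes follows from the fundamental epicyclic constraint: the carrier speed is a tooth-count-weighted average of the sun and ring speeds. A minimal sketch, with hypothetical tooth counts and speeds (the paper does not give its gear geometry):

```python
def planetary_output_speed(w_sun, w_ring, z_sun, z_ring):
    """Carrier (output) speed from the epicyclic gear-train constraint:
    z_sun*w_sun + z_ring*w_ring = (z_sun + z_ring)*w_carrier."""
    return (z_sun * w_sun + z_ring * w_ring) / (z_sun + z_ring)

# Assume the engine drives the sun gear at 3000 rpm and the DC motor
# drives the ring gear; tooth counts are illustrative.
z_sun, z_ring = 30, 90
print(planetary_output_speed(3000, 0, z_sun, z_ring))      # -> 750.0
print(planetary_output_speed(3000, -1000, z_sun, z_ring))  # -> 0.0
# Sweeping the motor speed sweeps the output continuously (stepless);
# the output reaches zero at w_ring = -(z_sun/z_ring)*w_sun = -1000 rpm
# here, which is how stopping and reversing without brakes become possible.
```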
Read – The Walking Dead, Book One
Today I finished reading “The Walking Dead, Book One” by Robert Kirkman
Listening – Relapse
This week I am listening to “Relapse” by Eminem
Read – What Matters Now
Today I finished reading “What Matters Now” by Seth Godin
Paper – Spatiotemporal sensitivity and visual attention for efficient rendering of dynamic environments
Today I read a paper titled “Spatiotemporal sensitivity and visual attention for efficient rendering of dynamic environments”
The abstract is:
We present a method to accelerate global illumination computation in dynamic environments by taking advantage of limitations of the human visual system.
A model of visual attention is used to locate regions of interest in a scene and to modulate spatiotemporal sensitivity.
The method is applied in the form of a spatiotemporal error tolerance map.
Perceptual acceleration combined with good sampling protocols provide a global illumination solution feasible for use in animation.
Results indicate an order of magnitude improvement in computational speed.
The method is adaptable and can also be used in image-based rendering, geometry level of detail selection, realistic image synthesis, video telephony and video compression.
Paper – Modular Traffic Sign Recognition applied to on-vehicle real-time visual detection of American and European speed limit signs
Today I read a paper titled “Modular Traffic Sign Recognition applied to on-vehicle real-time visual detection of American and European speed limit signs”
The abstract is:
We present a new modular traffic signs recognition system, successfully applied to both American and European speed limit signs.
Our sign detection step is based only on shape-detection (rectangles or circles).
This enables it to work on grayscale images, contrary to most European competitors, and improves robustness to illumination conditions (notably night operation).
Speed sign candidates are classified (or rejected) by segmenting potential digits inside them (which is rather original and has several advantages), and then applying a neural digit recognition.
The global detection rate is ~90% for both (standard) U.S. and E.U. speed signs, with a misclassification rate <1%, and no validated false alarms in >150 minutes of video.
The system processes ~20 frames/s in real time on a standard high-end laptop.
Listening – Homesick
This week I am listening to “Homesick” by A Day To Remember
Paper – Image Authentication Based on Neural Networks
Today I read a paper titled “Image Authentication Based on Neural Networks”
The abstract is:
Neural networks have been attracting more and more researchers over the past decades.
Properties such as parameter sensitivity, random similarity, and learning ability make them suitable for information protection tasks such as data encryption, data authentication, and intrusion detection.
In this paper, by investigating neural networks’ properties, a low-cost authentication method based on neural networks is proposed and used to authenticate images or videos.
The authentication method can detect whether the images or videos are modified maliciously.
Firstly, this chapter introduces neural networks’ properties, such as parameter sensitivity, random similarity, diffusion property, confusion property, one-way property, etc.
Secondly, the chapter gives an introduction to neural network based protection methods.
Thirdly, an image or video authentication scheme based on neural networks is presented, and its performances, including security, robustness and efficiency, are analyzed.
Finally, conclusions are drawn, and some open issues in this field are presented.
Read – Game Engine Architecture
Today I finished reading “Game Engine Architecture” by Jason Gregory
Paper – Strip Packing vs. Bin Packing
Today I read a paper titled “Strip Packing vs. Bin Packing”
The abstract is:
In this paper we establish a general algorithmic framework between bin packing and strip packing, with which we achieve the same asymptotic bounds by applying bin packing algorithms to strip packing.
More precisely we obtain the following results: (1) Any offline bin packing algorithm can be applied to strip packing maintaining the same asymptotic worst-case ratio.
Thus using FFD (MFFD) as a subroutine, we get a practical (simple and fast) algorithm for strip packing with an upper bound 11/9 (71/60).
A simple AFPTAS for strip packing immediately follows.
(2) A class of Harmonic-based algorithms for bin packing can be applied to online strip packing maintaining the same asymptotic competitive ratio.
This implies that online strip packing admits an upper bound of 1.58889 on the asymptotic competitive ratio, which is very close to the lower bound of 1.5401, significantly improves the previously best bound of 1.6910, and affirmatively answers an open question posed by Csirik et al.
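The FFD subroutine the paper plugs into its framework is plain one-dimensional First Fit Decreasing bin packing; the reduction to strip packing (packing items level by level) is not shown here, and the item sizes are illustrative:

```python
def ffd(items, capacity=1.0):
    """First Fit Decreasing: sort items largest-first, put each into the
    first open bin it fits; open a new bin when none fits."""
    bins = []                                   # list of (free, packed)
    for item in sorted(items, reverse=True):
        for i, (free, packed) in enumerate(bins):
            if item <= free + 1e-12:            # tolerance for float sums
                bins[i] = (free - item, packed + [item])
                break
        else:
            bins.append((capacity - item, [item]))
    return [packed for _, packed in bins]

bins = ffd([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5])
print(len(bins), bins)
```

FFD's asymptotic worst-case ratio of 11/9 is exactly the bound the paper's framework then transfers to strip packing.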
Read – Math Proofs Demystified
Today I finished reading “Math Proofs Demystified” by Stan Gibilisco