Since getting married I feel like my home maintenance to-do list has become like the Titanic.
I keep wishing my wife were evacuating as part of “women and children first” so I can actually have some silence and think quietly for a minute.
Somebody needs to think about this stuff...
by justin
Today I finished reading “Death Note #7: Zero” by Tsugumi Ohba
by justin
Today I read a paper titled “Perceptual analyses of action-related impact sounds”
The abstract is:
Among environmental sounds, we have chosen to study a class of action-related impact sounds: automobile door closure sounds.
We propose to describe these sounds using a model composed of perceptual properties.
The development of the perceptual model was derived from the evaluation of many door closure sounds measured under controlled laboratory listening conditions.
However, listening to such sounds normally occurs within a natural context, which probably modifies their perception.
We therefore need to study differences between the real situation and the laboratory situation by following standard practices in order to specify the precise listening conditions and observe the influence of previous learning, expectations, action-perception interactions, and attention given to sounds.
Our process consists of running in situ experiments that are compared with specific laboratory experiments in order to isolate certain influential, context-dependent components.
by justin
Today I finished reading “Gaston 17” by Andre Franquin
by justin
Today I read a paper titled “Herding the Crowd: Automated Planning for Crowdsourced Planning”
The abstract is:
There has been significant interest in crowdsourcing and human computation.
One subclass of human computation applications is those directed at tasks that involve planning (e.g., travel planning) and scheduling (e.g., conference scheduling).
Much of this work appears outside the traditional automated planning forums, and at the outset it is not clear whether automated planning has much of a role to play in these human computation systems.
Interestingly, however, work on these systems shows that even primitive forms of automated oversight of the human planner do help in significantly improving the effectiveness of the humans/crowd.
In this paper, we will argue that the automated oversight used in these systems can be viewed as a primitive automated planner, and that there are several opportunities for more sophisticated automated planning in effectively steering crowdsourced planning.
Straightforward adaptation of current planning technology is, however, hampered by the mismatch between the capabilities of human workers and automated planners.
We identify two important challenges that need to be overcome before such adaptation of planning technology can occur: (i) interpreting the inputs of the human workers (and the requester) and (ii) steering or critiquing the plans being produced by the human workers armed only with incomplete domain and preference models.
In this paper, we discuss approaches for handling these challenges, and characterize existing human computation systems in terms of the specific choices they make in handling these challenges.
by justin
Today I read a paper titled “A new muscle fatigue and recovery model and its ergonomics application in human simulation”
The abstract is:
Although automatic techniques have been employed in manufacturing industries to increase productivity and efficiency, there are still many manual handling jobs, especially assembly and maintenance jobs.
In these jobs, musculoskeletal disorders (MSDs) are one of the major health problems, due to overload and cumulative physical fatigue.
In combination with conventional posture analysis techniques, digital human modelling and simulation (DHM) techniques have been developed and commercialized to evaluate potential physical exposures.
However, those ergonomics analysis tools are mainly based on posture analysis techniques, and until now no fatigue index has been available in commercial software to evaluate physical fatigue easily and quickly.
In this paper, a new muscle fatigue and recovery model is proposed and extended to evaluate joint fatigue levels in manual handling jobs.
A special application case is described and analyzed using digital human simulation techniques.
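The abstract does not spell out the model's equations, but the general shape of an exponential fatigue/recovery model can be sketched in a few lines. The functional form and rate constants below are illustrative assumptions on my part, not the paper's exact formulation:

```python
import math

def remaining_capacity(mvc, load, t_work, t_rest, k_fatigue=1.0, k_recover=2.4):
    """Illustrative exponential fatigue/recovery sketch (NOT the paper's
    exact model): muscle capacity decays during a sustained-load work phase
    and climbs back toward maximum during a rest phase.

    mvc       -- maximum voluntary contraction (force units)
    load      -- constant external load during the work phase (same units)
    t_work    -- duration of the work phase (minutes)
    t_rest    -- duration of the rest phase (minutes)
    k_fatigue -- fatigue rate constant (1/min), illustrative value
    k_recover -- recovery rate constant (1/min), illustrative value
    """
    # Fatigue phase: heavier relative load (load/mvc) means faster decay.
    after_work = mvc * math.exp(-k_fatigue * (load / mvc) * t_work)
    # Recovery phase: the lost capacity is regained exponentially.
    return mvc - (mvc - after_work) * math.exp(-k_recover * t_rest)
```

A fatigue index along these lines would let a DHM tool flag tasks whose work/rest cycle never lets capacity return near the worker's maximum.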
by justin
Today I finished reading “Gaston 16” by Andre Franquin
by justin
Today I finished reading “Gaston 15” by Andre Franquin
by justin
Today I finished reading “Death Note #6: Give-and-Take” by Tsugumi Ohba
by justin
Today I finished reading “Gaston 14” by Andre Franquin
by justin
This week I am listening to “Field Of Reeds” by These New Puritans
by justin
Today I finished reading “Gaston 13” by Andre Franquin
by justin
Today I finished reading “Lucky Luke – L’Integrale 14” by Morris
by justin
Today I read a paper titled “Let Us Dance Just a Little Bit More — On the Information Capacity of the Human Motor System”
The abstract is:
Fitts’ law is a fundamental tool for measuring the capacity of the human motor system.
However, it is, by definition, limited to aimed movements toward spatially expanded targets.
We revisit its information-theoretic basis with the goal of generalizing it to unconstrained, trained movements such as dance and sports.
The proposed new measure is based on a subject’s ability to accurately reproduce a complex movement pattern.
We demonstrate our framework using motion-capture data from professional dance performances.
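For context, the classic Shannon formulation of Fitts’ law that the paper generalizes fits in a few lines. The regression constants here are illustrative placeholders, not values from the paper:

```python
import math

def fitts_index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D/W + 1), for target distance D and target width W."""
    return math.log2(distance / width + 1)

def predicted_movement_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time MT = a + b * ID.
    a (seconds) and b (seconds/bit) are illustrative regression constants
    normally fitted per device and per user."""
    return a + b * fitts_index_of_difficulty(distance, width)

# A far, small target carries more bits (is "harder") than a near, large one.
hard = fitts_index_of_difficulty(512, 16)
easy = fitts_index_of_difficulty(64, 32)
```

The paper's point is that this only covers aimed movements toward a target of width W; dance has no such target, hence the reproduction-accuracy measure.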
by justin
Today I read a paper titled “Cutaneous Force Feedback as a Sensory Subtraction Technique in Haptics”
The abstract is:
A novel sensory substitution technique is presented.
Kinesthetic and cutaneous force feedback are substituted by cutaneous feedback (CF) only, provided by two wearable devices able to apply forces to the index finger and the thumb while holding a handle during a teleoperation task.
The force pattern fed back to the user while using the cutaneous devices is similar, in terms of intensity and area of application, to the cutaneous force pattern applied to the finger pad while interacting with a haptic device providing both cutaneous and kinesthetic force feedback.
The pattern generated using the cutaneous devices can be thought of as a subtraction between the complete haptic feedback (HF) and the kinesthetic part of it.
For this reason, we refer to this approach as sensory subtraction instead of sensory substitution.
A needle insertion scenario is considered to validate the approach.
The haptic device is connected to a virtual environment simulating a needle insertion task.
Experiments show that the perception of inserting a needle using cutaneous-only force feedback is nearly indistinguishable from the one felt by the user while using both cutaneous and kinesthetic feedback.
Like most sensory substitution approaches, the proposed sensory subtraction technique also has the advantage of not suffering from the stability issues of teleoperation systems due, for instance, to communication delays.
Moreover, experiments show that the sensory subtraction technique outperforms sensory substitution with more conventional visual feedback (VF).
by justin
Today I finished reading “Gaston 12” by Andre Franquin
by justin
Today I finished reading “Gaston 11” by Andre Franquin
by justin
Today I finished reading “Gaston 10” by Andre Franquin
by justin
Today I finished reading “Gaston 9” by Andre Franquin
by justin
This week I am listening to “Mechanical Bull” by Kings Of Leon
by justin
Today I finished reading “Gaston 8” by Andre Franquin
by justin
Today I finished reading “Lucky Luke #9 – Des rails sur la prairie” by Rene Goscinny
by justin
Today I finished reading “Gaston 7” by Andre Franquin
by justin
Today I finished reading “The Walking Dead, Book Nine” by Robert Kirkman
by justin
Today I finished reading “Gaston 6” by Andre Franquin
by justin
Today I finished reading “Gaston 5” by Andre Franquin
by justin
Today I finished reading “Perfect Phrases for Sales Referrals: Hundreds of Ready-to-Use Phrases for Getting New Clients, Building Relationships, and Increasing Your Sales” by Jeb Brooks
by justin
Today I finished reading “Gaston 4” by Andre Franquin
by justin
This month I am studying “Developing iPhone web apps with ExtJS”
This is an online class with interactive instructor sessions that supposedly takes six months to complete. Sort of a boot camp for ExtJS on iPhone.
I spoke with the instructor and he states it takes an average student about six months to work through the material and a good student about three months.
Update: Wrapped up and submitted my last piece of coursework two days before the end of the first month.
Logged 56 hours of class time, study, reading, practice and interactive sessions with the instructor.
by justin
This week I am listening to “Random Access Memories” by Daft Punk
by justin
Today I finished reading “Gaston 3” by Andre Franquin
by justin
Today I finished reading “Gaston 2” by Andre Franquin
by justin
Today I finished reading “Gaston 1” by Andre Franquin
by justin
Today I read a paper titled “Accelerating Lossless Data Compression with GPUs”
The abstract is:
Huffman compression is a statistical, lossless data compression algorithm that compresses data by assigning variable-length codes to symbols, with the more frequently appearing symbols given shorter codes than the less frequent ones.
This work is a modification of the Huffman algorithm which permits uncompressed data to be decomposed into independently compressible and decompressible blocks, allowing for concurrent compression and decompression on multiple processors.
We create implementations of this modified algorithm on a current NVIDIA GPU using the CUDA API as well as on a current Intel chip, and the performance results are compared, showing favorable GPU performance for nearly all tests.
Lastly, we discuss the necessity for high-performance data compression in today’s supercomputing ecosystem.
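The core idea of independently compressible blocks can be sketched in plain Python. This is a serial illustration of the data layout only, not the paper's CUDA implementation; the block size and per-block code table are my own assumptions:

```python
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table (symbol -> bitstring) for one block."""
    freq = Counter(data)
    if len(freq) == 1:  # degenerate block of a single repeated symbol
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, unique tiebreak, {symbol: code-so-far}).
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        # Merging two subtrees prepends one more bit to every code below them.
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (f1 + f2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

def compress_blocks(data: bytes, block_size: int = 4096):
    """Split the input into fixed-size blocks and Huffman-code each one
    independently. Because every (table, bits) pair is self-contained, each
    block can be compressed or decompressed by a separate worker, which is
    the concurrency the paper exploits on the GPU."""
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    out = []
    for block in blocks:
        table = huffman_codes(block)
        bits = "".join(table[b] for b in block)
        out.append((table, bits))
    return out
```

The trade-off is classic: per-block tables add header overhead, bought back by parallel throughput.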
by justin
Today I read a paper titled “Your browsing behavior for a Big Mac: Economics of Personal Information Online”
The abstract is:
Most online services (Google, Facebook, etc.) operate by providing a service to users for free, and in return they collect and monetize personal information (PI) of the users.
This operational model is inherently economic, as the “good” being traded and monetized is PI.
This model is coming under increased scrutiny as online services move to capture more PI of users, raising serious privacy concerns.
However, little is known about how users valuate different types of PI while being online, or about users’ perceptions regarding the exploitation of their PI by online service providers.
In this paper, we study how users valuate different types of PI while being online, capturing the context by relying on Experience Sampling.
We were able to extract the monetary value that 168 participants put on different pieces of PI.
We find that users value PI related to their offline identities about three times more than their browsing behavior.
Users also value information pertaining to financial transactions and social network interactions more than activities like search and shopping.
We also found that while users are overwhelmingly in favor of exchanging their PI in return for improved online services, they are uncomfortable if these same providers monetize their PI.
by justin
Today I finished reading “Lucky Luke #10 – Le Cavalier blanc” by Rene Goscinny
by justin
Today I finished reading “Lucky Luke #6 – Canyon Apache” by Rene Goscinny
by justin
This week I am listening to “Paramore” by Paramore
by justin
Today I finished reading “Death Note #5: Whiteout” by Tsugumi Ohba
by justin
I’m sorry you don’t like the words that I am using, but just because your vocabulary stopped developing when you left the playground doesn’t mean everyone else’s did.
by justin
Today I finished reading “Lucky Luke #9 – Le Grand duc” by Rene Goscinny
by justin
Today I finished reading “Death Note #4: Love” by Tsugumi Ohba
by justin
Today I finished reading “Lucky Luke – L’Integrale 18” by Morris
by justin
Today I read a paper titled “Casting Robotic End-effectors To Reach Faraway Moving Objects”
The abstract is:
In this article we address the problem of catching objects that move at a relatively large distance from the robot, on the order of tens of times the size of the robot itself.
To this purpose, we adopt casting manipulation and vision-based feedback control.
Casting manipulation is a technique to deploy a robotic end-effector far from the robot’s base, by throwing the end-effector and controlling its ballistic flight using forces transmitted through a light tether connected to the end-effector itself.
The tether cable can then be used to retrieve the end-effector to exert forces on the robot’s environment.
In previous work, planar casting manipulation was demonstrated to aptly catch static objects placed at a distant, known position, thus proving it suitable for applications such as sample acquisition and return, rescue, etc.
In this paper we propose an extension of the idea to controlling the position of the end-effector to reach moving targets in 3D.
The goal is achieved by an innovative design of the casting mechanism, and by closing a real-time control loop on casting manipulation using visual feedback of moving targets.
To achieve this result, simplified yet accurate models of the system, suitable for real-time computation, are developed, along with a suitable visual feedback scheme for the flight phase.
Effectiveness of the visual feedback controller is demonstrated through experiments with a 2D casting robot.
by justin
Today I read a paper titled “Numerical Analysis of Diagonal-Preserving, Ripple-Minimizing and Low-Pass Image Resampling Methods”
The abstract is:
Image resampling is a necessary component of any operation that changes the size of an image or its geometry.
Methods tuned for natural image upsampling (roughly speaking, image enlargement) are analyzed and developed with a focus on their ability to preserve diagonal features and suppress overshoots.
Monotone, locally bounded and almost monotone “direct” interpolation and filtering methods, as well as face split and vertex split surface subdivision methods, alone or in combination, are studied.
Key properties are established by way of proofs and counterexamples as well as numerical experiments involving 1D curve and 2D diagonal data resampling.
In addition, the Remez minimax method for the computation of low-cost polynomial approximations of low-pass filter kernels tuned for natural image downsampling (roughly speaking, image reduction) is refactored for relative error minimization in the presence of roots in the interior of the interval of approximation and so that even and odd functions are approximated with like polynomials.
The accuracy and frequency response of the approximations are tabulated and plotted against the original, establishing their rapid convergence.
by justin
Today I read a paper titled “Mixing Board Versus Mouse Interaction In Value Adjustment Tasks”
The abstract is:
We present a controlled, quantitative study with 12 participants comparing interaction with a haptically enhanced mixing board against interaction with a mouse in an abstract task that is motivated by several practical parameter space exploration settings.
The study participants received 24 sets of one to eight integer values between 0 and 127, which they had to match by making adjustments with physical or graphical sliders.
Based on recorded slider motion path data, we developed an analysis algorithm that identifies and measures different types of activity intervals, including error time spent moving irrelevant sliders and end time spent in breaks after completing each trial item.
Our results showed a significant increase in speed of the mixing board interaction, accompanied by reduced perceived cognitive load, when compared with the traditional mouse-based GUI interaction.
The gains in speed are largely due to the improved times required for the hand to reach the first slider (acquisition time) and to move between different ones, while the actual time spent manipulating relevant sliders is very similar for either input device.
These results agree strongly with qualitative predictions from Fitts’ law: the larger targets afforded by the mixer handles contributed to its faster performance.
For further investigation, we computed a measure of motion simultaneity based on velocity correlation, which allowed us to identify the types of items for which increased simultaneous adjustments occur.
For continuous parameter space exploration, our findings suggest that mixing boards are a good option for providing detailed multi-value control.
The strengths of this input method show particularly in settings where screen space is precious and undisrupted visual focus is crucial.
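The velocity-correlation simultaneity measure can be illustrated with a small sketch: Pearson correlation of two sliders' finite-difference velocities. The paper's exact windowing and normalization may differ; this is an assumption on my part:

```python
def motion_simultaneity(path_a, path_b):
    """Pearson correlation of two sliders' per-sample velocities, computed
    as finite differences of the recorded slider positions. Values near +1
    suggest the two sliders were being adjusted simultaneously and in the
    same direction; near -1, simultaneously in opposite directions."""
    va = [b - a for a, b in zip(path_a, path_a[1:])]
    vb = [b - a for a, b in zip(path_b, path_b[1:])]
    n = len(va)
    mean_a, mean_b = sum(va) / n, sum(vb) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(va, vb))
    std_a = sum((x - mean_a) ** 2 for x in va) ** 0.5
    std_b = sum((y - mean_b) ** 2 for y in vb) ** 0.5
    return cov / (std_a * std_b)
```

A mixing board makes high simultaneity physically possible (two hands, many handles), which a single mouse cursor cannot match.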
by justin
This week I am listening to “Hesitation Marks” by Nine Inch Nails
by justin
Today I read a paper titled “On the Complexity of Smooth Spline Surfaces from Quad Meshes”
The abstract is:
This paper derives strong relations that the boundary curves of a smooth complex of patches have to obey when the patches are computed by local averaging.
These relations restrict the choice of reparameterizations for geometric continuity.
In particular, when one bicubic tensor-product B-spline patch is associated with each facet of a quadrilateral mesh with n-valent vertices and we do not want segments of the boundary curves forced to be linear, then the relations dictate the minimal number and multiplicity of knots: for general data, the tensor-product spline patches must have at least two internal double knots per edge to be able to model a G^1-connected complex of C^1 splines.
This lower bound on the complexity of any construction is proven to be sharp by suitably interpreting an existing surface construction.
That is, we have a tight bound on the complexity of smoothing quad meshes with bicubic tensor-product B-spline patches.
by justin
Today I finished reading “Boule et Bill – Globe-trotters” by Jean Roba
by justin
Today I finished reading “The Complete Adventures of Tintin” by Herge