Listening – Maya
This week I am listening to “Maya” by M.I.A. (UK)
Archives for 2011
Paper – Object-Oriented Program Comprehension: Effect of Expertise, Task and Phase
Today I read a paper titled “Object-Oriented Program Comprehension: Effect of Expertise, Task and Phase”
The abstract is:
The goal of our study is to evaluate the effect on program comprehension of three factors that have not previously been studied in a single experiment.
These factors are programmer expertise (expert vs. novice), programming task (documentation vs. reuse), and the development of understanding over time (phase 1 vs. phase 2).
This study is carried out in the context of the mental model approach to comprehension based on van Dijk and Kintsch’s model (1983).
One key aspect of this model is the distinction between two kinds of representation the reader might construct from a text: 1) the textbase, which refers to what is said in the text and how it is said, and 2) the situation model, which represents the situation referred to by the text.
We have evaluated the effect of the three factors mentioned above on the development of both the textbase (or program model) and the situation model in object-oriented program comprehension.
We found a four-way interaction of expertise, phase, task and type of model.
For the documentation group we found that experts and novices differ in the elaboration of their situation model but not their program model.
There was no interaction of expertise with phase and type of model in the documentation group.
For the reuse group, there was a three-way interaction between phase, expertise and type of model.
For the novice reuse group, the effect of the phase was to increase the construction of the situation model but not the program model.
With respect to the task, our results show that novices do not spontaneously construct a strong situation model but are able to do so if the task demands it.
Paper – Detecting and Tracking the Spread of Astroturf Memes in Microblog Streams
Today I read a paper titled “Detecting and Tracking the Spread of Astroturf Memes in Microblog Streams”
The abstract is:
Online social media are complementing and in some cases replacing person-to-person social interaction and redefining the diffusion of information.
In particular, microblogs have become crucial grounds on which public relations, marketing, and political battles are fought.
We introduce an extensible framework that will enable the real-time analysis of meme diffusion in social media by mining, visualizing, mapping, classifying, and modeling massive streams of public microblogging events.
We describe a Web service that leverages this framework to track political memes in Twitter and help detect astroturfing, smear campaigns, and other misinformation in the context of U.S. political elections.
We present some cases of abusive behaviors uncovered by our service.
Finally, we discuss promising preliminary results on the detection of suspicious memes via supervised learning based on features extracted from the topology of the diffusion networks, sentiment analysis, and crowdsourced annotations.
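As a rough illustration of that last step, here is a minimal sketch of extracting simple topology features from a meme’s diffusion network and feeding them to an off-the-shelf classifier. It is not the authors’ system; the feature set, the toy graphs, the labels, and the model choice are all my own assumptions.

```python
# Illustrative sketch only: summarize a retweet/mention diffusion graph with a
# few topology features and train a classifier on made-up labels.
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def diffusion_features(graph: nx.DiGraph) -> list:
    """Summarize a diffusion network as a small feature vector."""
    n = graph.number_of_nodes()
    m = graph.number_of_edges()
    out_degrees = [d for _, d in graph.out_degree()]
    return [
        n,                                        # audience size
        m,                                        # number of propagation events
        max(out_degrees) if out_degrees else 0,   # largest single broadcaster
        m / n if n else 0.0,                      # mean events per user
    ]

# Toy example: a "broadcast" meme (one hub) vs. an "organic" chain.
astroturf = nx.DiGraph([(0, i) for i in range(1, 20)])   # star graph
organic = nx.DiGraph([(i, i + 1) for i in range(19)])    # path graph

X = np.array([diffusion_features(astroturf), diffusion_features(organic)])
y = np.array([1, 0])  # 1 = suspicious, 0 = legitimate (labels are illustrative)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X))
```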
Paper – Is the crowd’s wisdom biased? A quantitative assessment of three online communities
Today I read a paper titled “Is the crowd’s wisdom biased? A quantitative assessment of three online communities”
The abstract is:
This paper presents a study of user voting on three websites: Imdb, Amazon and BookCrossings.
It reports on an expert evaluation of the voting mechanisms of each website and a quantitative data analysis of users’ aggregate voting behavior.
The results suggest that voting follows different patterns across the websites, with a higher barrier to vote introducing more one-off voters and attracting mostly experts.
The results also show that one-off voters tend to vote on popular items, while experts mostly vote for obscure, low-rated items.
The study concludes with design suggestions to address the “wisdom of the crowd” bias.
Paper – On weakly optimal partitions in modular networks
Today I read a paper titled “On weakly optimal partitions in modular networks”
The abstract is:
Modularity was introduced as a measure of goodness for the community structure induced by a partition of the set of vertices in a graph.
Then, it also became an objective function used to find good partitions, with high success.
Nevertheless, some works have shown a scaling limit and certain instabilities when finding communities with this criterion.
Modularity has been studied through several formalisms, such as Hamiltonians in a Potts model or Laplacians in spectral partitioning.
In this paper we present a new probabilistic formalism to analyze modularity, and from it we derive an algorithm based on weakly optimal partitions.
This algorithm obtains good quality partitions and also scales to large graphs.
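For reference, here is a minimal sketch of the standard modularity measure the abstract refers to, computed for a given partition of a toy graph. This is the textbook definition, not the paper’s probabilistic formalism or its weakly-optimal-partition algorithm.

```python
# Newman-Girvan modularity Q for a given partition of an undirected graph:
# fraction of edge ends inside each community minus the expectation under the
# configuration (random-rewiring) null model.
import networkx as nx

def modularity(G: nx.Graph, communities: list) -> float:
    m = G.number_of_edges()
    deg = dict(G.degree())
    q = 0.0
    for community in communities:
        internal = sum(1 for u, v in G.edges() if u in community and v in community)
        degree_sum = sum(deg[n] for n in community)
        q += internal / m - (degree_sum / (2 * m)) ** 2
    return q

G = nx.barbell_graph(5, 0)  # two cliques joined by one edge
parts = [set(range(5)), set(range(5, 10))]
print(round(modularity(G, parts), 3))
# networkx provides an equivalent function: networkx.algorithms.community.modularity
```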
Listening – Have One On Me
This week I am listening to “Have One On Me” by Joanna Newsom
Paper – The Good, the Bad, and the Ugly: three different approaches to break their watermarking system
Today I read a paper titled “The Good, the Bad, and the Ugly: three different approaches to break their watermarking system”
The abstract is:
The Good is Blondie, a wandering gunman with a strong personal sense of honor.
The Bad is Angel Eyes, a sadistic hitman who always hits his mark.
The Ugly is Tuco, a Mexican bandit who’s always only looking out for himself.
Against the backdrop of the BOWS contest, they search for a watermark in gold buried in three images.
Each knows only a portion of the gold’s exact location, so for the moment they’re dependent on each other.
However, none are particularly inclined to share…
Studying – Advanced content marketing
This month I am studying “Advanced content marketing”
I want to boost my personal marketing outreach so I am setting a goal of at least three months’ focus on acquiring more marketing skills.
My first class will be in content marketing, run by a friend in Santa Barbara. I built him a sales website; in return, he is giving me access to his six-month online content marketing courseware for free.
Update: That was a lot of fun. Lots of good video, and got some great feedback on my submitted work. I can immediately start applying that new content marketing knowledge going forward with my own marketing efforts.
Logged 25 hours of class time (video tutorials and interactive Skype sessions) and extra exercises.
Listening – Brothers
This week I am listening to “Brothers” by The Black Keys
Read – The Entrepreneur’s Guide to Customer Development
Today I finished reading “The Entrepreneur’s Guide to Customer Development: A Cheat Sheet to the Four Steps to the Epiphany” by Brant Cooper
Just because I did it for free doesn’t mean you can have it for free
I did some contract work for a “gentleman” many years ago who was having difficulty getting the performance he needed out of a homegrown, poorly implemented 3D rendering engine created by developers who had never built a 3D rendering engine before.
I had developed, as a separate personal project, a 3D rendering engine that would fit the bill and solve many of the problems we were suffering from.
He wanted me to hand over the 3D rendering engine for free, with a perpetual, exclusive license.
He pleaded, he threatened, he cajoled, he tried to reason endlessly with me that I should give him the source code “because I wasn’t using it.”
He attempted to reason that because I was not using it for anything at the time, I had no right to attempt to charge him for all of the hours I had put into the work beforehand.
Paper – High Speed and Area Efficient 2D DWT Processor Based Image Compression
Today I read a paper titled “High Speed and Area Efficient 2D DWT Processor Based Image Compression”
The abstract is:
This paper presents a high speed and area efficient DWT processor based design for Image Compression applications.
In this proposed design, a pipelined, partially serial architecture has been used to enhance the speed along with optimal utilization of the resources available on the target FPGA.
The proposed model has been designed and simulated using Simulink and System Generator blocks, synthesized with the Xilinx Synthesis Tool (XST), and implemented on Spartan 2 and Spartan 3 based XC2S100-5tq144 and XC3S500E-4fg320 target devices.
The results show that the proposed design can operate at a maximum frequency of 231 MHz in the case of the Spartan 3, consuming 117 mW at a 28°C junction temperature.
The result comparison has shown an improvement of 15% in speed.
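As a software-side illustration of the transform such a processor computes, here is a one-level 2D Haar DWT sketch in NumPy. The paper describes a pipelined FPGA architecture, and the Haar filter choice here is my assumption rather than anything stated in the abstract.

```python
# One-level 2D Haar DWT: rows first, then columns (separable transform).
import numpy as np

def haar_dwt_1d(x: np.ndarray) -> np.ndarray:
    """Return [approximation | detail] coefficients along the last axis."""
    approx = (x[..., 0::2] + x[..., 1::2]) / np.sqrt(2)
    detail = (x[..., 0::2] - x[..., 1::2]) / np.sqrt(2)
    return np.concatenate([approx, detail], axis=-1)

def haar_dwt_2d(image: np.ndarray) -> np.ndarray:
    """Apply the 1D transform to rows, then to columns."""
    rows = haar_dwt_1d(image)
    return haar_dwt_1d(rows.T).T  # quadrants: LL (top-left), LH, HL, HH

image = np.arange(64, dtype=float).reshape(8, 8)
coeffs = haar_dwt_2d(image)
# Compression keeps the low-frequency LL quadrant and quantizes the rest.
print(coeffs[:4, :4])  # LL sub-band
```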
Listening – The Drums
This week I am listening to “The Drums” by The Drums
Paper – Multimedia Applications of Multiprocessor Systems-on-Chips
Today I read a paper titled “Multimedia Applications of Multiprocessor Systems-on-Chips”
The abstract is:
This paper surveys the characteristics of multimedia systems.
Multimedia applications today are dominated by compression and decompression, but multimedia devices must also implement many other functions such as security and file management.
We introduce some basic concepts of multimedia algorithms and the larger set of functions that multimedia systems-on-chips must implement.
Hulu needs my what now?
I don’t really watch much in the way of television these days. I’m more of an accidental watcher, encountering shows when they are on the TV whilst I am visiting friends. I’ll watch an occasional movie, but most of the stuff that Hollywood puts out bores me to death. I’m not a film snob; I couldn’t tell you what something means, or why a director picked a particular location or technique to convey a certain message, so it is not like I watch independent movies specifically. I just find that life is too short to passively sit there and watch someone else’s entertainment product; there are too many things I want to be doing to sit in front of a screen and not interact with it.
But hey, I’m trying to keep up on modern technology and various internet services, and sometimes I feel like catching up on something such as Good Eats or Mythbusters. I thought Hulu would be an ideal service to sign up for to get a semi-regular fix; they also offer a free 7-day trial, so I can check it out for a week and cancel if I don’t like it or am not using it.
Off I trundled to the Hulu website, all ready to sign up, credit card in hand, because that’s how these 7-day free trials often work. The registration form wants the standard stuff: email address and name. Okay, I can supply a made-up email address that sends all of their spam and marketing into a black hole but still works when I need to recover a password. But now it wants First Name and Last Name. Hmm, okay, nothing too out of the ordinary there. And then it wants my birth date… um… why?
So the software can screen only age appropriate content?
No, probably not as anybody else could use my account or watch over my shoulder.
The web page goes on to require my zip code. Not for billing purposes; I’m not even at the billing screen yet. This is purely for marketing purposes.
And finally, hey, Hulu wants to know my gender too.
Amazing! What next? Next year websites will need to know your blood type or income level before letting you watch TV?
What does my gender or zip code or name have to do with signing up for a subscription-based TV service? Oh, that’s right: so you can advertise and market to me under the guise of ensuring I receive “relevant” content. Gosh, they even have a message, “we promise to always keep this information confidential,” which makes me instantly want to trust them more, because if it’s on a company’s website, the company must obviously stand by everything they say they’ll do. Except for when it inconveniences them and makes it difficult to make money off of your marketing data.
So yeah, supplying lots of data to be able to watch TV, which I don’t much care for anyway? No, I don’t think so. Nobody needs to know any more about the person signing up for a service than the billing address. If the information requirements go beyond that, the reasons a company is collecting the data become highly suspicious.
It comes back to who gains the benefit when this data is supplied, me or the company? And in pretty much every case, it’s the company. If there is no direct, tangible benefit to the end-user, there is no reason to collect the data.
Read – Rich Dad’s Before You Quit Your Job
Today I finished reading “Rich Dad’s Before You Quit Your Job: 10 Real-Life Lessons Every Entrepreneur Should Know About Building a Multimillion-Dollar Business” by Robert T. Kiyosaki
Listening – Congratulations
This week I am listening to “Congratulations” by MGMT
Paper – Warping Peirce Quincuncial Panoramas
Today I read a paper titled “Warping Peirce Quincuncial Panoramas”
The abstract is:
The Peirce quincuncial projection is a mapping of the surface of a sphere to the interior of a square.
It is a conformal map except for four points on the equator.
These points of non-conformality cause significant artifacts in photographic applications.
In this paper, we propose an algorithm and user-interface to mitigate these artifacts.
Moreover, in order to facilitate an interactive user-interface, we present a fast algorithm for calculating the Peirce quincuncial projection of spherical imagery.
We then promote the Peirce quincuncial projection as a viable alternative to the more popular stereographic projection in some scenarios.
Paper – Virtual Reality
Today I read a paper titled “Virtual Reality”
The abstract is:
This paper is focused on the presentation of Virtual Reality principles together with the main implementation methods and techniques.
An overview of the main development directions is included.
Paper – Semantic Modeling and Retrieval of Dance Video Annotations
Today I read a paper titled “Semantic Modeling and Retrieval of Dance Video Annotations”
The abstract is:
Dance video is one of the important types of narrative video, with semantically rich content.
This paper proposes a new meta model, Dance Video Content Model (DVCM) to represent the expressive semantics of the dance videos at multiple granularity levels.
The DVCM is designed based on the concepts such as video, shot, segment, event and object, which are the components of MPEG-7 MDS.
This paper introduces a new relationship type called Temporal Semantic Relationship to infer the semantic relationships between the dance video objects.
An inverted file based index is created to reduce the search time of dance queries.
The effectiveness of containment queries is evaluated using precision and recall.
Keywords: Dance Video Annotations, Effectiveness Metrics, Metamodeling, Temporal Semantic Relationships.
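The inverted-file idea mentioned above is easy to sketch. The annotation records and keywords below are made up, and this shows only the indexing and containment-query step, not the DVCM model itself.

```python
# Minimal inverted-file index over annotation keywords, with a containment
# query that returns segments carrying all of the requested keywords.
from collections import defaultdict

annotations = {
    "video1/shot3": ["spin", "duet", "stage-left"],
    "video1/shot7": ["leap", "solo"],
    "video2/shot1": ["spin", "solo"],
}

inverted = defaultdict(set)
for segment, keywords in annotations.items():
    for kw in keywords:
        inverted[kw].add(segment)

def query(*keywords: str) -> set:
    """Containment query: segments annotated with all given keywords."""
    sets = [inverted[kw] for kw in keywords]
    return set.intersection(*sets) if sets else set()

print(query("spin", "solo"))   # {'video2/shot1'}
```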
Paper – Toward a general theory of quantum games
Today I read a paper titled “Toward a general theory of quantum games”
The abstract is:
We study properties of quantum strategies, which are complete specifications of a given party’s actions in any multiple-round interaction involving the exchange of quantum information with one or more other parties.
In particular, we focus on a representation of quantum strategies that generalizes the Choi-Jamiołkowski representation of quantum operations.
This new representation associates with each strategy a positive semidefinite operator acting only on the tensor product of its input and output spaces.
Various facts about such representations are established, and two applications are discussed: the first is a new and conceptually simple proof of Kitaev’s lower bound for strong coin-flipping, and the second is a proof of the exact characterization QRG = EXP of the class of problems having quantum refereed games.
Read – The Grand Design
Today I finished reading “The Grand Design” by Stephen Hawking
Read – The Greatest Show on Earth
Today I finished reading “The Greatest Show on Earth: The Evidence for Evolution” by Richard Dawkins
Paper – Bottom-Up Earley Deduction
Today I read a paper titled “Bottom-Up Earley Deduction”
The abstract is:
We propose a bottom-up variant of Earley deduction.
Bottom-up deduction is preferable to top-down deduction because it allows incremental processing (even for head-driven grammars), it is data-driven, no subsumption check is needed, and preference values attached to lexical items can be used to guide best-first search.
We discuss the scanning step for bottom-up Earley deduction and indexing schemes that help avoid useless deduction steps.
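As a small illustration of bottom-up, data-driven parsing, here is a CYK-style recognizer for a tiny hand-made CNF grammar. The paper works in the more general setting of Earley deduction over logic grammars, so treat this only as a sketch of the bottom-up idea, not the paper’s method.

```python
# CYK-style bottom-up recognition over a toy CNF grammar.
unary = {            # terminal -> nonterminals
    "they": {"NP"}, "fish": {"NP", "V"}, "can": {"V", "Aux"},
}
binary = {           # (B, C) -> nonterminals A with rule A -> B C
    ("NP", "VP"): {"S"}, ("V", "NP"): {"VP"},
    ("Aux", "VP"): {"VP"}, ("V", "V"): {"VP"},
}

def recognize(words: list) -> bool:
    n = len(words)
    # chart[i][j] = set of nonterminals spanning words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(unary.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b in chart[i][k]:
                    for c in chart[k][j]:
                        chart[i][j] |= binary.get((b, c), set())
    return "S" in chart[0][n]

print(recognize("they can fish".split()))  # True
```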
Listening – Broken Bells
This week I am listening to “Broken Bells” by Broken Bells
Read – The Best of Larry Niven
Today I finished reading “The Best of Larry Niven” by Larry Niven
Paper – Pattern Recognition System Design with Linear Encoding for Discrete Patterns
Today I read a paper titled “Pattern Recognition System Design with Linear Encoding for Discrete Patterns”
The abstract is:
In this paper, designs and analyses of compressive recognition systems are discussed, and also a method of establishing a dual connection between designs of good communication codes and designs of recognition systems is presented.
Pattern recognition systems based on compressed patterns and compressed sensor measurements can be designed using low-density matrices.
We examine truncation encoding where a subset of the patterns and measurements are stored perfectly while the rest is discarded.
We also examine the use of LDPC parity check matrices for compressing measurements and patterns.
We show how more general ensembles of good linear codes can be used as the basis for pattern recognition system design, yielding system design strategies for more general noise models.
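Here is a rough sketch of recognition from compressed measurements, with a sparse random 0-1 matrix standing in for an LDPC parity-check matrix and a nearest-neighbor match in the compressed domain. The dimensions, noise model, and matching rule are my assumptions, not the paper’s constructions.

```python
# Store only compressed codes of the pattern library, then match a noisy
# compressed measurement against them.
import numpy as np

rng = np.random.default_rng(0)
n, m, num_patterns = 256, 64, 10            # ambient dim, compressed dim, library size

A = (rng.random((m, n)) < 0.05).astype(float)      # sparse measurement matrix
patterns = rng.integers(0, 2, size=(num_patterns, n)).astype(float)
compressed_library = patterns @ A.T                # keep only m-dimensional codes

# A noisy measurement of pattern 3: flip a few bits, then compress.
x = patterns[3].copy()
flips = rng.choice(n, size=10, replace=False)
x[flips] = 1 - x[flips]
y = A @ x

best = np.argmin(np.linalg.norm(compressed_library - y, axis=1))
print(best)  # expected: 3
```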
Paper – Optimizing Web Sites for Customer Retention
Today I read a paper titled “Optimizing Web Sites for Customer Retention”
The abstract is:
With customer relationship management (CRM), companies move away from a mainly product-centered view to a customer-centered view.
Resulting from this change, the effective management of how to keep contact with customers throughout different channels is one of the key success factors in today’s business world.
Company Web sites have evolved in many industries into an extremely important channel through which customers can be attracted and retained.
To analyze and optimize this channel, accurate models of how customers browse through the Web site and what information within the site they repeatedly view are crucial.
Typically, data mining techniques are used for this purpose.
However, there already exist numerous models developed in marketing research for traditional channels which could also prove valuable to understanding this new channel.
In this paper we propose the application of an extension of the Logarithmic Series Distribution (LSD) model to repeat-usage of Web-based information, and thus to analyze and optimize a Web site’s capability to support one goal of CRM: retaining customers.
As an example, we use the university’s blended learning web portal with over a thousand learning resources to demonstrate how the model can be used to evaluate and improve the Web site’s effectiveness.
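As a sketch of the underlying model, here is a textbook maximum-likelihood fit of a plain logarithmic series distribution to repeat-visit counts. The paper applies an extension of the LSD, and the visit counts below are synthetic.

```python
# Fit a logarithmic series distribution (LSD) by matching its mean to the
# sample mean of repeat-visit counts.
import numpy as np
from scipy.optimize import brentq

def lsd_mean(theta: float) -> float:
    return -theta / ((1 - theta) * np.log(1 - theta))

def lsd_pmf(k: int, theta: float) -> float:
    return -theta**k / (k * np.log(1 - theta))

# How many times each user re-viewed a learning resource (made-up data).
visits = np.array([1, 1, 1, 1, 2, 1, 3, 1, 2, 1, 1, 5, 1, 2, 1])

theta_hat = brentq(lambda t: lsd_mean(t) - visits.mean(), 1e-9, 1 - 1e-9)
print(f"theta = {theta_hat:.3f}, P(repeat visit) = {1 - lsd_pmf(1, theta_hat):.3f}")
```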
Studying – Building cross-platform games
This month I am studying “Building cross-platform games”
I have been building cross-platform games for console, desktop and mobile for decades. I am not expecting to learn much on this course, but it will be nice to relax and just follow along and do the class work and class projects with everyone else.
Update: Over the three-day in-person class (six hours per day; even though the class starts at 9AM and ends at 5PM, you cannot really count lunch and the coffee breaks) I got 19 hours of class time and one-on-one time with the instructor.
Listening – Swim
This week I am listening to “Swim” by Caribou
Paper – An analysis of a random algorithm for estimating all the matchings
Today I read a paper titled “An analysis of a random algorithm for estimating all the matchings”
The abstract is:
Counting the number of all the matchings on a bipartite graph has been transformed by Yan Huo into calculating the permanent of a matrix obtained from the extended bipartite graph, and Rasmussen presents a simple approach (RM) to approximate the permanent, which yields a critical ratio of O($n\omega(n)$) for almost all 0-1 matrices, making it a simple, promising, practical way to compute this #P-complete problem.
In this paper, the performance of this method will be shown when it’s applied to compute all the matchings based on that transformation.
The critical ratio will be proved to be very large with a certain probability, with an increasing factor larger than any polynomial of $n$, even for almost all 0-1 matrices.
Hence, RM fails to work well when counting all the matchings via computing the permanent of the matrix.
In other words, we must carefully utilize the known methods of estimating the permanent to count all the matchings through that transformation.
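For context, here is a minimal sketch of Rasmussen’s randomized estimator (the RM approach the abstract critiques) applied to a toy 0-1 matrix. It is unbiased, but its variance can blow up, which is the failure mode the paper analyzes for the matchings transformation.

```python
# Rasmussen's estimator: pick a random admissible column in the first row,
# multiply by the number of choices, and recurse on the minor.
import random

def rasmussen_estimate(A: list) -> int:
    """One unbiased sample of the permanent of a 0-1 matrix."""
    n = len(A)
    if n == 0:
        return 1
    choices = [j for j in range(n) if A[0][j] == 1]
    if not choices:
        return 0
    j = random.choice(choices)
    minor = [[row[c] for c in range(n) if c != j] for row in A[1:]]
    return len(choices) * rasmussen_estimate(minor)

A = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]            # permanent is 3
samples = 20000
print(sum(rasmussen_estimate(A) for _ in range(samples)) / samples)
```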
Paper – The Labor Economics of Paid Crowdsourcing
Today I read a paper titled “The Labor Economics of Paid Crowdsourcing”
The abstract is:
Crowdsourcing is a form of “peer production” in which work traditionally performed by an employee is outsourced to an “undefined, generally large group of people in the form of an open call.” We present a model of workers supplying labor to paid crowdsourcing projects.
We also introduce a novel method for estimating a worker’s reservation wage–the smallest wage a worker is willing to accept for a task and the key parameter in our labor supply model.
We show that the reservation wages of a sample of workers from Amazon’s Mechanical Turk (AMT) are approximately log normally distributed, with a median wage of $1.38/hour.
At the median wage, the point elasticity of extensive labor supply is 0.43.
We discuss how to use our calibrated model to make predictions in applied work.
Two experimental tests of the model show that many workers respond rationally to offered incentives.
However, a non-trivial fraction of subjects appear to set earnings targets.
These “target earners” consider not just the offered wage–which is what the rational model predicts–but also their proximity to earnings goals.
Interestingly, a number of workers clearly prefer earning total amounts evenly divisible by 5, presumably because these amounts make good targets.
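A hedged sketch of the reservation-wage summary: fit a log-normal to a sample of reservation wages and read off the median. The wages below are synthetic; only the reported median of $1.38/hour comes from the paper.

```python
# Fit a log-normal distribution to reservation wages; with loc fixed at 0 the
# fitted scale parameter equals the distribution's median.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic reservation wages drawn log-normally around the paper's median.
wages = rng.lognormal(mean=np.log(1.38), sigma=0.9, size=500)

shape, loc, scale = stats.lognorm.fit(wages, floc=0)
print(f"fitted median: ${scale:.2f}/hour, sigma: {shape:.2f}")
```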
Read – Just Listen
Today I finished reading “Just Listen: Discover the Secret to Getting Through to Absolutely Anyone” by Mark Goulston
Listening – Tourist History
This week I am listening to “Tourist History” by Two Door Cinema Club
GPU Cards Assemble!
Just got done assembling a nice little “super-computer” consisting of a bunch of NVIDIA GPUs in a rack mount case.
Power-on and self-test shows me… I have way too much computing power and a cooling problem. Looks like I am going to have to route some cooling ducts out of the server closet and through the wall to the outside.
I like this stack of h/w. I shall call it “BabyBlue” for the obvious play on words.
Paper – An Algebraic Approach for the MIMO Control of Small Scale Helicopter
Today I read a paper titled “An Algebraic Approach for the MIMO Control of Small Scale Helicopter”
The abstract is:
The control of small-scale helicopter is a MIMO problem.
To use the classical control approach to formally solve a MIMO problem, one needs to come up with a multidimensional Root Locus diagram to tune the control parameters.
The problem with the required dimension of the RL diagram for MIMO design has forced the design procedure of classical approach to be conducted in cascaded multi-loop SISO system starting from the innermost loop outward.
To implement this control approach for a helicopter, a pitch and roll attitude control system is often subordinated to a longitudinal and lateral velocity control system, respectively, in a nested architecture.
The requirement for this technique to work is that the inner attitude control loop must have a higher bandwidth than the outer velocity control loop, which is not the case for a high-performance mini helicopter.
To address the above problems, an algebraic design approach is proposed in this work.
The designed control using the s-CDM approach is demonstrated for hovering control of a small-scale helicopter simultaneously subjected to plant parameter uncertainties and wind disturbances.
Read – Power
Today I finished reading “Power: Why Some People Have it and Others Don’t” by Jeffrey Pfeffer
Listening – Public Strain
This week I am listening to “Public Strain” by Women
Paper – Discernment of Hubs and Clusters in Socioeconomic Networks
Today I read a paper titled “Discernment of Hubs and Clusters in Socioeconomic Networks”
The abstract is:
Interest in the analysis of networks has grown rapidly in the new millennium.
Consequently, we promote renewed attention to a certain methodological approach introduced in 1974.
Over the succeeding decade, this two-stage procedure (double-standardization followed by single-linkage-like hierarchical clustering) was applied to a wide variety of weighted, directed networks of a socioeconomic nature, frequently revealing the presence of “hubs”.
These were typically, in the numerous instances studied of migration flows between geographic subdivisions within nations, “cosmopolitan/non-provincial” areas, a prototypical example being the French capital, Paris.
Such locations emit and absorb people broadly across their respective nations.
Additionally, the two-stage procedure, which “might very well be the most successful application of cluster analysis” (R. C. Dubes, 1985), detected many (physically or socially) isolated, functional groups (regions) of areas, such as the southern islands, Shikoku and Kyushu, of Japan, the Italian islands of Sardinia and Sicily, and the New England region of the United States.
Further, we discuss a (complementary) approach developed in 1976, in which the max-flow/min-cut theorem was applied to raw/non-standardized (interindustry, as well as migration) flows.
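A rough sketch of the two-stage idea on a made-up flow matrix: iteratively standardize rows and columns to remove size effects, then single-linkage cluster the areas. The exact standardization and distance used in the original procedure may differ, so this is only illustrative.

```python
# Stage 1: alternate row/column normalization of a flow matrix.
# Stage 2: single-linkage clustering on a dissimilarity derived from the
# standardized flows.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

flows = np.array([[0, 90, 10, 5],
                  [80, 0, 15, 5],
                  [10, 10, 0, 70],
                  [5, 15, 60, 0]], dtype=float) + 1e-9   # 4 areas, tiny epsilon

F = flows.copy()
for _ in range(200):
    F = F / F.sum(axis=1, keepdims=True)   # standardize rows
    F = F / F.sum(axis=0, keepdims=True)   # standardize columns

similarity = (F + F.T) / 2                  # symmetrize
distance = similarity.max() - similarity    # large flow -> small distance
np.fill_diagonal(distance, 0.0)

Z = linkage(squareform(distance, checks=False), method="single")
print(fcluster(Z, t=2, criterion="maxclust"))   # e.g. [1 1 2 2]
```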
Read – Agents For Games And Simulations
Today I finished reading “Agents For Games And Simulations: Trends In Techniques, Concepts And Design” by Frank Dignum
Read – I’m Tempted to Stop Acting Randomly
Today I finished reading “I’m Tempted to Stop Acting Randomly” by Scott Adams
Conceit
Apparently I am full of conceit.
A friend asked my fiancée and me to spend four days helping with the on-site organization and catering of their wedding.
They didn’t want a gift, just some time and help.
We were happy to oblige, though I could only dedicate two days to the endeavour due to work commitments.
After about $500 of expense (travel, pet sitter for cats, kennel for dog) I dutifully showed up the day beforehand.
On the day of the wedding I was approached and asked, “Did you bring a gift?”
“I did not.” I responded.
“It’s a bit much to show up to a wedding without a gift.” I was chastised.
“I paid my own way here, from another state, my wife has dedicated several days to on-site organizing, I will be performing many menial chores and some of the post-wedding cooking. I think that is gift enough.” I explained.
“You should bring a real gift next time.”
“I am the gift.” I responded.
“You’re so conceited.” was declared with much derision.
Listening – Forgiveness Rock Record
This week I am listening to “Forgiveness Rock Record” by Broken Social Scene
Read – Conan #8: Black Colossus
Today I finished reading “Conan #8: Black Colossus” by Timothy Truman
Read – Thermodynamics Demystified
Today I finished reading “Thermodynamics Demystified” by Merle Potter
Paper – Fractal Basins and Boundaries in 2D Maps inspired in Discrete Population Models
Today I read a paper titled “Fractal Basins and Boundaries in 2D Maps inspired in Discrete Population Models”
The abstract is:
Two-dimensional maps can model interactions between populations.
Despite their simplicity, these dynamical systems can show some complex situations, such as multistability or fractal boundaries between basins, that lead to remarkable pictures.
Some of them are shown and explained here for three different 2D discrete models.
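To see how such basin pictures are produced, here is a small sketch that colors initial conditions of a coupled logistic-type 2D map by whether the orbit stays bounded. The map and its parameters are my own illustrative choice, not the three models studied in the paper.

```python
# Color a grid of initial conditions by whether the orbit of a 2D discrete map
# remains bounded (a crude basin-of-attraction picture).
import numpy as np
import matplotlib.pyplot as plt

def step(x, y, r=3.6, c=0.06):
    """One iteration of a simple coupled logistic-type 2D map."""
    return r * x * (1 - x) + c * (y - x), r * y * (1 - y) + c * (x - y)

def escapes(x0, y0, iters=200, bound=10.0):
    x, y = x0, y0
    for _ in range(iters):
        x, y = step(x, y)
        if abs(x) > bound or abs(y) > bound:
            return 1
    return 0

xs = np.linspace(-0.2, 1.2, 200)
ys = np.linspace(-0.2, 1.2, 200)
basin = np.array([[escapes(x, y) for x in xs] for y in ys])

plt.imshow(basin, extent=[xs[0], xs[-1], ys[0], ys[-1]], origin="lower", cmap="gray")
plt.xlabel("x0")
plt.ylabel("y0")
plt.title("bounded (black) vs escaping (white) orbits")
plt.show()
```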
Read – Maximum Ride #2
Today I finished reading “Maximum Ride #2” by James Patterson
Read – Agatha Heterodyne and the Airship City
Today I finished reading “Agatha Heterodyne and the Airship City” by Phil Foglio
Listening – The Courage Of Others
This week I am listening to “The Courage Of Others” by Midlake
Read – Mastering the VC Game
Today I finished reading “Mastering the VC Game: A Venture Capital Insider Reveals How to Get from Start-up to IPO on Your Terms” by Jeffrey Bussgang