Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

December 27, 2011

Thinking, Fast and Slow

Thinking, Fast and Slow by Daniel Kahneman, Farrar, Straus and Giroux, New York, 2011.

I got a copy of “Thinking, Fast and Slow” for Christmas and it has already proven to be an enjoyable read.

Kahneman says early on (page 28):

The premise of this book is that it is easier to recognize other people’s mistakes than our own.

I thought about that line when I read a note from a friend that topic maps needed more than my:

tagging everything with “Topic Maps….”

Which means I haven’t been clear about the reasons for the breadth of materials I have covered and will be covering in this blog.

One premise of this blog is that the use and recognition of identifiers is essential for communication.

Another premise of this blog is that it is easier for us to study the use and recognition of identifiers by others, for much the same reasons we can recognize the mistakes of others more easily.

The use and recognition of identifiers by others aren’t mistakes but they may be different from those we would make. In cases where they differ from ours, we have a unique opportunity to study the choices made and the impacts of those choices. And we may learn patterns in those choices that we can eventually see in our own choices.

Understanding the use and recognition of identifiers in a particular circumstance, and the requirements for that use and recognition, is the first step towards deciding whether topic maps would be useful in that circumstance and, if so, in what way.

For example, when processing social security records in the United States, anything other than “bare” identifiers like the social security number may be unnecessary and add load with no corresponding benefit. Aligning social security records with bank records, however, might require reconsidering the judgment to use only social security numbers. (Some information sharing is “against the law.” But as the Sheriff in “O Brother, Where Art Thou?” says: “The law is a man made thing.” Laws change, or you can commission absurdist interpretations of them.)
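As a minimal sketch of that alignment problem, assume two entirely hypothetical record layouts: benefit records keyed by SSN alone, and bank records keyed by account number but carrying an SSN field. Within the social security system the bare SSN suffices; it is the join with a second system that makes identifier choices visible:

```python
# Hypothetical record layouts, for illustration only.
ssa_records = {
    "123-45-6789": {"name": "J. Smith", "benefit": 1200},
}
bank_records = {
    "ACCT-0042": {"ssn": "123-45-6789", "balance": 5300},
}

def align(ssa, bank):
    """Join the two record sets on the shared identifier (the SSN)."""
    joined = {}
    for acct, rec in bank.items():
        ssn = rec["ssn"]
        if ssn in ssa:
            # The SSN is the point of alignment between the systems.
            joined[ssn] = {**ssa[ssn], "account": acct,
                           "balance": rec["balance"]}
    return joined

print(align(ssa_records, bank_records))
```

Trivial here, but only because the second system happened to carry the first system’s identifier. When it does not, someone has to decide how the identifiers map, which is exactly the question above.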

Topic maps aren’t everywhere but identifiers and recognition of identifiers are.

Understanding identifiers and their recognition will help you choose the most appropriate solution to a problem.

December 18, 2011

The best way to get value from data is to give it away

Filed under: Data,Marketing — Patrick Durusau @ 8:49 pm

The best way to get value from data is to give it away from the Guardian.

From the article:

Last Friday I wrote a short piece for the Datablog giving some background and context for a big open data policy package that was announced yesterday morning by Vice President Neelie Kroes. But what does the package contain? And what might the new measures mean for the future of open data in Europe?

The announcement contained some very strong language in support of open data. Open data is the new gold, the fertile soil out of which a new generation of applications and services will grow. In a networked age, we all depend on data, and opening it up is the best way to realise its value, to maximise its potential.

There was little ambiguity about the Commissioner’s support for an ‘open by default’ position for public sector information, nor for her support for the open data movement, for “those of us who believe that the best way to get value from data is to give it away“. There were props to Web Inventor Tim Berners-Lee, the Open Knowledge Foundation, OpenSpending, WheelMap, and the Guardian Datablog, amongst others.

Open government data at no or low cost represents a real opportunity for value-add data vendors. Particularly those using topic maps.

Topic maps enable the creation of data products that can be easily integrated with data products created from different perspectives.

Not to mention reuse of data analysis to create new products to respond to public demand.

For example, after the recent misfortunes with flooding and nuclear reactors in Japan, there was an upsurge of interest in the safety of reactors in other countries. The information provided by news outlets was equal parts summary and reassurance. A data product that mapped together known issues with the plants in Japan, their design, inspection reports on reactors in some locale, plus maps of their locations, etc., would have found a ready audience.

Creation of a data product like that, in time to catch the increase in public interest, would depend on prior analysis of large amounts of public data. Analysis that could be re-used for a variety of purposes.

December 17, 2011

P2PU

Filed under: Education,Marketing,Topic Maps — Patrick Durusau @ 7:53 pm

P2PU

From the website:

At P2PU, people work together to learn a particular topic by completing tasks, assessing individual and group work, and providing constructive feedback.

I just ran across the site today but was wondering if anyone else has used it or something similar? In order to grow the usage of topic maps, some sort of classes need to appear on a regular basis. Ones that are more widely available than graduate courses at some institutions.

Good idea? Bad idea? Comments?

December 15, 2011

SQL Database-as-a-Service

Filed under: Marketing,PostgreSQL,Topic Maps — Patrick Durusau @ 7:44 pm

SQL Database-as-a-Service

Documentation

Just starting on the documentation, but two quick thoughts:

First, most conventionally, this could be the back-end to a topic map server. Having started off many years ago in server administration (or perhaps because of it), I don’t think server configuration/management should be an “additional” duty for mission critical development/support staff. It is too easy to hire server management services, which can provide maintenance/support that no small firm could afford locally.

Second, a bit more unconventionally, this could be an illustration of a Topic-Map-As-Service. Think about it. If, instead of the mish-mash that is Wikipedia, you had a topic map of facts that were supplemented by (read: mapped to) records from various public reporting agencies, it could be interesting to “press” a local topic map against that service to acquire more recent data.

True, there are the public record services but they only give you person-by-person records, not relationships between them. Not to mention that if you are inventive, you could create some very interesting topic maps (intersections of records).

Imagine the stir that a topic map of license plates of cars with local plates at motels would cause. Rather than offering free access, since most people would only be interested in one license plate in particular, show one plate at some random time each hour and where it was seen (not the date). Sell advertising for the page where you offer the “free” sneak peek. Suspect you had better have some load expansion capacity.
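For what it’s worth, a toy sketch of that teaser logic, with a hypothetical list of sightings; the pick is seeded by the current hour so every visitor sees the same plate, and the date is deliberately withheld:

```python
import random
import time

# Hypothetical sightings: (plate, motel, date) tuples collected elsewhere.
sightings = [
    ("ABC-1234", "Starlight Motel", "2011-12-01"),
    ("XYZ-9876", "Roadside Inn", "2011-12-03"),
]

def hourly_peek(now=None):
    """Show one random sighting per hour: plate and place, not the date."""
    hour = int((now or time.time()) // 3600)
    rng = random.Random(hour)          # same pick for everyone this hour
    plate, motel, _date = rng.choice(sightings)
    return {"plate": plate, "seen_at": motel}

print(hourly_peek())
```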

Ambiguity in the Cloud

Filed under: Cloud Computing,Marketing,Topic Maps,Uncategorized — Patrick Durusau @ 7:43 pm

If you are interested at all in cloud computing and its adoption, you need to read US Government Cloud Computing Technology Roadmap Volume I Release 1.0 (Draft). I know, a title like that is hardly inviting. But read it anyway. It is part of a three-volume set; for the other volumes, see: NIST Cloud Computing Program.

Would you care to wager how many of the ten (10) requirements cited a need for interoperability that is presently lacking due to different understandings and terminology, in other words, ambiguity?

Good decision.

The answer? 8 out of 10 requirements cited by NIST have interoperability as a component.

The plan from NIST is to develop a common model, which will be a useful exercise, but how do we discuss differing terminologies until we can arrive at a common one?

Or allow for discussion of previous SLAs, for example, after we have all moved on to a new terminology?

If you are looking for a “hot” topic that could benefit from the application of topic maps (as opposed to choir programs at your local church during the Great Depression), this could be the one. One of those is a demonstration of a commercial grade technology, the other is at best a local access channel offering. You pick which is which.

December 10, 2011

Whentotweet.com – Twitter analytics for the masses

Filed under: Marketing — Patrick Durusau @ 8:10 pm

Whentotweet.com – Twitter analytics for the masses

From the post:

Twitter handles an amazing number of Tweets – over 200 million tweets are sent per day.

We saw that many Twitter users were tweeting interesting content but much of it was lost in the constant stream of tweets.

Whentotweet.com is born

While there were many tools for corporate Twitter users that performed deep analytics and provided insight into their tweets, there were none that answered the most basic question: what time of the day are my followers actually using Twitter?

And so the idea behind Whentotweet was born. In its current form, Whentotweet analyzes when your followers tweet and gives you a personalized recommendation of the best time of day to tweet to reach as many as possible.

I mention this in part so that you may become better at getting your messages about topic maps out over Twitter.

An equally pragmatic reason is that the success of topic maps depends on the identification of use cases that will seem perfectly natural once you suggest them. Take this site/service as an example of meeting a need that is “obvious” once someone points it out.
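The core of such a recommendation is simple enough to sketch. Assuming you have already pulled your followers’ tweet timestamps (the data below is hypothetical), you bucket them by hour of day and report the peak:

```python
from collections import Counter
from datetime import datetime

# Hypothetical follower tweet timestamps gathered from the Twitter API.
timestamps = [
    datetime(2011, 12, 10, 9, 15),
    datetime(2011, 12, 10, 9, 40),
    datetime(2011, 12, 10, 17, 5),
]

def best_hour_to_tweet(stamps):
    """Recommend the hour of day when followers are most active."""
    by_hour = Counter(ts.hour for ts in stamps)
    hour, count = by_hour.most_common(1)[0]
    return hour, count

hour, count = best_hour_to_tweet(timestamps)
print(f"Tweet around {hour:02d}:00 ({count} follower tweets in that hour)")
```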

Try it at: www.whentotweet.com

December 9, 2011

Twenty Rules for Good Graphics

Filed under: Graphics,Marketing — Patrick Durusau @ 8:23 pm

Twenty Rules for Good Graphics

Rob J Hyndman outlines twenty (20) rules for production of good graphics.

Written for graphics in statistical publications but applicable to other graphics as well.

Communicating topic maps to others is hard enough without the burden of poor graphics.

November 30, 2011

NPR’s radio series on Big Data

Filed under: BigData,Marketing — Patrick Durusau @ 8:13 pm

NPR’s radio series on Big Data by David Smith.

Public-radio network NPR has just broadcast a 2-part series about Big Data on its Morning Edition program. For anyone with 10 minutes to spare, it’s a great overview of the impact of Big Data and the data scientists who derive value from the data. Part 1 is about companies that make use of Big Data, and the implications for businesses and individuals. Part 2 is about the demand for data scientists to analyze big data. (Key quote: “Math and Statistics are the sexiest skills around”.) You can listen to both segments online at the links below.

NPR: Following Digital Breadcrumbs To ‘Big Data’ Gold; The Search For Analysts To Make Sense Of ‘Big Data’

From Revolution Analytics, a great place to hang out for R and other news.

You would have to work for NPR to think: “Math and Statistics are the sexiest skills around” 😉

Seriously, the demand for making sense out of the coming flood of data (you haven’t seen anything yet) is only going to increase. All the “let’s stop while we take 6 months to analyze a particular data set” type solutions are going to be swept away. Analysis is going to be required, but on a cost-benefit basis. And one of the benefits isn’t going to be “works with your software.”

Ad Hoc Normalization II

Filed under: Marketing,Normalization,SQL,Topic Maps — Patrick Durusau @ 8:09 pm

After writing Ad Hoc Normalization, it occurred to me that topic maps offer another form of “ad hoc” normalization.

I don’t know what else you would call merging two topic maps together.

Try that with two relational databases.

So, not only can topic maps maintain “internal” ad hoc normalization but also “external” ad hoc normalization with data sources that were not present at the time of the creation of a topic map.
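A minimal sketch of that external merge, with topics reduced to dictionaries keyed by subject identifier (a deliberate simplification of the TMDM, and the identifiers below are hypothetical): two maps authored independently collapse wherever they identify the same subject.

```python
def merge_topic_maps(a, b):
    """Merge two topic maps keyed by subject identifier.

    Topics sharing an identifier collapse into one topic whose names
    are the union of both: normalization decided at merge time,
    not at design time.
    """
    merged = {sid: dict(t) for sid, t in a.items()}
    for sid, topic in b.items():
        if sid in merged:
            merged[sid]["names"] = sorted(
                set(merged[sid]["names"]) | set(topic["names"]))
        else:
            merged[sid] = dict(topic)
    return merged

# Two maps authored independently, overlapping on one subject.
map_a = {"http://example.org/puccini": {"names": ["Puccini"]}}
map_b = {"http://example.org/puccini": {"names": ["Giacomo Puccini"]},
         "http://example.org/verdi": {"names": ["Verdi"]}}

print(merge_topic_maps(map_a, map_b))
```

Neither map had to know about the other when it was written; the subject identifiers carry the normalization.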

But there are other forms of normalization.

Recall that Lars Marius talks about the reduction of information items that represent the same subjects. That can only occur when there is a set of information items that obey the same data model and usually the same syntax. I would call that information model normalization. That is, whatever is supported by a particular information model can be normalized.

For relational databases that is normalization by design and for topic maps that is ad hoc normalization (although some of it could be planned in advance as well).

But there is another form of normalization, a theoretical construct: subject-based normalization. I say it is theoretical because in order to instantiate a particular case you have to cross over into the land of information model normalization.

I find subject-based normalization quite useful, mostly because as human designers/authors, we are not constrained by the limits of our machines. We can hold contradictory ideas at the same time without requiring a cold or hot reboot. Subject-based normalization allows us to communicate with other users what we have seen in data and how we need to process it for particular needs.

November 29, 2011

Ad Hoc Normalization

Filed under: Marketing,Normalization,SQL,Topic Maps — Patrick Durusau @ 8:46 pm

I really should not start reading Date over the weekend. It puts me in a relational frame of mind and I start thinking of explanations of topic maps in terms of the relational model.

For example, take his definition of:

First normal form: A relvar is in 1NF if and only if, in every legal value of that relvar, every tuple contains exactly one value for each attribute. (page 358)

Second normal form: (definition assuming only one candidate key, which we assume is the primary key): a relvar is in 2NF if and only if it is in 1NF and every nonkey attribute is irreducibly dependent on the primary key. (page 361)

Third normal form: (definition assuming only one candidate key, which we assume is the primary key): A relvar is in 3NF if and only if it is in 2NF and every nonkey attribute is nontransitively dependent on the primary key. (page 363)

Third normal form (even more informal definition): A relvar is in third normal form (3NF) if and only if, for all time, each tuple consists of a primary key value that identifies some entity, together with a set of zero or more mutually independent values that describe that entity in some way.

Does that mean that topic maps support ad hoc normalization? That is, we don’t have to design in normalization before we start writing the topic map, but can decide which subjects need to be “normalized,” that is, represented by topics that reduce to a single representative, after we have started writing the topic map.

Try that with a relational database and tables of any complexity. If you don’t get it right at the design stage, fixing it becomes more expensive as time goes by.

Not a “dig” at relational databases. If your domain is that slow changing and other criteria point to a relational solution, by all means, use one. Performance numbers are hard to beat.

On the other hand, if you need “normalization” and yet you have a rapidly changing environment that is subject to exploration and mappings across domains, you should give topic maps a hard look. Ask for “Ad Hoc Normalization” by name. 😉

PS: I suspect this is what Lars Marius meant by Topic Maps Data Model (TMDM) 6. Merging, 6.1 General:

A central operation in Topic Maps is that of merging, a process applied to a topic map in order to eliminate redundant topic map constructs in that topic map. This clause specifies in which situations merging shall occur, but the rules given here are insufficient to ensure that all redundant information is removed from a topic map.

Any change to a topic map that causes any set to contain two information items equal to each other shall be followed by the merging of those two information items according to the rules given below for the type of information item to which the two equal information items belong.

But I wasn’t “hearing” “…eliminate redundant topic map constructs…” as “normalization.”
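Read as pseudocode, the TMDM rule above is a fixpoint computation: merging two items can create new equalities, so you repeat until nothing changes. A toy sketch, with topic equality simplified to “shares a subject identifier” (the identifiers are hypothetical):

```python
def tmdm_style_merge(topics):
    """Repeatedly merge topics sharing any subject identifier.

    A toy version of TMDM clause 6: merging can cascade, because
    combining two topics' identifiers may make the result equal to
    yet another topic.
    """
    topics = [dict(t, identifiers=set(t["identifiers"])) for t in topics]
    changed = True
    while changed:
        changed = False
        for i in range(len(topics)):
            for j in range(i + 1, len(topics)):
                if topics[i]["identifiers"] & topics[j]["identifiers"]:
                    topics[i]["identifiers"] |= topics[j]["identifiers"]
                    del topics[j]
                    changed = True
                    break
            if changed:
                break
    return topics

# Three topics: the first shares an identifier with the second, the
# second with the third, so all three cascade into one.
print(tmdm_style_merge([
    {"identifiers": {"id:a", "id:b"}},
    {"identifiers": {"id:b", "id:c"}},
    {"identifiers": {"id:c", "id:d"}},
]))
```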

November 23, 2011

Black Duck Software Joins GENIVI Alliance

Filed under: Marketing,Open Source,Topic Maps — Patrick Durusau @ 7:44 pm

Black Duck Software Joins GENIVI Alliance

From the post:

Black Duck Software, the leader in open source software knowledge, adoption and governance, today announced it has joined the GENIVI Alliance as an Associate Member. Black Duck will work with the GENIVI Alliance to provide open source compliance strategy, program development and training to Alliance members, which include top automakers and automotive software suppliers.

The GENIVI Alliance is an automotive and consumer electronics industry association driving the development and adoption of an open in-vehicle infotainment (IVI) reference platform. Among the Alliance’s goals are the delivery of a reusable, open source IVI platform consisting of Linux-based core services, middleware and open application layer interfaces; development and support of an open source community of IVI developers; and training and support programs to help software developers create compliant IVI applications.

I would think that infotainment for vehicles would need topic maps as much as any other information stream.

Not to mention that getting on the inside track with someone like Black Duck could not hurt topic maps. 😉

More from the post:

About Black Duck Software

Black Duck Software is the leading provider of strategy, products and services for automating the management, governance and secure use of open source software, at enterprise scale, in a multi-source development process. Black Duck enables companies to shorten time-to-solution and reduce development costs while mitigating the management, compliance and security challenges associated with open source software. Black Duck Software powers Koders.com, the industry’s leading code search engine for open source, and Ohloh.net, the largest free public directory of open source software and a vibrant web community of free and open source software developers and users. Black Duck is among the 500 largest software companies in the world, according to Softwaremag.com. For more information, visit www.blackducksoftware.com.

About GENIVI Alliance

GENIVI Alliance is a non-profit industry association whose mission is to drive the broad adoption of an In-Vehicle Infotainment (IVI) open source development platform. GENIVI will accomplish this by aligning requirements, delivering reference implementations, offering certification programs and fostering a vibrant open source IVI community. GENIVI’s work will result in shortened development cycles, quicker time-to-market, and reduced costs for companies developing IVI equipment and software. GENIVI is headquartered in San Ramon, Calif. www.genivi.org.

Do bear in mind that koders.com searches about 3.3+ billion lines of open source code. I am sure you can think of ways topic maps could improve that search experience.

November 17, 2011

Mindbreeze Picks Up Where SharePoint Leaves Off

Filed under: Marketing,Topic Maps — Patrick Durusau @ 8:39 pm

Mindbreeze Picks Up Where SharePoint Leaves Off

From the post:

SharePoint 2010 is a widely implemented application, but not one that solves every solution. The issue is explored further in “SharePoint 2010 collaboration ISVs focus on workflow, analytics.” The author, Jonathan Gourlay, reports that users are increasingly relying on a number of independent software vendors to plug the holes in the service that SharePoint provides.

Mark Gilbert, lead analyst for Gartner Research had this to say:

“’Just because SharePoint is a lot of stuff, it doesn’t mean it’s all good stuff, but a lot of it is,’ said Gilbert, who estimates he’s spoken to 3,000 companies about SharePoint. He compares the platform to a Swiss Army Knife that allows the user to add tools. ‘To make [SharePoint] a real enterprise-class tool, you typically have to pay a lot of attention to the care and feeding of it and you have to add a lot of third-party tools.’”

Here’s the main question: if SharePoint is being advertised as enterprise-class, why do so many users need ISVs to bring it up to that level? The article goes on to argue that the opportunity for vendors to build upon the SharePoint platform is huge.

We argue that one smart and agile solution could single-handedly solve an organization’s enterprise and SharePoint woes. Fabasoft Mindbreeze is getting good feedback regarding its suite of solutions.

I must admit I will sleep easier tonight knowing that:

SharePoint 2010 is a widely implemented application, but not one that solves every solution.

As long as SharePoint 2010 tries to solve problems, we may stand a chance. 😉

Seriously, I don’t think you have to go very far to find enterprise level solutions by people who work in the .Net world. If it were me, I would ring up Networked Planet, whose website isn’t being rebuilt so no apologies are necessary. (Disclosure: I don’t work for Networked Planet but I do know both of its founders.)

This is another example of where the practice of topic maps can solve real world problems. If you have ever used any version of SharePoint, then you know what it means to have problems in need of solutions. Fortunately for you, you don’t have to learn topic maps or even hear the term to enjoy a solution to the problems SharePoint poses.

Apps for Science Challenge

Filed under: Marketing,SciVerse — Patrick Durusau @ 8:38 pm

SciVerse held a challenge recently on apps for science. Two out of the three top place finishers had distinctly topic map like features.

Altmetric – First place: Reads in part:

Once the Altmetric app is installed you’ll notice a new ‘Altmetric’ box appear in the sidebar whenever you search on the SciVerse Hub. It’ll show you the articles in the first few pages of your search results that your peers and the general public have been talking about online; if you prefer you can choose to only see articles from the page of results that you’re currently on. You’ll also see some basic information about how and where articles are being discussed underneath the search results themselves.

Refinder – Second place: Reads in part:

When you found the right papers on SciVerse, bring them together with Refinder. Refinder is an intelligent online collaboration tool for teams. Scientists are using it to collect papers, research notes, and more information in one place. Once collected, important facts about documents can be added as comments. By using links, related things are connected. When reading an article in SciVerse, an intelligent algorithm automatically searches and suggests relevant collections, topics, documents, or experts from Refinder.

Teams love it. Shared collections are provided for each team. They are individually configured by inviting members and setting access rights. Teams use collections to share articles and ideas, discuss topics, ask questions and get answers. Organizations can use Refinder both internally and externally, a useful feature to communicate with partners in projects.

Sounds a lot like a topic map, doesn’t it? Except that they have a fairly loose authoring model, which is probably a good thing in some cases. Don’t know if the relations between things are typed or if they have some notion of identity.

iHelp – Third place: Reads in part:

iHelp enables researchers to do search in their native languages. Articles with different languages are retrieved using this multi-lingual search. Option is provided for phonetic typing of search text. User can either do native search (the typed language) or translate and search in English.

I assume this is talking about full text search, but at least it attempts that across languages. Suspect it has all the issues of full text search plus the perils of mechanized translation. Still, if the alternative is no searching at all, this must seem pretty good.

All of these applications represent some subset of what topic maps are about, ranging from subjects being described in different languages to being able to easily collaborate with others or discover other characteristics of a work, such as its popularity.

Offering some modest improvement over current interfaces, improvements that fall far short of the capabilities of topic maps, seems to attract a fair amount of interest. Something for naysayers about improved information applications to keep in mind.

November 16, 2011

Data Integration Remains a Major IT Headache

Filed under: Data Integration,Marketing — Patrick Durusau @ 2:13 pm

Data Integration Remains a Major IT Headache

From the webpage:

Click through for results from a survey on data integration, conducted by BeyeNetwork on behalf of Syncsort.

…. (with regard to data integration tools)

In particular, the survey makes it clear that not only is data integration still costly, a lot of manual coding is required. The end result is that the fundamentals of data integration are still a big enough issue in most IT organizations to thwart the achievement of strategic business goals.

Complete with bar and pie charts! 😉

If data integration is a problem in the insular data enclaves of today, do you think data integration will get easier when foreign big data comes on the scene?

That’s what I think too.

I will ask BeyeNetwork if they asked this question:

How much manually coded data was the subject of manual coding before?

Or perhaps better:

Where did coders get the information for repeated manual coding of the data? (with follow up questions based on the responses to refine that further)

Reasoning that how we maintain information about data (read metadata) can have an influence on the cost of manual coding, i.e., discovery of what the data means (or is thought to mean).

It isn’t possible to escape manual coding, at least if we want useful data integration. We can, however, explore how to make manual coding less burdensome.

I say we can’t escape manual coding because, unless by happenstance two data sets share the same semantics, I am not really sure how they would be integrated sight unseen with any expectation of a meaningful result.

Or to put it differently, meaningful data integration efforts, like lunches, are not free.
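One way to make the manual coding less burdensome is to capture it once as an explicit, reusable mapping rather than burying it in one-off scripts. A sketch, with entirely hypothetical column names:

```python
# Manual coding captured once as data, not buried in a one-off script.
# All column names here are hypothetical.
MAPPING = {
    "cust_no":   "customer_id",   # system A's name for the subject
    "CUSTID":    "customer_id",   # system B's name for the same subject
    "tel":       "phone",
    "PHONE_NBR": "phone",
}

def normalize(record, mapping=MAPPING):
    """Rename a record's fields into the shared vocabulary."""
    return {mapping.get(k, k): v for k, v in record.items()}

a = {"cust_no": 7, "tel": "555-0100"}
b = {"CUSTID": 7, "PHONE_NBR": "555-0199"}
print(normalize(a))
print(normalize(b))
```

The judgment that “cust_no” and “CUSTID” name the same subject is still manual, but recorded this way it is auditable and reusable, which is the point above about maintaining metadata.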

PS: And you thought I was going to say topic maps were the answer to data integration headaches. 😉 Maybe, maybe not; depends on your requirements.

You should never buy technology or software because of its name, everyone else is using it, your boss saw it during a Super Bowl half-time show, or similar reasons. I am as confident that topic maps will prove to be the viable solution in some cases as I am that other solutions are more appropriate in others. Topic mappers should not be afraid to say so.

November 13, 2011

Developing a predictive analytics program doable on a limited budget

Filed under: Marketing,Prediction — Patrick Durusau @ 10:00 pm

Developing a predictive analytics program doable on a limited budget

From the post:

Predictive analytics is experiencing what David Menninger, a research director and vice president at Ventana Research Inc., calls “a renewed interest.” And he’s not the only one who is seeing a surge in the number of organizations looking to set up a predictive analytics program.

In September, Hurwitz & Associates, a consulting and market research firm in Needham, Mass., released a report ranking 12 predictive analytics vendors that it views as “strong contenders” in the market. Fern Halper, a Hurwitz partner and the principal researcher for the report, thinks predictive analytics is moving into the user mainstream. She said its growing popularity is being driven by better tools, increased access to high-performance computing resources, reduced storage costs and an economic climate that has businesses hungry for better forecasting.

“Especially in today’s economy, they’re realizing they can’t just look in the rearview mirror and look at what has happened,” said Halper. “They need to look at what can happen and what will happen and become as smart as they can possibly be if they’re going to compete.”

While predictive analytics basks in the limelight, the nuances of developing an effective program are tricky and sometimes can be overwhelming for organizations. But the good news, according to a variety of analysts and consultants, is that finding the right strategy is possible — even on a shoestring budget.

Here are some of their best-practices tips for succeeding on predictive analytics without breaking the bank:

What caught my eye was doable on a limited budget.

Limited budgets aren’t uncommon most of the time and in today’s economy they are downright plentiful. In private and public sectors.

The lessons in this post apply to topic maps. Don’t try to sell converting an entire enterprise or operation to topic maps. Pick some small area of pain or obvious improvement and sell a solution for that part. ROI that they can see this quarter or maybe next. Then build on that experience to propose larger or longer range projects.

November 12, 2011

Misunderstanding Creates Super-Bugs

Filed under: Marketing,Visualization — Patrick Durusau @ 8:35 pm

Misunderstood terminology between doctors and their patients contributes to the evolution of resistant bacteria. That is, bacteria that will resist medical treatment. Further translation: You could die.

Colin Purrington has a great graphic in Venn guide to pills that kill things that explains the difference between antibiotic, antibacterial, antifungal, antiviral, etc. What most doctors mean is antibacterial but they don’t say so.

Knowing what your doctor means is a good thing. Same is true for effective data processing.

November 10, 2011

HowTo.gov

Filed under: Marketing — Patrick Durusau @ 6:43 pm

HowTo.gov

A window into best information practices for U.S. government agencies. Shape your sales pitch to match those practices.

The General Services Administration (GSA) (well actually, GSA’s Office of Citizen Services & Innovative Technologies) sponsors the site.

I haven’t encountered anything earth shaking or new but it is an attractive site that is designed to assist agency staff with questions about how to better deliver information from or about their agencies.

Definitely a site I would pass along to state and local as well as federal agencies. They will benefit from the information it contains and it will give you a jumping off point for discussion of how you can assist with their information needs.

November 9, 2011

B2B Blog Strategy | Ten Be’s of The Best B2B Blogs

Filed under: Business Intelligence,Marketing,Topic Maps — Patrick Durusau @ 7:43 pm

B2B Blog Strategy | Ten Be’s of The Best B2B Blogs

Joel York writes:

Blogging is one of the easiest, cheapest and most effective ways to engage the New Breed of B2B Buyer, yet so many B2B blogs miss the mark. Here are ten “be’s” of the best b2b blogs. It isn’t the first top ten list of best B2B blog secrets, and no doubt it will not be the last. But, it is mine and it’s what I personally strive for Chaotic Flow to be.

Joel’s advice will work for topic map blogs as well.

People are not going to find out about topic maps unless we continue to push information about topic maps out into the infosphere. Blogging is one aspect of pushing information. Tweeting is another. Publication of white papers, software and other materials is another.

The need for auditable, repeatable, reliable consolidation (if you don’t like the merging word) of information from different sources is only growing with the availability of more data on the Internet. I think topic maps have a role to play there. Do you?

November 7, 2011

Challenge.gov

Filed under: Contest,Marketing — Patrick Durusau @ 7:27 pm

Challenge.gov

From the FAQ:

About challenges

What is a challenge?

A government challenge or contest is exactly what the name suggests: it is a challenge by the government to a third party or parties to identify a solution to a particular problem or reward contestants for accomplishing a particular goal. Prizes (monetary or non–monetary) often accompany challenges and contests.

Challenges can range from fairly simple (idea suggestions, creation of logos, videos, digital games and mobile applications) to proofs of concept, designs, or finished products that solve the grand challenges of the 21st century. Find current federal challenges on Challenge.gov.

About Challenge.gov

Why would the government run a challenge?

Federal agencies can use challenges and prizes to find innovative or cost–effective submissions or improvements to ideas, products and processes. Government can identify the goal without first choosing the approach or team most likely to succeed, and pay only for performance if a winning submission is submitted. Challenges and prizes can tap into innovations from unexpected people and places.

Hard to think of better PR for topic maps than being the solution to one or more of these challenges.

If you know of challenges in other countries or by other organizations, please post or email pointers to them.

November 6, 2011

Clive Thompson on Why Kids Can’t Search (Wired)

Filed under: Interface Research/Design,Marketing,Searching — Patrick Durusau @ 5:44 pm

Clive Thompson on Why Kids Can’t Search (Wired)

From the post:

We’re often told that young people tend to be the most tech-savvy among us. But just how savvy are they? A group of researchers led by College of Charleston business professor Bing Pan tried to find out. Specifically, Pan wanted to know how skillful young folks are at online search. His team gathered a group of college students and asked them to look up the answers to a handful of questions. Perhaps not surprisingly, the students generally relied on the web pages at the top of Google’s results list.

But Pan pulled a trick: He changed the order of the results for some students. More often than not, those kids went for the bait and also used the (falsely) top-ranked pages. Pan grimly concluded that students aren’t assessing information sources on their own merit—they’re putting too much trust in the machine.

I agree with the conclusion but would add it illustrates a market for topic maps.

A market to deliver critically assessed information as opposed to teaching people to critically assess information. Critical assessment of information, like tensor calculus, can be taught, but how many people are capable of learning/applying it?

Take a practical example (US centric) of the evening news. Every night, for an hour, the news is mostly about murders, fatal accidents, crimes of other sorts, etc. So much so that personal security is a concern for most Americans and they want leaders who are tough on crime, terrorism, etc. Err, but crime rates, including violent crime, have been falling for the last decade. They are approaching all-time lows.

As far as terrorism, well, that is just a bogeyman for security and military budgets. Yes, 9/11, but 9/11 isn’t anything like having monthly or weekly suicide bombers, is it? Americans are in more danger from the annual flu, medical malpractice, drunk drivers, heart disease, and a host of other causes than from terrorism. The insurance companies admit to 100,000 deaths a year from medical “misadventure.” How many Americans died from terrorist attacks last year in the U.S.? That would be “naught” or “0,” as I was taught.

I suppose my last point, about terrorism, brings up another point about “critical assessment” of information for topic maps. It depends on what your client thinks is “critical assessment.” If you are doing a topic map for the terror defense industry, I would suggest skipping my comparisons with medical malpractice. Legislative auditors, on the other hand, might appreciate a map of expenditures and results, which, for the US Department of Homeland Security, would have a naught in the results column. Local police and the FBI, traditional law enforcement agencies, have been responsible for the few terrorist arrests since 9/11.

I read somewhere a long time ago that advertisers think of us as: “Insecure, sex-starved neurotics with attention spans of about 15 seconds.” I am not sure how to codify that as rules for content and interface design but it is a starting point.

I say all that to illustrate that critical assessment of information isn’t a strong point for the general population (or some of its leaders for that matter). Not just kids but their parents, grandparents, etc.

We may as well ask: Why people can’t critically assess information?

I don’t know the answer to that but I think the evidence indicates it is a rare talent, statistically speaking. And probably varies by domain. People who are capable of critical assessment in one domain may not be capable of it in another.

So, if it is a rare talent, statistically speaking, like hitting home runs, let’s market the ability to critically assess information.

November 5, 2011

What Market Researchers could learn from eavesdropping on R2D2

Filed under: Machine Learning,Marketing — Patrick Durusau @ 6:41 pm

What Market Researchers could learn from eavesdropping on R2D2

From the post:

Scott asks: in the context of research and insight, why should we care about what the Machine Learning community is doing?

For those not familiar with Machine Learning, it is a scientific discipline related to artificial intelligence. But it is more concerned with the science of teaching machines to solve useful problems as opposed to trying to get machines to replicate human behavior. If you were to put it in Star Wars terms, a Machine Learning expert would be more focused on building the short, bleeping useful R2D2 than the shiny, linguistically gifted but clumsy C3P0—a machine that is useful and efficient as opposed to a machine that replicates behaviors and mannerisms of humans.

There are many techniques and approaches that marketing insights consultants could borrow from the Machine Learning community. The community is made up of a larger group of researchers and scientists as well as those concerned with market research, and their focus is improving algorithms that can be applied across a wide variety of scientific, technology, business, and engineering problems. And so it is a wonderful source of inspiration for approaches that can be adapted to our own industry.

Since the topic map community isn’t large enough to be an object of study (yet), I thought this piece on how marketers view the machine learning community might be instructive.

Successful topic mappers will straddle semantic communities and, to do that, they need to be adept at what I would call “semantic cross-overs.”

Semantic cross-overs are those people and written pieces that give you a view that overarches two or more communities. Almost always written more from one point of view than another, but enough of both to give you ideas that may spark in both camps.

Remember, crossing over between two communities isn’t your view of the cross-over, but that of members of the respective communities. In other words, your topic map between them may seem very clever to you, but unless it is clever to members of those communities, we call it: No Sale!

Spy vs. Spy

Filed under: Marketing,Topic Maps — Patrick Durusau @ 6:40 pm

I mentioned in Google+ Ripples: Revealing How Posts are Shared over Time that topic maps could be used to find the leakers of information about the killing of Osama bin Laden.

I did not mean to leave the impression that topic maps can only be used to find leakers. Topic maps can be used to find people with access to information not commonly available. Or to find people inclined to share such information. Or the reasons they might share it. Or those around them who might share it.

Leaked information, to be valuable, often must be matched with other information, from other sources. All of which is as much human judgement as the development of sources of information. Nary a drop of logic in any of it.

And the hunting of leakers isn’t a matter of deduction or formal logic either. I really don’t buy the analysis that, in the Wikileaks case, Peirce would have said: “Quick! Look for someone with a Lady GaGa CD!” (I will run down a reference to Peirce’s retelling of his racist account of tracking down stolen articles. It involves a great deal of luck and racism, not so much formal logic.)

How you choose to use topic maps, as a technology, is entirely up to you.

November 4, 2011

Big Data : Case Studies, Best Practices and Why America should care

Filed under: BigData,Jobs,Marketing — Patrick Durusau @ 6:10 pm

Big Data : Case Studies, Best Practices and Why America should care by Themos Kalafatis.

From the post:

We know that Knowledge is Power. Due to Data Explosion more Data Scientists will be needed and being a Data Scientist becomes increasingly a “cool” profession. Needless to say that America should be preparing for the increased need for Predictive Analytics professionals in Research and Businesses.

Being able to collect, analyze and extract knowledge from a huge amount of Data is not only about Businesses being able to make the right decisions but also critical for a Country as a whole. The more efficient and fast this cycle is, the better for the Country that puts Analytics to work.

This Blog post is actually about the words and phrases being used for this post: All words and phrases on the title of the post (and the introductory text) were carefully selected to produce specific thoughts which can be broken down in three parts:

  • Being a Data Scientist has high value.
  • “Case Studies” and “Best Practices” communicate to readers successful applications and knowledge worthwhile reading.
  • “America should”. This phrase obviously creates specific emotions and feelings to Americans.

Being a “cool” profession or even a member of a “cool” profession doesn’t guarantee good results. Whatever tools you are using, good analytical skills have to lie behind their use. I think topic maps have a role to play in managing “big data” and being a tool that is reached for early and often.

October 28, 2011

Teradata Provides the Simplest Way to Bring the Science of Data to the Art of Business

Filed under: Hadoop,MapReduce,Marketing — Patrick Durusau @ 3:13 pm

Teradata Provides the Simplest Way to Bring the Science of Data to the Art of Business

From the post:

SAN CARLOS, California Teradata (NYSE: TDC), the analytic data solutions company, today announced the new Teradata Aster MapReduce Platform that will speed adoption of big data analytics. Big data analytics can be a valuable tool for increasing corporate profitability by unlocking information that can be used for everything from optimizing digital marketing or detecting fraud to measurement and reporting machine operations in remote locations. However, until now, the cost of mining large volumes of multi-structured data and a widespread scarcity of staff with the required specialized analytical skills have largely prevented adoption of big data analytics.

The new Teradata Aster MapReduce Platform marries MapReduce, the language of big data analytics, with Structured Query Language (SQL), the language of business analytics. It includes Aster Database 5.0, a new Aster MapReduce Appliance—which extends the Aster software deployment options beyond software-only and Cloud—and the Teradata-Aster Adaptor for high-speed data transfer between Teradata and Aster Data systems.

I leave the evaluation of these products to one side for now to draw your attention to:

Teradata Aster makes it easy for any business person to see, explore, and understand multi-structured data. No longer is big data analysis just in the hands of the few data scientists or MapReduce specialists in an organization. (emphasis added)

I am not arguing that is true or even a useful idea, but consider the impact it is going to have on the average business executive. A good marketing move, if not very good for the customers who buy into it. Perhaps there is a kernel of truth we can tap into for marketing topic maps.

October 27, 2011

Department of Numbers

Filed under: Data Source,Marketing — Patrick Durusau @ 4:45 pm

Department of Numbers

From the webpage:

The Department of Numbers contextualizes public data so that individuals can form independent opinions on everyday social and economic matters.

Possible source for both data and analysis that is of public interest. Thinking it will be easier to attract attention to topic maps that address current issues.

October 15, 2011

Code For America

Filed under: eGov,Government Data,Marketing — Patrick Durusau @ 4:27 pm

Code For America

I hesitated over this post. But, being willing to promote topic maps for governments, near-governments, governments in the wings, wannabe governments and groups of various kinds opposed by governments, I should not stick at nationalistic or idealistic groups in the United States.

Projects that will benefit from topic maps in government circles work as well in Boston as in Mogadishu and Kandahar.

With some adaptation for local goals and priorities, of course, but the underlying technical principles remain the same.

On 9/11, the siloed emergency responders could not effectively communicate with each other. Care to guess who can’t effectively communicate with each other in most major metropolitan areas? Just one example of the siloed nature of state, local and city government. (To use U.S.-centric terminology. Supply your own local terminology.)

Keep an eye out for the software that is open sourced as a result of this project. Maybe adaptable to your local circumstances or silo. Or you may need a topic map.

October 14, 2011

Sexier, smarter, faster Information Architecture with Topic Maps

Filed under: Marketing,Topic Maps — Patrick Durusau @ 6:22 pm

Sexier, smarter, faster Information Architecture with Topic Maps

A bit dated now (4 years old) by Alexander Johannesen, but you can’t argue with the title. 😉

I wonder if that is like “faster, better, cheaper,” where you can have any one of those?

So you have to pick for your topic map sexier, smarter or faster?

Or has Alexander found a way to get all three?

You can find out what makes Alexander excited (not that I was curious on that score), as well as the basic concepts of topic maps.

Ignore the comment about slides 107-120, all the slides display.

October 13, 2011

Predicting What People Want

Filed under: Interface Research/Design,Marketing — Patrick Durusau @ 6:56 pm

If you haven’t seen Steve Yegge’s rant about Google, which was meant to be internal to Google, you can read about it (with the full text) at: Google Engineer Accidentally Posts Rant About Google+.

Todd Wasserman reports:

Yegge went on to write, “Our Google+ team took a look at the aftermarket and said: ‘Gosh, it looks like we need some games. Let’s go contract someone to, um, write some games for us.’ Do you begin to see how incredibly wrong that thinking is now? The problem is that we are trying to predict what people want and deliver it for them.” (emphasis added)

That’s doable, the history of marketing in the 20th century has made that very clear. See Selling Blue Elephants.

What doesn’t work is for very talented programmers, academics, do-gooders, etc., to sit around in conference rooms to plan what other people ought to want.

What, or should I say who, is missing from the conference room?

Oh, yeah, the people we want to use or even pay for the service. Oops!

You may have guessed/predicted where this is going: The same is true for interfaces, computer or otherwise.

Successful interfaces happen by:

  1. Dumb luck
  2. Management/developers decide on presentation/features
  3. Testing with users and management/developers decide on presentation/features
  4. Testing with users and user feedback determines presentation/features

Care to guess which one I suspect Google used? If you picked door #3, you would be correct! (Sorry, no prize.)

True enough, management/developers are also users, so they won’t be wrong every time.

Question: Would you go see a doctor who wasn’t wrong every time?

I never thought I would recommend that anyone read marketing/advertising texts but I guess there is a time and place for everything. I would rather see you doing that than see more interfaces that hide your hard and valuable work from users.

OK, this is a bit over long, let me summarize the rule for developing both programs (in terms of capabilities) and interfaces (in terms of features):

Don’t predict what people want! Go ask them!

October 8, 2011

…Harnessing Big Data

Filed under: BigData,Marketing — Patrick Durusau @ 8:12 pm

How Government Could Boost Its Performance by Harnessing Big Data by Robert Atkinson, President, Information Technology and Innovation Foundation.

From the post:

  1. Electric power utilities can use data analytics and smart meters to better manage resources and avoid blackouts,
  2. Food inspectors can use data to better track meat and produce safety from farm to fork ,
  3. Public health officials can use health data to detect infectious disease outbreaks,
  4. Regulators can track pharmaceutical and medical device safety and effectiveness through better data analytics,
  5. Police departments can use data analytics to target crime hotspots and prevent crime waves,
  6. Public utilities can use sensors to collect data on water and sewer usage to detect leaks and reduce water consumption,
  7. First responders can use sensors, GPS, cameras and better communication systems to let police and fire fighters better protect citizens when responding to emergencies, and
  8. State departments of transportation can use data to reduce traffic, more efficiently deploy resources, and implement congestion pricing systems

Numbering added for ease of reference.

By the numbers:

  1. Electric power utilities…[investment in smart meters required and blackouts are usually the result of system failure, monitoring demand isn’t going to help].
  2. Food inspectors… [without adequate food inspectors to enforce standards, tracking potentially unhealthy food isn’t all that interesting a problem],
  3. Public health officials… [already use data to detect disease outbreaks, how did you think it happened?],
  4. Regulators can track… [to do what?, medical devices are already tracked],
  5. Police departments… [police officers don’t know the usual crime spots? need to get different police officers],
  6. Public utilities… [only if they have the sensors and the ability to effect repairs],
  7. First responders… [being able to talk to each other would have a higher priority, most still don’t, ten years after 9/11], and
  8. State departments of transportation… [counting cars will reduce their numbers?, I have to tell our local transportation department].

“Big data” is the flavor of the month but it doesn’t improve your credibility to invoke “big data” when there is none to be seen.

Let’s not make the same mistake with semantic identity topics.

October 6, 2011

Who Is Using SharePoint? The Fortune 500 That Is Who

Filed under: Marketing,SharePoint — Patrick Durusau @ 5:31 pm

Who Is Using SharePoint? The Fortune 500 That Is Who

From the post (Beyond Search):

Oh boy! Our content wranglers found another great list and we are excited about it. Once more TopSharePoint.com pools its sources to gather twenty-five “Fortune 500 Companies Using SharePoint.”

The title of the post is slightly misleading. From the TopSharePoint.com article:

Below you will find a list of few Fortune 500 companies using SharePoint technology for their public-facing websites. This review is trying to highlight the adoption of SharePoint in the corporate world as well as the customization level these companies accomplished.

So this list is only some of the Fortune 500 companies who are using SharePoint for their public-facing websites. That sounds like a much smaller number than the Beyond Search post would imply. Granted, I think search can be improved by any number of technologies, including topic maps, but let’s be correct in how we represent other posts. A list of Fortune 500 companies that use SharePoint would be quite a bit longer than the one listed at TopSharePoint.com.
