Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

November 17, 2016

Operating Systems Design and Implementation (12th USENIX Symposium)

Filed under: Computer Science,CS Lectures,Cybersecurity,Security — Patrick Durusau @ 9:59 pm

Operating Systems Design and Implementation (12th USENIX Symposium) – Savannah, GA, USA, November 2-4, 2016.

Message from the OSDI ’16 Program Co-Chairs:

We are delighted to welcome you to the 12th USENIX Symposium on Operating Systems Design and Implementation, held in Savannah, GA, USA! This year’s program includes a record high 47 papers that represent the strength of our community and cover a wide range of topics, including security, cloud computing, transaction support, storage, networking, formal verification of systems, graph processing, system support for machine learning, programming languages, troubleshooting, and operating systems design and implementation.

Weighing in at seven hundred and ninety-seven (797) pages, this tome will prove more than sufficient to avoid annual family arguments during the holiday season.

Not to mention this is an opportunity to hone your skills to a fine edge.

May 4, 2015

Notes on Theory of Distributed Systems

Filed under: CS Lectures,Distributed Computing — Patrick Durusau @ 8:06 pm

Notes on Theory of Distributed Systems by James Aspnes.

From the preface:

These are notes for the Spring 2014 semester version of the Yale course CPSC 465/565 Theory of Distributed Systems. This document also incorporates the lecture schedule and assignments, as well as some sample assignments from previous semesters. Because this is a work in progress, it will be updated frequently over the course of the semester.

Notes from Fall 2011 can be found at http://www.cs.yale.edu/homes/aspnes/classes/469/notes-2011.pdf.

Notes from earlier semesters can be found at http://pine.cs.yale.edu/pinewiki/465/.

Much of the structure of the course follows the textbook, Attiya and Welch’s Distributed Computing [AW04], with some topics based on Lynch’s Distributed Algorithms [Lyn96] and additional readings from the research literature. In most cases you’ll find these materials contain much more detail than what is presented here, so it is better to consider this document a supplement to them than to treat it as your primary source of information.

When something exceeds three hundred (> 300) pages, I have trouble calling it “notes.” 😉

A treasure trove of information on distributed computing.

I first saw this in a tweet by Henry Robinson.

March 31, 2015

Machine Learning – Ng – Self-Paced

Filed under: CS Lectures,Machine Learning — Patrick Durusau @ 2:03 pm

Machine Learning – Ng – Self-Paced

If you need a self-paced machine learning course, consider your wish granted!

From the description:

Machine learning is the science of getting computers to act without being explicitly programmed. In the past decade, machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. Machine learning is so pervasive today that you probably use it dozens of times a day without knowing it. Many researchers also think it is the best way to make progress towards human-level AI. In this class, you will learn about the most effective machine learning techniques, and gain practice implementing them and getting them to work for yourself. More importantly, you’ll learn about not only the theoretical underpinnings of learning, but also gain the practical know-how needed to quickly and powerfully apply these techniques to new problems. Finally, you’ll learn about some of Silicon Valley’s best practices in innovation as it pertains to machine learning and AI. This course provides a broad introduction to machine learning, datamining, and statistical pattern recognition. Topics include: (i) Supervised learning (parametric/non-parametric algorithms, support vector machines, kernels, neural networks). (ii) Unsupervised learning (clustering, dimensionality reduction, recommender systems, deep learning). (iii) Best practices in machine learning (bias/variance theory; innovation process in machine learning and AI). The course will also draw from numerous case studies and applications, so that you’ll also learn how to apply learning algorithms to building smart robots (perception, control), text understanding (web search, anti-spam), computer vision, medical informatics, audio, database mining, and other areas.

Great if your schedule/commitments vary from week to week: take the classes at your own pace!

Same great content that has made this course such a winner for Coursera.
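If you want a quick taste of topic (i), supervised learning, before signing up, here is a minimal sketch using Python and scikit-learn (my illustration, not course material):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    # A minimal supervised learning example: fit a support vector machine
    # (one of the listed topics) on the classic iris data set.
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X_train, y_train)

    print("test accuracy:", clf.score(X_test, y_test))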

I first saw this in a tweet by Tryolabs.

March 29, 2015

The Theory of Relational Databases

Filed under: CS Lectures,Database — Patrick Durusau @ 5:02 pm

The Theory of Relational Databases by David Maier.

From the webpage:

This text has been long out of print, but I still get requests for it. The copyright has reverted to me, and you have permission to reproduce it for personal or academic use, but not for-profit purposes. Please include “Copyright 1983 David Maier, used with permission” on anything you distribute.

Out of date (1983) if you are looking for the latest work, but not if you are interested in where we have been. Sometimes the latter is more important than the former.

Enjoy!

March 23, 2015

From Nand to Tetris / Part I [“Not for everybody.”]

Filed under: CS Lectures,Cybersecurity,Education — Patrick Durusau @ 1:06 pm

From Nand to Tetris / Part I April 11 – June 7 2015

From the webpage:

Build a modern computer system, starting from first principles. The course consists of six weekly hands-on projects that take you from constructing elementary logic gates all the way to building a fully functioning general purpose computer. In the process, you will learn — in the most direct and intimate way — how computers work, and how they are designed.

This course is a fascinating 7-week voyage of discovery in which you will go all the way from Boolean algebra and elementary logic gates to building a central processing unit, a memory system, and a hardware platform, leading up to a general-purpose computer that can run any program that you fancy. In the process of building this computer you will become familiar with many important hardware abstractions, and you will implement them, hands on. But most of all, you will enjoy the tremendous thrill of building a complex and useful system from the ground up.

You will build all the hardware modules on your home computer, using a Hardware Description Language (HDL), learned in the course, and a hardware simulator, supplied by us. A hardware simulator is a software system that enables building and simulating gates and chips before actually committing them to silicon. This is exactly what hardware engineers do in practice: they build and test computers in simulation, using HDL and hardware simulators.
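If the idea of building everything from a single gate sounds abstract, here is a minimal sketch in Python (not the course’s HDL, which you learn as you go) of richer gates composed from NAND alone:

    # Everything below is built from NAND alone, in the spirit of the course.
    def nand(a, b):
        return 0 if (a and b) else 1

    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return not_(nand(a, b))

    def or_(a, b):
        return nand(not_(a), not_(b))

    def xor(a, b):
        return and_(or_(a, b), nand(a, b))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", xor(a, b))

The course has you do the same thing, only in HDL and on a hardware simulator, all the way up to a CPU.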

Do you trust locks?


Do you know how locks work?

I don’t and yet I trust locks to work. But then a lock requires physical presence to be opened and locks do have a history of defeating attempts to unlock them without the key. Not always but a high percentage of the time.

Do you trust computers?


Do you know how computers work?

I don’t, not really. Not at the level of silicon.

So why would I trust computers? We know computers are as faithful as a napkin at a party and have no history of being secure, for anyone.

Necessity seems like a weak answer doesn’t it? Trusting computers to be insecure seems like a better answer.

Not that everyone wants or needs to delve into computers at the level of silicon but exposure to the topic doesn’t hurt.

Might even help when you hear of hardware hacks like rowhammer. You don’t really think that is the last of the hardware hacks do you? Seriously?

BTW, I first read about this course in the Clojure Gazette, which is a great read, whether you are a Clojure programmer or not. Take a look and consider subscribing. Another reason to subscribe is that it lists a snail mail address in New Orleans, Louisiana.

Even the fast food places have good food in New Orleans. The non-fast food has to be experienced. Words are not enough. It would be like trying to describe sex to someone who has only read about it. Just not the same. Every conference should be in New Orleans every two or three years.

After you get through day-dreaming about New Orleans, go ahead and register for From Nand to Tetris / Part I, April 11 – June 7, 2015.

February 26, 2015

Structure and Interpretation of Computer Programs (LFE Edition)

Filed under: Computer Science,CS Lectures,Erlang,LFE Lisp Flavored Erlang — Patrick Durusau @ 7:40 pm

Structure and Interpretation of Computer Programs (LFE Edition)

From the webpage:

This Gitbook (available here) is a work in progress, converting the MIT classic Structure and Interpretation of Computer Programs to Lisp Flavored Erlang. We are forever indebted to Harold Abelson, Gerald Jay Sussman, and Julie Sussman for their labor of love and intelligence. Needless to say, our gratitude also extends to the MIT press for their generosity in licensing this work as Creative Commons.

Contributing

This is a huge project, and we can use your help! Got an idea? Found a bug? Let us know!

Writing, or re-writing if you are transposing a CS classic into another language, is far harder than most people imagine. Probably even more difficult than the original because your range of creativity is bound by the organization and themes of the underlying text.

I may have some cycles to donate to proofreading. Anyone else?

December 6, 2014

The Caltech-JPL Summer School on Big Data Analytics

Filed under: BigData,CS Lectures — Patrick Durusau @ 8:04 am

The Caltech-JPL Summer School on Big Data Analytics

From the webpage:

This is not a class as it is commonly understood; it is the set of materials from a summer school offered by Caltech and JPL, in the sense used by most scientists: an intensive period of learning of some advanced topics, not on an introductory level.

The school will cover a variety of topics, with a focus on practical computing applications in research: the skills needed for a computational (“big data”) science, not computer science. The specific focus will be on applications in astrophysics, earth science (e.g., climate science) and other areas of space science, but with an emphasis on the general tools, methods, and skills that would apply across other domains as well. It is aimed at an audience of practicing researchers who already have a strong background in computation and data analysis. The lecturers include computational science and technology experts from Caltech and JPL.

Students can evaluate their own progress, but there will be no tests, exams, and no formal credit or certificates will be offered.

Syllabus:

  1. Introduction to the school. Software architectures. Introduction to Machine Learning.
  2. Best programming practices. Information retrieval.
  3. Introduction to R. Markov Chain Monte Carlo.
  4. Statistical resampling and inference.
  5. Databases.
  6. Data visualization.
  7. Clustering and classification.
  8. Decision trees and random forests.
  9. Dimensionality reduction. Closing remarks.

If this sounds challenging, imagine doing it in nine (9) days!

The real advantage of intensive courses is that you are not trying to juggle work/study/eldercare and other duties while taking the course. That alone may account for much of their benefit: the opportunity to focus on one task and that task alone.
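To give a flavor of topic 3, Markov Chain Monte Carlo, here is a minimal Metropolis sampler in Python; the standard normal target is my own toy choice, not taken from the school’s materials:

    import math
    import random

    # Metropolis sampling from a standard normal, using only its unnormalized
    # log-density. The chain's draws approximate samples from the target.
    def log_target(x):
        return -0.5 * x * x

    def metropolis(n_samples, step=1.0, x0=0.0):
        samples, x = [], x0
        for _ in range(n_samples):
            proposal = x + random.uniform(-step, step)
            # Accept with probability min(1, p(proposal) / p(x)).
            accept_prob = math.exp(min(0.0, log_target(proposal) - log_target(x)))
            if random.random() < accept_prob:
                x = proposal
            samples.append(x)
        return samples

    draws = metropolis(10000)
    print(sum(draws) / len(draws))  # should be near 0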

I first saw this in a tweet by Gregory Piatetsky.

July 18, 2014

Artificial Intelligence | Natural Language Processing

Filed under: Artificial Intelligence,CS Lectures,Natural Language Processing — Patrick Durusau @ 4:26 pm

Artificial Intelligence | Natural Language Processing by Christopher Manning.

From the webpage:

This course is designed to introduce students to the fundamental concepts and ideas in natural language processing (NLP), and to get them up to speed with current research in the area. It develops an in-depth understanding of both the algorithms available for the processing of linguistic information and the underlying computational properties of natural languages. Word-level, syntactic, and semantic processing from both a linguistic and an algorithmic perspective are considered. The focus is on modern quantitative techniques in NLP: using large corpora, statistical models for acquisition, disambiguation, and parsing. Also, it examines and constructs representative systems.

Lectures with notes.

If you are new to natural language processing, it would be hard to point at a better starting point.
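As a taste of the corpus-driven, statistical approach the course emphasizes, here is a minimal word-level sketch in Python; the toy corpus is mine, purely for illustration:

    from collections import Counter

    # Maximum-likelihood bigram probabilities from a tiny corpus.
    corpus = "the cat sat on the mat . the dog sat on the rug .".split()

    unigrams = Counter(corpus)
    bigrams = Counter(zip(corpus, corpus[1:]))

    def bigram_prob(w1, w2):
        # P(w2 | w1) = count(w1 w2) / count(w1)
        return bigrams[(w1, w2)] / unigrams[w1]

    print(bigram_prob("the", "cat"))  # 0.25: "the" occurs 4 times, "the cat" once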

Enjoy!

May 2, 2014

Find Papers We Love

Filed under: Computation,Computer Science,CS Lectures — Patrick Durusau @ 7:29 pm

Find Papers We Love by Zachary Tong.

A search interface to the Github repository maintained by @Papers_we_love.

It’s not “big data” but this search interface is going to make my life better.

You?

PS: Papers We Love

From their homepage:

What was the last paper within the realm of computing you read and loved? What did it inspire you to build or tinker with? Come share the ideas in an awesome academic/research paper with fellow engineers, programmers, and paper-readers. Lead a session and show off code that you wrote that implements these ideas or just give us the lowdown about the paper (because of HARD MATH!). Otherwise, just come, listen, and discuss.

We’re curating a repository for papers and places-to-find papers. You can contribute by adding PR’s for papers, code, and/or links to other repositories.

We’re posting videos of all our presentations, from all our chapters.

This is a productive use of the Internet.

April 26, 2014

From Geek to Clojure!

Filed under: Clojure,CS Lectures,Programming — Patrick Durusau @ 3:03 pm

From Geek to Clojure! by Nada Amin and William Byrd.

From the description:

In his Lambda Jam keynote, “Everything I Have Learned I Have Learned From Someone Else,” David Nolen exposed the joys and benefits of reading academic papers and putting them to work. In this talk, we show how to translate the mathy figures in Computer Science papers into Clojure code using both core.match and core.logic. You’ll gain strategies for understanding concepts in academic papers by implementing them!

Nada Amin is a member of the Scala team at EPFL, where she studies type systems and hacks on programming languages. She has contributed to Clojure’s core.logic and Google’s Closure compiler. She’s loved helping others learn to program ever since tutoring SICP as an undergraduate lab assistant at MIT.

William E. Byrd is a Postdoctoral Researcher in the School of Computing at the University of Utah. He is co-author of The Reasoned Schemer, and co-designer of several declarative languages: miniKanren (logic programing), Harlan (GPU programming), and Kanor (cluster programming). His StarCraft 2 handle is ‘Rojex’ (character code 715).

An alternative title for this talk would be: How To Read An Academic CS Paper. Seriously.

From Geek to Clojure at Github has the slides and “Logical types for untyped languages” (mentioned near the end of the paper).

I don’t think you need a login at the ACM Digital Library to see who cites “Logical types for untyped languages.”

Some other resources of interest:

Logical Types for Untyped Languages by Sam Tobin-Hochstadt (speaker deck)

Logical Types for Untyped Languages by Sam Tobin-Hochstadt and Matthias Felleisen (video)

A series of videos by Nada Amin and William Byrd on reading CS papers, one that makes fewer assumptions about the audience, would really rock!

March 8, 2014

papers-we-love

Filed under: Computer Science,CS Lectures — Patrick Durusau @ 8:20 pm

papers-we-love

From the webpage:

Repository related to the following meetups:

Let us know if you are interested in starting a chapter!

A GitHub repository of CS papers.

If you decide to start a virtual “meetup,” be sure to ping me. Nothing against the F2F meetings, which are absolutely needed, but some of us can’t make F2F meetings.

PS: There is also a list of other places to search for good papers.

March 2, 2014

Data Mining with Weka (2014)

Filed under: CS Lectures,Data Mining,Weka — Patrick Durusau @ 9:17 pm

Data Mining with Weka

From the course description:

Everybody talks about Data Mining and Big Data nowadays. Weka is a powerful, yet easy to use tool for machine learning and data mining. This course introduces you to practical data mining.

The 5-week course starts on 3rd March 2014.

Apologies, somehow I missed the notice on this class.

This will be followed by More Data Mining with Weka in late April of 2014.

Based on my experience with the Weka Machine Learning course, also with Professor Witten, I recommend either one or both of these courses without reservation.

February 14, 2014

How to write a great research paper

Filed under: CS Lectures — Patrick Durusau @ 6:01 pm

How to write a great research paper: Seven simple suggestions by Simon Peyton Jones.

From the description:

Professor Simon Peyton Jones, Microsoft Research, gives a guest lecture on writing. Seven simple suggestions: don’t wait – write, identify your key idea, tell a story, nail your contributions, put related work at the end, put your readers first, listen to your readers.

A truly amazing presentation on writing a good research paper.

BTW, see Simon’s homepage for articles on Haskell, types, functional programming, etc.

February 12, 2014

Specializations On Coursera

Filed under: CS Lectures,Data Science — Patrick Durusau @ 4:31 pm

Specializations On Coursera

Coursera is offering sequences of courses that result in certificates in particular areas.

For example, Johns Hopkins is offering a certificate in Data Science: nine courses at $49.00 each, or $490 for a specialization certificate.

I first saw this in a post by Stephen Turner, Coursera Specializations: Data Science, Systems Biology, Python Programming.

February 6, 2014

Should Everybody Learn to Code?

Filed under: Computer Science,CS Lectures,Programming — Patrick Durusau @ 7:41 pm

Should Everybody Learn to Code? by Esther Shein.

Interesting essay but most of the suggestions read like this one:

Just as students are taught reading, writing, and the fundamentals of math and the sciences, computer science may one day become a standard part of a K–12 school curriculum. If that happens, there will be significant benefits, observers say. As the kinds of problems we will face in the future will continue to increase in complexity, the systems being built to deal with that complexity will require increasingly sophisticated computational thinking skills, such as abstraction, decomposition, and composition, says Wing.

“If I had a magic wand, we would have some programming in every science, mathematics, and arts class, maybe even in English classes, too,” says Guzdial. “I definitely do not want to see computer science on the side … I would have computer science in every high school available to students as one of their required science or mathematics classes.”

But university CS programs for the most part don’t teach people to code. Rather they teach computer science in the abstract.

Moreover, coding practice isn’t necessary to contribute to computer science, as illustrated in “History of GIS and Early Computer Cartography Project,” by John Hessler, Cartographic Specialist, Geography and Map Division, Library of Congress.

As part of a project to collect early GIS materials, the following was discovered in an archive:

One set of papers in particular, which deserves much more attention from today’s mapmakers, historians, and those interested in the foundations of current geographic thought, is the Harvard Papers in Theoretical Geography. These papers, subtitled, “Geography and the properties of surfaces,” detail the lab’s early experiments in the computer analysis of cartographic problems. They also give insight into the theoretical thinking of many early researchers as they experimented with theorems from algebraic topology, complex spatial analysis algorithms, and various forms of abstract algebras to redefine the map as a mathematical tool for geographic analysis. Reading some of the titles in the series today, for example, “Hyper-surfaces and Geodesic Lines in 4-D Euclidean Space and The Sandwich Theorem: A Basic One for Geography,” gives one a sense of the experimentation and imaginative thinking that surrounded the breakthroughs necessary for the development of our modern computer mapping systems.

And the inspiration for this work?

Aside from the technical aspects that archives like this reveal, they also show deeper connections with cultural and intellectual history. They demonstrate how the practitioners and developers of GIS found themselves compelled to draw both distinctions and parallels with ideas that were appearing in the contemporary scholarly literature on spatial and temporal reasoning. Their explorations into this literature was not limited to geographic ideas on lived human space but also drew on philosophy, cognitive science, pure mathematics, and fields like modal logic—all somehow to come to terms with the diverse phenomena that have spatiotemporal extent and that might be mapped and analyzed.

Coding is a measurable activity but being measurable doesn’t mean it is the only way to teach abstract thinking skills.

The early days of computer science, including compiler research, suggest coding isn’t required to learn abstract thinking skills.

Coding is a useful skill but let’s not confuse a skill or even computer science with abstract thinking skills. Abstract thinking is needed in many domains and we will all profit from not defining it too narrowly.

I first saw this in a tweet from Tim O’Reilly, who credits Simon St. Laurent with the discovery.

February 5, 2014

Proof Theory Foundations

Filed under: CS Lectures,Proof Theory — Patrick Durusau @ 1:02 pm

Frank Pfenning’s lectures from the Oregon Programming Languages School 2012, University of Oregon.

Lecture 1

Lecture 2

Lecture 3

Lecture 4

Unlike the astronomer in Rasselas (Chapter 41), in serious CS discussions it is not enough to “know” you are correct. 😉

Taught along with Category Theory Foundations and Type Theory Foundations.

Type Theory Foundations

Filed under: CS Lectures,Types — Patrick Durusau @ 11:54 am

Robert Harper’s lectures from the Oregon Programming Languages School 2012, University of Oregon.

Lecture 1

Lecture 2

Lecture 3

Lecture 4

Lecture 5

Lecture 6

If you are going to follow Walter Bright in writing a new computer language, you will need to study types.

Taught along with Category Theory Foundations and Proof Theory Foundations.

Category Theory Foundations

Filed under: Category Theory,CS Lectures — Patrick Durusau @ 11:46 am

Steve Awodey’s lectures from the Oregon Programming Languages School 2012, University of Oregon.

Homework assignments

Lecture 1

Lecture 2

Lecture 3

Lecture 4

I first saw this in a tweet by Jim Duey.

More cold weather is coming and football season (U.S.) is over. 😉

Taught along with Proof Theory Foundations and Type Theory Foundations.

January 3, 2014

ACM Awards

Filed under: CS Lectures — Patrick Durusau @ 10:32 am

While I was writing about Jeff Huang’s Best Paper Awards in Computer Science (2013), which lists “best paper” awards, I thought about the ACM awards for dissertations and other contributions to computer science.

I mostly follow the ACM Doctoral Dissertation Award but you won’t lack for high grade reading material following any of the other award categories.

Dissertation links take you to the author’s entry in the ACM Digital library, not the dissertation in question.

Another idiosyncratic “feature” of the ACM website. Tech support at the ACM opines that some secret group, one that members cannot contact directly, is responsible for the “features” of the ACM website, such as not being able to download multiple citations at one time.

I wrote to the editor at CACM about the “features” process. If you haven’t seen that letter in CACM, well, I haven’t either.

Letters critical of the opaque web “features” process don’t rate high on the priority list for publication.

If you have any improvements you want to suggest to the ACM site, please do so. I will be interested in hearing if your experience is different from mine.

January 2, 2014

Best Paper Awards in Computer Science (2013)

Filed under: Conferences,CS Lectures — Patrick Durusau @ 6:08 pm

Best Paper Awards in Computer Science (2013)

Jeff Huang’s list of the best paper awards from 29 CS conferences since 1996 up to and including 2013.

I have updated my listing for the conference abbreviations Jeff uses. That added eight (8) new conferences to the list.

December 15, 2013

Data Science

Filed under: CS Lectures,Programming,Python — Patrick Durusau @ 8:49 pm

Data Science

Lectures on data science from the Harvard Extension School.

Twenty-two (22) lectures and ten (10) labs.

The lab sessions are instructor-led coding exercises with good visibility of the terminal window.

Possibly a format to follow in preparing other CS instructional material.

A lecture followed by a typing exercise: entering and understanding the code (especially when typos keep it from working).

I was reminded recently that Hunter Thompson typed novels by Ernest Hemingway and F. Scott Fitzgerald in order to learn their writing styles.

Would the same work for learning programming style? That you would begin to recognize patterns and options?

If nothing else, it gives you some quality time with a debugger. 😉

December 7, 2013

Recommender Systems Course from GroupLens

Filed under: CS Lectures,Recommendation — Patrick Durusau @ 5:26 pm

Recommender Systems Course from GroupLens by Danny Bickson.

From the post:

I got the following course link from my colleague Tim Muss. The GroupLens research group (Univ. of Minnesota) have released a Coursera course about recommender systems. Michael Konstan and Michael Ekstrand are lecturing. Any reader of my blog who has an elephant memory will recall I wrote about the Lenskit project already 2 years ago, where I interviewed Michael Ekstrand.

Would you agree that recommendation involves subject recognition?

At a minimum, recognition of the subject to be recommended and the subject of a particular user’s preference.

I ask because the key to topic map “merging” isn’t ontological correctness but “correctness” in the eyes of a particular user.

What other standard would I use?
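For concreteness, here is a minimal sketch in Python of the user-similarity style of recommendation such a course typically begins with; the users, items, and ratings are invented for illustration:

    import math

    # Toy ratings: user -> {item: rating}.
    ratings = {
        "alice": {"dune": 5, "solaris": 4, "contact": 1},
        "bob":   {"dune": 4, "solaris": 5, "gattaca": 2},
        "carol": {"contact": 5, "gattaca": 4},
    }

    def cosine(u, v):
        # Cosine similarity between two users' rating vectors.
        shared = set(u) & set(v)
        if not shared:
            return 0.0
        dot = sum(u[i] * v[i] for i in shared)
        norm_u = math.sqrt(sum(r * r for r in u.values()))
        norm_v = math.sqrt(sum(r * r for r in v.values()))
        return dot / (norm_u * norm_v)

    def recommend(user, k=1):
        # Score unseen items by similarity-weighted ratings from other users.
        scores = {}
        for other, their in ratings.items():
            if other == user:
                continue
            sim = cosine(ratings[user], their)
            for item, r in their.items():
                if item not in ratings[user]:
                    scores[item] = scores.get(item, 0.0) + sim * r
        return sorted(scores, key=scores.get, reverse=True)[:k]

    print(recommend("alice"))  # ['gattaca']

Whether “gattaca” is a “correct” recommendation for alice is exactly the user-relative judgment I have in mind.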

December 3, 2013

Annual Christmas Tree Lecture (Knuth)

Filed under: CS Lectures,Graphs,Mathematics,Trees — Patrick Durusau @ 6:23 pm

Computer Musing by Professor Donald E. Knuth.

From the webpage:

Professor Knuth will present his 19th Annual Christmas Tree Lecture on Monday, December 9, 2013 at 7:00 pm in NVIDIA Auditorium in the new Huang Engineering Center, 475 Via Ortega, Stanford University (map). The topic will be Planar Graphs and Ternary Trees. There is no admission charge or registration required. For those unable to come to Stanford, register for the live webinar broadcast.

No doubt heavy sledding but what better way to prepare for the holiday season?

Date: Monday, December 9, 2013

Time:
7 p.m. – 8 p.m. Pacific
10 p.m. – 11 p.m. Eastern
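If you want a small warm-up on ternary trees before the lecture, here is a minimal counting sketch in Python (my own illustration, not taken from Knuth’s talk):

    from math import comb

    # Count ternary trees with n internal nodes: t(0) = 1 and
    # t(n) = sum of t(i) * t(j) * t(k) over i + j + k = n - 1.
    def ternary_counts(n_max):
        t = [0] * (n_max + 1)
        t[0] = 1
        for n in range(1, n_max + 1):
            t[n] = sum(t[i] * t[j] * t[n - 1 - i - j]
                       for i in range(n)
                       for j in range(n - i))
        return t

    print(ternary_counts(5))                                   # [1, 1, 3, 12, 55, 273]
    print([comb(3 * n, n) // (2 * n + 1) for n in range(6)])   # the same, closed form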

July 22, 2013

NAACL 2013 – Videos!

NAACL 2013

Videos of the presentations at the 2013 Conference of the North American Chapter of the Association for Computational Linguistics.

Along with the papers, you should not lack for something to do over the summer!

June 19, 2013

Symposium on Visions of the Theory of Computing

Filed under: Computer Science,CS Lectures — Patrick Durusau @ 3:08 pm

Symposium on Visions of the Theory of Computing by the Simons Institute for the Theory of Computing.

Description:

May 29-31, 2013, UC Berkeley: This three-day symposium brought together distinguished speakers and participants from the Bay Area and all over the world to celebrate both the excitement of fundamental research on the Theory of Computing, and the accomplishments and promise of computational research in effecting progress in other sciences – the two pillars of the research agenda of the Institute.

I’m not sure if it is YouTube or the people who post videos to YouTube, but customary sorting rules, such as sorting by an author’s last name, appear to be lacking.

To facilitate your finding the lecture of your choice, I have sorted the videos by the speaker’s last name.

Should you have the occasion to post a list of videos, papers, presentations, please be courteous to readers who want to scan a list sorted by author/presenter.

It will still appear to be a random ordering to those unfamiliar with that technique. (Or ctrl-f or search engines.) 😉

June 14, 2013

NAACL ATL 2013

2013 Conference of the North American Chapter of the Association for Computational Linguistics

The NAACL conference wraps up tomorrow in Atlanta but in case you are running low on summer reading materials:

Proceedings for the 2013 NAACL and *SEM conferences. Not quite 180MB but close.

Scanning the accepted papers will give you an inkling of what awaits.

Enjoy!

May 10, 2013

Harvard Stat 221

Filed under: CS Lectures,Data Science,Mathematics,Statistics — Patrick Durusau @ 6:36 pm

Harvard Stat 221 “Statistical Computing and Visualization” by Sergiy Nesterko.

From the post:

Stat 221 is Statistical Computing and Visualization. It’s a graduate class on analyzing data without losing scientific rigor, and communicating your work. Topics span the full cycle of a data-driven project including project setup, design, implementation, and creating interactive user experiences to communicate ideas and results. We covered current theory and philosophy of building models for data, computational methods, and tools such as d3js, parallel computing with MPI, R.

See Sergiy’s post for the lecture slides from this course.

May 6, 2013

Virtual School summer courses…

Filed under: BigData,CS Lectures,GPU,Multi-Core — Patrick Durusau @ 7:02 pm

Virtual School summer courses on data-intensive and many-core computing

From the webpage:

Graduate students, post-docs and professionals from academia, government, and industry are invited to sign up now for two summer school courses offered by the Virtual School of Computational Science and Engineering.

These Virtual School courses will be delivered to sites nationwide using high-definition videoconferencing technologies, allowing students to participate at a number of convenient locations where they will be able to work with a cohort of fellow computational scientists, have access to local experts, and interact in real time with course instructors.

The Data Intensive Summer School focuses on the skills needed to manage, process, and gain insight from large amounts of data. It targets researchers from the physical, biological, economic, and social sciences who need to deal with large collections of data. The course will cover the nuts and bolts of data-intensive computing, common tools and software, predictive analytics algorithms, data management, and non-relational database models.

(…)

For more information about the Data-Intensive Summer School, including pre-requisites and course topics, visit http://www.vscse.org/summerschool/2013/bigdata.html.

The Proven Algorithmic Techniques for Many-core Processors summer school will present students with the seven most common and crucial algorithm and data optimization techniques to support successful use of GPUs for scientific computing.

Studying many current GPU computing applications, the course instructors have learned that the limits of an application’s scalability are often related to some combination of memory bandwidth saturation, memory contention, imbalanced data distribution, or data structure/algorithm interactions. Successful GPU application developers often adjust their data structures and problem formulation specifically for massive threading and execute their threads leveraging shared on-chip memory resources for bigger impact. The techniques presented in the course can improve performance of applicable kernels by 2-10X in current processors while improving future scalability.

(…)

For more information about the Proven Algorithmic Techniques for Many-core Processors course, including pre-requisites and course topics, visit http://www.vscse.org/summerschool/2013/manycore.html.

Think of it as summer camp. For $100 (waived at some locations), it would be hard to do better.

May 1, 2013

Vote for Web Science MOOC!

Filed under: CS Lectures,WWW — Patrick Durusau @ 9:05 am

Please help me to realize my Web science massive open online course by René Pickhardt.

René has designed a Web Science MOOC but needs your vote at: https://moocfellowship.org/submissions/web-science to get the course funded.

Details on the course are at: Please help me to realize my Web science massive open online course.

The Web is important but to be honest, I am hopeful success here will encourage René to do a MOOC on graphs.

So I have an ulterior motive for promoting this particular MOOC. 😉

April 8, 2013

Spring/Summer Reading – 2013

Filed under: Books,CS Lectures — Patrick Durusau @ 1:05 pm

The ACM has released:

Best Reviews (2012)

and,

Notable Computing Books and Articles of 2012

Before you hit the summer conference or vacation schedule, visit your local bookstore or load up your ebook reader!

I first saw this at Best Reviews & Notable Books and Articles of 2012 by Shar Steed.

