A programmer's notes, about everything and nothing. But mostly about the profession, I suppose.

2016-05-22

dynamic vs static

An interesting chart caught my eye:

 The following are some charts that compare the number of issues labelled "bug" versus the number of repositories in GitHub for different languages. I also tried removing some noise by just using repositories with some stars, on the assumption that repositories with no stars means that nobody is using them, so nobody will report bugs against them.

In green, in the "advanced" static typed languages corner: Haskell, Scala and F#.
In orange, in the "old and boring" static typed languages corner: Java, C++ and Go.
In red, in the dynamic typed language corner: JavaScript, Ruby, Python, Clojure and Erlang


http://labs.ig.com/static-typing-promise

The author tries to figure out whether it is true that static typing helps write bug-free programs, and in the end concludes that, apparently, no, it does not.

Personally, looking at this chart, I see a dependence (a correlation) between the number of bugs and the "elitism" of the language.
Simplifying a bit: the languages in the upper, low-bug half of the list are used by programmers with a fair amount of experience and brains. Naturally, such specialists write nearly bug-free code from the start.
Which makes it sad to see that my beloved Python sits in the lower half of the list. It has turned into a mass-market commodity.

And if we want to reason about the benefits of static typing, the study needs a lot of additional data. For example, the amount of time an average programmer spends writing and debugging a given block of code.





original post http://vasnake.blogspot.com/2016/05/dynamic-vs-static.html

2016-05-19

What are you doing? Teaching

“Hello, Professor McGonagall,” said Moody calmly, bouncing the ferret still higher.
“What — what are you doing?” said Professor McGonagall, her eyes following the bouncing ferret’s progress through the air.
“Teaching,” said Moody.
“Teach — Moody, is that a student?” shrieked Professor McGonagall, the books spilling out of her arms.
“Yep,” said Moody.

It sounds absolutely marvelous performed by Stephen Fry (the Harry Potter and the Goblet of Fire audiobook, Book 4). I heard it in the morning and it kept me in a great mood for half the day.

In the movie it is not quite the same
https://youtu.be/HobPkv5TELA?t=24s





original post http://vasnake.blogspot.com/2016/05/what-are-you-doing-teaching.html

To ensure the security and continuing stability

Bazinga

To ensure stability and security ...

I proclaim myself Emperor

This is from
Star Wars Episode III - Revenge of the Sith (2005),
the speech of Putin Palpatine, the dark lord, in the Senate.
Watch from the 33rd second


https://youtu.be/dO1QifRr3J0?t=33s

Characteristically, even though this has been chewed over in every textbook, the lesson never sinks in.

Incidentally, Star Wars demonstrates very vividly how to seize power in a galaxy with a few simple intrigues.



original post http://vasnake.blogspot.com/2016/05/to-ensure-security-and-continuing.html

2016-05-16

What sets a moth-eaten programmer apart

On the question of what distinguishes a moth-eaten programmer from a green newbie:

Here are 42 programming recommendations that will help you avoid a lot of mistakes and save time and nerves. The author of the recommendations is Andrey Karpov, technical director of "СиПроВер", the company that develops the PVS-Studio static code analyzer. Over his career he has seen an enormous number of ways to shoot yourself in the foot, so he clearly has something to tell the reader. Each recommendation is accompanied by a practical example, which confirms how relevant the issue is. The tips are aimed at C/C++ programmers.
http://www.viva64.com/ru/b/0391/

A seasoned warrior has already seen all these rakes; many of them have left commemorative marks on his forehead.
So there is no need to explain to a senior that programs should be written simply and clearly, for people, so that people can read them easily and effortlessly.
The less code there is (hello, functional programmers) and the clearer it is, the easier it is to avoid all sorts of mistakes.
Bugs always were, are, and will be. The goal is to find and eliminate them as early as possible.




original post http://vasnake.blogspot.com/2016/05/blog-post.html

2016-05-12

Exercises

Exercises and problems for the course
Algorithms, Part II. Princeton University, Robert Sedgewick

A brief summary of the 7-week course

A very useful and fairly heavy course for wannabe programmers:
Algorithms, Part II by Robert Sedgewick
Algorithms, 4th edition textbook libraries https://github.com/kevin-wayne/algs4

Extras
-- Python implementations of selected Princeton Java Algorithms and Clients by Robert Sedgewick and Kevin Wayne
-- Scala translations of Robert Sedgewick's Java Algorithms

The first part of the course is covered here: http://vasnake.blogspot.com/2016/03/algorithms-part-i-princeton-university.html

This course covers the essential information that every serious programmer needs to know about algorithms and data structures, with emphasis on applications and scientific performance analysis of Java implementations.
Part II covers graph-processing algorithms, including minimum spanning tree and shortest paths algorithms,
and string processing algorithms, including string sorts, tries, substring search, regular expressions, and data compression,
and concludes with an overview placing the contents of the course in a larger context.

The course is heavy for two reasons:
– Robert, to put it bluntly, is a lousy lecturer. To me his voice acts like a powerful sedative.
Things go somewhat more briskly if you set the playback speed to 1.25.
On top of that, he leaves some important points for self-study, as in: go read my book.
You will have to; the book is useful.
– Labor-intensive exercises and tricky assignments (the grading test suite is a particular treat). You have to think long and hard and dig through extra materials.

So it eats up a ton of time. But it is worth it.

Covered over the 6 weeks:

Chapter 4: Graphs surveys the most important graph processing problems, including depth-first search, breadth-first search, minimum spanning trees, and shortest paths.
Chapter 5: Strings investigates specialized algorithms for string processing, including radix sorting, substring search, tries, regular expressions, and data compression.
And so on.

Namely:

Week 1

Lecture: Undirected Graphs. We define an undirected graph API and consider the adjacency-matrix and adjacency-lists representations. We introduce two classic algorithms for searching a graph—depth-first search and breadth-first search. We also consider the problem of computing connected components and conclude with related problems and applications.
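
To make the adjacency-lists idea concrete, here is a minimal Python sketch (the course itself uses Java; the dict-of-lists representation and the function name are just my choices) of breadth-first search, which also yields the connected component of the source as a by-product:

from collections import deque

def bfs(graph, source):
    """Breadth-first search over an adjacency-lists graph (dict: vertex -> list of neighbors).
    Returns shortest distances (in edges) from source to every reachable vertex."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for w in graph[v]:
            if w not in dist:              # first visit = shortest path to w found
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

# usage: two connected components; the keys of the result form the component of the source
g = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4], 4: [3, 5], 5: [4]}
print(bfs(g, 0))                           # {0: 0, 1: 1, 2: 1}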

Lecture: Directed Graphs. In this lecture we study directed graphs. We begin with depth-first search and breadth-first search in digraphs and describe applications ranging from garbage collection to web crawling. Next, we introduce a depth-first search based algorithm for computing the topological order of an acyclic digraph. Finally, we implement the Kosaraju-Sharir algorithm for computing the strong components of a digraph.
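
A minimal sketch of the DFS-postorder idea behind topological sorting, in Python rather than the course's Java; the dict-of-lists digraph and the function name are illustrative assumptions:

def topological_order(digraph):
    """Topological order of a DAG = reverse DFS postorder.
    digraph: dict vertex -> list of vertices it points to; assumed acyclic."""
    visited, postorder = set(), []

    def dfs(v):
        visited.add(v)
        for w in digraph.get(v, []):
            if w not in visited:
                dfs(w)
        postorder.append(v)                # v is finished only after everything reachable from it

    for v in digraph:
        if v not in visited:
            dfs(v)
    return list(reversed(postorder))

# usage: edges 0->1, 0->2, 1->3, 2->3
print(topological_order({0: [1, 2], 1: [3], 2: [3], 3: []}))   # e.g. [0, 2, 1, 3]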

Week 2

Lecture: Minimum Spanning Trees. In this lecture we study the minimum spanning tree problem. We begin by considering a generic greedy algorithm for the problem. Next, we consider and implement two classic algorithms for the problem—Kruskal's algorithm and Prim's algorithm. We conclude with some applications and open problems.
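
As a rough illustration of Kruskal's algorithm (not the course's Java implementation), a Python sketch with a bare-bones union-find; the edge format (weight, v, w) is my own convention:

def kruskal_mst(n, edges):
    """Kruskal's algorithm: take edges in increasing weight order,
    add an edge unless it would create a cycle.
    n: number of vertices 0..n-1; edges: list of (weight, v, w). Returns the MST edges."""
    parent = list(range(n))

    def find(v):                           # union-find with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst = []
    for weight, v, w in sorted(edges):
        rv, rw = find(v), find(w)
        if rv != rw:                       # endpoints in different trees: no cycle
            parent[rv] = rw
            mst.append((weight, v, w))
    return mst

# usage: 4 vertices, small weighted graph
edges = [(1, 0, 1), (2, 1, 2), (3, 0, 2), (4, 2, 3)]
print(kruskal_mst(4, edges))               # [(1, 0, 1), (2, 1, 2), (4, 2, 3)]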

Lecture: Shortest Paths. In this lecture we study shortest-paths problems. We begin by analyzing some basic properties of shortest paths and a generic algorithm for the problem. We introduce and analyze Dijkstra's algorithm for shortest-paths problems with nonnegative weights. Next, we consider an even faster algorithm for DAGs, which works even if the weights are negative. We conclude with the Bellman-Ford-Moore algorithm for edge-weighted digraphs with no negative cycles. We also consider applications ranging from content-aware fill to arbitrage.
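
A compact Python sketch of Dijkstra's algorithm with a binary heap and lazy deletion of stale entries (the course's indexed priority queue version avoids duplicates instead); the graph format is an assumption:

import heapq

def dijkstra(graph, source):
    """Dijkstra's algorithm for nonnegative edge weights.
    graph: dict vertex -> list of (neighbor, weight). Returns shortest distances."""
    dist = {source: 0}
    pq = [(0, source)]                     # (distance, vertex); duplicates allowed
    while pq:
        d, v = heapq.heappop(pq)
        if d > dist.get(v, float('inf')):
            continue                       # stale entry, a shorter path was already found
        for w, weight in graph.get(v, []):
            nd = d + weight
            if nd < dist.get(w, float('inf')):   # edge relaxation
                dist[w] = nd
                heapq.heappush(pq, (nd, w))
    return dist

# usage
g = {'a': [('b', 2), ('c', 5)], 'b': [('c', 1)], 'c': []}
print(dijkstra(g, 'a'))                    # {'a': 0, 'b': 2, 'c': 3}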

Week 3

Lecture: Maximum Flow and Minimum Cut. In this lecture we introduce the maximum flow and minimum cut problems. We begin with the Ford-Fulkerson algorithm. To analyze its correctness, we establish the maxflow-mincut theorem. Next, we consider an efficient implementation of the Ford-Fulkerson algorithm, using the shortest augmenting path rule. Finally, we consider applications, including bipartite matching and baseball elimination.
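
A rough Python sketch of Ford-Fulkerson with BFS-chosen (shortest) augmenting paths; the dict-of-capacities input format is my assumption, and the course's Java version is organized differently (FlowEdge/FlowNetwork classes):

from collections import deque

def max_flow(capacity, s, t):
    """Ford-Fulkerson with shortest augmenting paths.
    capacity: dict (v, w) -> capacity. Returns the value of a maximum s-t flow."""
    flow = {e: 0 for e in capacity}

    def residual(v, w):
        # leftover forward capacity plus flow on (w, v) that can be cancelled
        return capacity.get((v, w), 0) - flow.get((v, w), 0) + flow.get((w, v), 0)

    vertices = {v for e in capacity for v in e}
    total = 0
    while True:
        # BFS in the residual graph for a shortest augmenting path s -> t
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:
            v = queue.popleft()
            for w in vertices:
                if w not in parent and residual(v, w) > 0:
                    parent[w] = v
                    queue.append(w)
        if t not in parent:
            return total                   # no augmenting path left: flow is maximal
        # bottleneck = smallest residual capacity along the path
        path, w = [], t
        while parent[w] is not None:
            path.append((parent[w], w))
            w = parent[w]
        bottleneck = min(residual(v, w) for v, w in path)
        # augment: cancel reverse flow first, then use forward capacity
        for v, w in path:
            back = min(bottleneck, flow.get((w, v), 0))
            if back:
                flow[(w, v)] -= back
            if bottleneck - back:
                flow[(v, w)] += bottleneck - back
        total += bottleneck

# usage: small network, expected maxflow value 3
cap = {('s', 'a'): 2, ('s', 'b'): 2, ('a', 'b'): 1, ('a', 't'): 1, ('b', 't'): 2}
print(max_flow(cap, 's', 't'))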

Lecture: Radix Sorts. In this lecture we consider specialized sorting algorithms for strings and related objects. We begin with a subroutine (key-indexed counting) to sort integers in a small range. We then consider two classic radix sorting algorithms—LSD and MSD radix sorts. Next, we consider an especially efficient variant, which is a hybrid of MSD radix sort and quicksort known as 3-way radix quicksort. We conclude with suffix sorting and related applications.
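
A Python sketch of LSD radix sort built on key-indexed counting, for fixed-width string keys; the 256-character alphabet is an assumption:

def lsd_sort(strings, width):
    """LSD radix sort for fixed-width strings: key-indexed counting on each character
    position, from the rightmost to the leftmost. Stable, runs in O(width * n)."""
    radix = 256                            # extended ASCII alphabet
    a = list(strings)
    for d in range(width - 1, -1, -1):     # rightmost character first
        count = [0] * (radix + 1)
        for s in a:                        # count frequencies of character d
            count[ord(s[d]) + 1] += 1
        for r in range(radix):             # frequencies -> starting indices
            count[r + 1] += count[r]
        aux = [None] * len(a)
        for s in a:                        # distribute, keeping the previous order (stability)
            aux[count[ord(s[d])]] = s
            count[ord(s[d])] += 1
        a = aux
    return a

# usage: keys of equal length
print(lsd_sort(["dab", "cab", "fad", "bad", "dad", "ebb", "ace"], 3))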

Week 5

Lecture: Tries. In this lecture we consider specialized algorithms for symbol tables with string keys. Our goal is a data structure that is as fast as hashing and even more flexible than binary search trees. We begin with multiway tries; next we consider ternary search tries. Finally, we consider character-based operations, including prefix match and longest prefix, and related applications.
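
A minimal Python trie symbol table with get, put and prefix queries; unlike the course's R-way trie with an array of R links per node, this sketch uses a dict per node and a '$' sentinel key to hold values (assuming keys never contain '$'):

class TrieST:
    """A toy symbol table with string keys, backed by a trie of nested dicts."""
    def __init__(self):
        self.root = {}                     # node: dict char -> child node; '$' holds the value

    def put(self, key, value):
        node = self.root
        for c in key:
            node = node.setdefault(c, {})
        node['$'] = value

    def get(self, key):
        node = self.root
        for c in key:
            if c not in node:
                return None
            node = node[c]
        return node.get('$')

    def keys_with_prefix(self, prefix):
        node = self.root
        for c in prefix:                   # walk down to the subtrie of the prefix
            if c not in node:
                return []
            node = node[c]
        out = []
        def collect(node, path):           # gather every key stored below this node
            for c, child in node.items():
                if c == '$':
                    out.append(path)
                else:
                    collect(child, path + c)
        collect(node, prefix)
        return out

# usage
st = TrieST()
for i, k in enumerate(["she", "sells", "sea", "shells"]):
    st.put(k, i)
print(st.get("sea"), st.keys_with_prefix("sh"))   # 2 ['she', 'shells'] (order may vary)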

Lecture: Substring Search. In this lecture we consider algorithms for searching for a substring in a piece of text. We begin with a brute-force algorithm, whose running time is quadratic in the worst case. Next, we consider the ingenious Knuth-Morris-Pratt algorithm whose running time is guaranteed to be linear in the worst case. Then, we introduce the Boyer-Moore algorithm, whose running time is sublinear on typical inputs. Finally, we consider the Rabin-Karp fingerprint algorithm, which uses hashing in a clever way to solve the substring search and related problems.
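
A Python sketch of Knuth-Morris-Pratt via the prefix (failure) function; the course presents the equivalent DFA formulation instead, but the guarantee is the same: the text pointer never backs up, so the running time is linear in the worst case:

def kmp_search(text, pattern):
    """Knuth-Morris-Pratt substring search.
    Returns the index of the first occurrence of pattern in text, or -1."""
    if not pattern:
        return 0
    # fail[i] = length of the longest proper prefix of pattern[:i+1] that is also its suffix
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # scan the text, reusing fail so no text character is examined twice
    k = 0
    for i, c in enumerate(text):
        while k > 0 and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            return i - len(pattern) + 1
    return -1

print(kmp_search("abacadabrabracabracadabrabrabracad", "abracadabra"))  # 14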

Week 6

Lecture: Regular Expressions. A regular expression is a method for specifying a set of strings. Our topic for this lecture is the famous grep algorithm that determines whether a given text contains any substring from the set. We examine an efficient implementation that makes use of our digraph reachability implementation from Week 1.
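
The heart of the grep algorithm is NFA simulation via digraph reachability: maintain the set of states reachable from the start, and after every text character follow the match transitions and then all epsilon edges. Below is a Python sketch of that simulation with a hand-built NFA for the pattern A*B; the state numbering and dict encodings are my own (the course builds the NFA automatically from the regex, and for substring search wraps the pattern in ".*" on both sides; this sketch matches the whole text against the pattern):

def nfa_matches(text, match_edges, eps_edges, start, accept):
    """Simulate an NFA by keeping the set of reachable states.
    match_edges: dict (state, char) -> next state; eps_edges: dict state -> list of states."""
    def eps_closure(states):
        stack, seen = list(states), set(states)
        while stack:                       # digraph reachability over the epsilon edges
            s = stack.pop()
            for t in eps_edges.get(s, []):
                if t not in seen:
                    seen.add(t)
                    stack.append(t)
        return seen

    reachable = eps_closure({start})
    for c in text:
        reachable = eps_closure({match_edges[(s, c)]
                                 for s in reachable if (s, c) in match_edges})
    return accept in reachable

# hand-built NFA for A*B: states 0 = 'A', 1 = '*', 2 = 'B', 3 = accept
eps = {0: [1], 1: [0, 2]}                  # closure edges for A*, and '*' -> next state
match = {(0, 'A'): 1, (2, 'B'): 3}
print(nfa_matches("AAB", match, eps, 0, 3))   # True
print(nfa_matches("BA",  match, eps, 0, 3))   # False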

Lecture: Data Compression. We study and implement several classic data compression schemes, including run-length coding, Huffman compression, and LZW compression. We develop efficient implementations from first principles using a Java library for manipulating binary data that we developed for this purpose, based on priority queue and symbol table implementations from earlier lectures.
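
A Python sketch of the Huffman idea, building only the code table (no bit-level I/O): repeatedly merge the two least-frequent subtrees with a priority queue, prepending one bit to every codeword in the merged subtrees; representing each subtree as a dict of codewords is my shortcut for the trie:

import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table for the characters of text."""
    # heap entries: (frequency, tie_breaker, {char: codeword_so_far})
    heap = [(freq, i, {ch: ''}) for i, (ch, freq) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)    # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        merged = {ch: '0' + code for ch, code in t1.items()}
        merged.update({ch: '1' + code for ch, code in t2.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes("abracadabra")
print(codes)                               # e.g. {'a': '0', 'c': '100', 'd': '101', 'b': '110', 'r': '111'}
print(sum(len(codes[c]) for c in "abracadabra"), "bits vs", 8 * len("abracadabra"))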

Week 7

Lecture: Reductions. In this lecture our goal is to develop ways to classify problems according to their computational requirements. We introduce the concept of reduction as a technique for studying the relationship among problems. People use reductions to design algorithms, establish lower bounds, and classify problems in terms of their computational requirements.

Lecture (optional): Linear Programming. The quintessential problem-solving model is known as linear programming, and the simplex method for solving it is one of the most widely used algorithms. In this lecture, we give an overview of this central topic in operations research and describe its relationship to algorithms that we have considered.
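
For a feel of the model, a tiny sketch that feeds the brewer's problem (the lecture's running example, if I remember the numbers right: maximize 13A + 23B under corn, hops and malt constraints) to an off-the-shelf LP solver, assuming SciPy is installed; scipy.optimize.linprog minimizes, so the objective is negated:

from scipy.optimize import linprog

# maximize 13*A + 23*B  ->  minimize -13*A - 23*B
c = [-13, -23]
A_ub = [[5, 15],     # corn:  5*A + 15*B <= 480
        [4, 4],      # hops:  4*A +  4*B <= 160
        [35, 20]]    # malt: 35*A + 20*B <= 1190
b_ub = [480, 160, 1190]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # expected: A = 12, B = 28, profit 800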

Lecture: Intractability. Is there a universal problem-solving model to which all problems that we would like to solve reduce and for which we know an efficient algorithm? You may be surprised to learn that we do not know the answer to this question. In this lecture we introduce the complexity classes P, NP, and NP-complete, pose the famous P=NP question, and consider implications in the context of algorithms that we have treated in this course.

