Apache CouchDB

Apache CouchDB™ lets you access your data where you need it by defining the Couch Replication Protocol, which is implemented by a variety of projects and products spanning every imaginable computing environment, from globally distributed server clusters through mobile phones to web browsers. Software compatible with the Couch Replication Protocol includes PouchDB, Cloudant, and Couchbase Lite.

AdonisJs

AdonisJs is a true MVC framework for Node.js. It encapsulates all the boring parts of web programming and offers you a nice, clean API to work with.


Features

  • Routing

    AdonisJs has one of the most advanced routing engines, with support for groups, named routes, subdomains, and controller bindings.

  • Ace Commands

    Ace is a fully baked command-line tool shipped with AdonisJs. Not only can you make use of the available commands, creating your own commands is a matter of seconds.

  • Authentication System

    Inbuilt authentication layer to keep your applications secure, with Sessions, JWT, Basic Auth, and Personal API tokens.

  • ES2015 Controllers

    AdonisJs makes use of ES2015 features. You define your controllers as classes, and controller methods make use of ES2015 generators.

  • Emailer

    Sending emails is simple and intuitive with inbuilt drivers for Mailgun, Mandrill, SMTP, and Amazon SES.

  • Data Validator

    Validating data should be simple and fun. The AdonisJs handmade validator makes it simple to validate nested data with an expressive set of rules.

  • Security Features

    All AdonisJs applications are pre-configured to handle CSRF, CORS, XSS and other Web attacks out of the box.

  • SQL ORM

    The AdonisJs ORM (Lucid) is an implementation of Active Record. Leave your traditional SQL queries behind and make use of shiny data models.

  • Event Emitter

How to retrieve the ID after a MySQL Insert in Python-pymysql

When you have an auto_increment field on a table and you're doing a batch import of new data, you often want to insert a row, then insert more rows that have a foreign key referencing the first row. For example, you want to insert a COMPANY entity, then insert a few dozen PERSON entities under that company. Assume you have no unique, not-null field on the COMPANY entity (if company_name were unique and not-null, you could just issue a SELECT immediately after inserting to get its ID). Assume also that you want a thread-safe solution: you could select for the highest ID in the table immediately after inserting, but since MySQL's default storage engine is not transaction-safe, another thread could insert another company between your insert and your select. What you need, then, is the equivalent of mysql_insert_id().

The MySQLdb documentation mentions conn.insert_id(), but this appears to have been deprecated. This really should be on the cursor object anyway. And behold! There is a lastrowid attribute on it! It's just a little bit underdocumented:

Here we are using the pymysql client library:

import pymysql

# Connection parameters for the example database; adjust for your environment.
conn = pymysql.connect(
    host='localhost',
    user='student_record',
    passwd='student_record_123456',
    db='student_record')
c = conn.cursor()

name = 'john doe'
dept = 'information technology'

# Parameterized insert; pymysql escapes the values for us.
query = "INSERT INTO student (name, dept) VALUES (%s, %s)"
c.execute(query, (name, dept))
conn.commit()

print(c.lastrowid)  # prints the auto_increment id of the row just inserted
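The COMPANY/PERSON batch-import scenario described earlier falls out naturally from cursor.lastrowid: capture the parent's id immediately after its insert, then use it for the children. Here is a minimal sketch using the standard-library sqlite3 module (whose DB-API cursors expose the same lastrowid attribute) so it runs without a MySQL server; the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for the real MySQL schema.
conn = sqlite3.connect(":memory:")
c = conn.cursor()
c.execute("CREATE TABLE company (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
c.execute("CREATE TABLE person (id INTEGER PRIMARY KEY AUTOINCREMENT,"
          " company_id INTEGER REFERENCES company(id), name TEXT)")

# Insert the parent row and capture its auto-increment id right away.
c.execute("INSERT INTO company (name) VALUES (?)", ("Acme Widgets",))
company_id = c.lastrowid

# Child rows reference the parent through the captured id -- no extra
# SELECT needed, and no race against other writers on the same table.
for person_name in ("alice", "bob", "carol"):
    c.execute("INSERT INTO person (company_id, name) VALUES (?, ?)",
              (company_id, person_name))
conn.commit()
```

Because lastrowid is scoped to your own cursor's last insert, concurrent inserts by other connections cannot change the value you read, which is exactly the thread-safety property discussed above.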

What is Gearman?

Gearman provides a generic application framework to farm out work to other machines or processes that are better suited to do the work. It allows you to do work in parallel, to load balance processing, and to call functions between languages. It can be used in a variety of applications, from high-availability web sites to the transport of database replication events. In other words, it is the nervous system for how distributed processing communicates. A few strong points about Gearman:

  • Open Source – It’s free! (in both meanings of the word) Gearman has an active open source community that is easy to get involved with if you need help or want to contribute. Worried about licensing? Gearman is BSD-licensed.
  • Multi-language – There are interfaces for a number of languages, and this list is growing. You also have the option to write heterogeneous applications with clients submitting work in one language and workers performing that work in another.
  • Flexible – You are not tied to any specific design pattern. You can quickly put together distributed applications using any model you choose, one of those options being Map/Reduce.
  • Fast – Gearman has a simple protocol and interface with an optimized, and threaded, server written in C/C++ to minimize your application overhead.
  • Embeddable – Since Gearman is fast and lightweight, it is great for applications of all sizes. It is also easy to introduce into existing applications with minimal overhead.
  • No single point of failure – Gearman can not only help scale systems, but can do it in a fault tolerant way.
  • No limits on message size – Gearman supports single messages up to 4 GB in size. Need to do something bigger? No problem: Gearman can chunk messages.
  • Worried about scaling? – Don’t worry about it with Gearman. Craigslist, Tumblr, Yelp, Etsy, and others rely on it; discover what they have known for years.

This matters most when you are working with a big database and data needs to be processed in parallel.

generatedata.com


Ever needed custom-formatted sample/test data, like, bad? Well, that’s the idea of this script. It’s a free, open source tool written in JavaScript, PHP and MySQL that lets you quickly generate large volumes of custom data in a variety of formats for use in testing software, populating databases, and so on and so forth.

This site offers an online demo where you’re welcome to tinker around to get a sense of what the script does, what features it offers and how it works. Then, once you’ve whetted your appetite, there’s a free, fully functional, GNU-licensed version available for download. Alternatively, if you want to avoid the hassle of setting it up on your own server, you can donate $20 or more to get an account on this site, letting you generate up to 5,000 records at a time (instead of the maximum 100) and save your data sets. Click on the Donate tab for more information.