In the spirit of making diffs of rich information easier to parse, SVG images are now viewable and diffable on GitHub!
As always, you can find more details in our help documentation.
Keep track of all of your issues and pull requests with the new Issues Dashboard and the new Pull Requests Dashboard.
When we rebuilt GitHub Issues earlier this summer, we made it easier to search and filter issues and pull requests in a repository. Now it's time to think bigger: these new dashboards let you manage your work across all of your repositories at once. You can find links to them at the top of your News Feed.
Use them to quickly find issues you've created. Or pull requests that mention your username. Or issues that have been assigned to you. Or go ahead and use any of our custom advanced search filters and create your own often-used search... the sky's the limit.
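For example (using octocat as a stand-in for your own username), the same qualifiers that work in a repository's issue search work here too: "is:issue author:octocat" finds issues you've created, "is:pr mentions:octocat" finds pull requests that mention you, and "is:issue assignee:octocat" finds issues assigned to you.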
Update: 2014-09-29 23:10 UTC
We have published an update to the Git Shell tools for GitHub for Windows, which resolves the bash vulnerabilities CVE-2014-6271, CVE-2014-7169, CVE-2014-7186 and CVE-2014-7187. If you are running GitHub for Windows, we strongly encourage you to upgrade. You can check if you are on the latest version, and upgrade if needed, by opening "Tools" -> "About GitHub for Windows..."
Update: 2014-09-28 17:30 UTC
Two new bash vulnerabilities, CVE-2014-7186 and CVE-2014-7187, have been discovered. We have now released special patches of GitHub Enterprise using the latest upstream bash fix for CVE-2014-7186 and CVE-2014-7187. Upgrade instructions have been sent to all GitHub Enterprise customers, and we strongly encourage all customers to upgrade their instance using this latest release. GitHub.com remains unaffected by these vulnerabilities.
Update: 2014-09-26 00:22 UTC
Security patches released yesterday for the bash command vulnerability identified in CVE-2014-6271 turned out to be incomplete, and a new vulnerability, CVE-2014-7169, was identified. We have now released special patches of GitHub Enterprise using the latest upstream bash fix for CVE-2014-7169. Upgrade instructions have been sent to all GitHub Enterprise customers, and we strongly encourage all customers to upgrade their instance using this latest release. GitHub.com remains unaffected by this vulnerability.
Update: 2014-09-25 15:45 UTC
GitHub is closely monitoring new developments that indicate the existing bash patch for CVE-2014-6271 is incomplete. The fix for this new bash vulnerability is still in progress, but we will be releasing a new patch for GitHub Enterprise once it has been resolved. At this time, we still strongly encourage all GitHub Enterprise customers to update their instances using the patch made available yesterday.
This morning it was disclosed that Stephane Chazelas discovered a critical vulnerability in the GNU bash utility present on the vast majority of Unix and Linux systems. Using this vulnerability, an attacker can force the execution of arbitrary commands on an affected server. While these commands may not run with root privileges, they provide a significant vector for further exploitation of a system.
We have released special patches of GitHub Enterprise to fix this vulnerability, and have provided detailed instructions to all our Enterprise customers on how to upgrade their instance. An immediate upgrade is required.
None of the extensive penetration testing we've performed today has uncovered any vulnerability on GitHub.com, including git over SSH. As an added precaution, however, we have patched all systems to ensure the vulnerability is addressed.
The entries are in, the votes are tallied, and we've chosen the winners for our third annual Data Challenge!
Our first place winner is Issue Stats (repository), by @hstove.
Issue Stats tracks the time it takes for your project to close issues or merge pull requests. You can then display this data through a convenient badge in your project's README file or elsewhere. Issue Stats is easy to get started with, easy to understand, and simple to incorporate into your project — be sure to check out the other analyses and visualizations too.
In second place is GitHut (repository), by @littleark.
Moving through the quarters of the calendar year, GitHut compares programming languages by development activity (via active repositories and push volume), collaboration (via forks and issues), social activity (new watchers on GitHub), and the language's age. GitHut makes it easy to compare and contrast languages over many metrics without overwhelming the viewer.
The third place winner is Eigenfaces, by @c-w.
The Eigenfaces project sampled about 8,000 user avatars, after filtering out automatically generated pictures (like Identicons) and other outliers, then used a machine learning technique called principal component analysis to reduce these avatars to the 20 most significant "features". Each "feature" is interpretable as a shape that contributes a significant amount of variance to the entire body of sampled avatars.
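To make the technique concrete, here is a minimal sketch of the same idea using scikit-learn's PCA. The data, image size, and preprocessing are stand-ins for the sampling and filtering the entry actually performed:

```python
import numpy as np
from sklearn.decomposition import PCA

# Stand-in data: 8,000 grayscale avatars, each flattened to a 64x64 = 4,096
# element vector. The real entry filtered out Identicons and other outliers
# before this step.
avatars = np.random.rand(8000, 64 * 64)

# Keep the 20 components that explain the most variance across all avatars.
pca = PCA(n_components=20)
pca.fit(avatars)

# Each row of components_ is one "eigenface": a shape whose weighted
# combination reconstructs much of the variation in the sampled avatars.
eigenfaces = pca.components_.reshape((20, 64, 64))
print(pca.explained_variance_ratio_)
```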
Congratulations to our three winners! The first place winner receives travel, lodging, and attendance to Presenting Data and Information, a one-day course offered by Edward Tufte this December in San Francisco. Our second and third place winners will receive cash prizes.
Each year we receive entries that raise the bar for quality and exceed our expectations — this year was no exception. In fact, this year we received a record 79 entries from all over the world! We want to extend our sincere thanks to every individual and team that submitted an entry this year. We're extremely gratified by the level of craftsmanship and creativity exhibited by your entries, and humbled by the obvious amount of work involved. Thank you!
We hope you enjoyed checking out this year's winning entries. We can't wait for next year.
If you're a Gmail user who gets GitHub notifications via email, you'll notice that we've added subject-line links to issues and pull requests on notification messages.
You can use these links to more quickly access content on GitHub -- all without having to open your email notifications.
This feature is brought to you using Gmail's Actions in the Inbox.
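For the curious, Actions in the Inbox are driven by schema.org markup embedded in the HTML of the notification email. A snippet along these lines is what tells Gmail to render a subject-line link; the URL and wording here are illustrative rather than GitHub's actual markup:

```html
<!-- Illustrative example only; GitHub's real notification markup may differ. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "EmailMessage",
  "potentialAction": {
    "@type": "ViewAction",
    "target": "https://github.com/example/repo/issues/123",
    "name": "View Issue"
  },
  "description": "View issue #123 on GitHub"
}
</script>
```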
GitHub has always been about making open source software better, and today we're launching TODO with a number of partners to help large organizations better support the open source community. If your company has an open source program, or is looking to initiate one, we hope you'll join us.
With TODO, we want to talk openly and develop openly to solve the unique challenges of using and building open source technologies within companies of all sizes. We plan to explore topics like what it looks like to release open source projects, how to shift ownership of projects from companies to the community, and how to make sure that open source projects remain healthy and active.
The inaugural members of TODO include Box, Dropbox, Facebook, GitHub, Google, Khan Academy, Square, Stripe, Twitter, and Walmart Labs. You can visit todogroup.org to sign up and learn more.
Our open and free Internet fuels some of the most incredible innovation in history. It provides new opportunities for billions of people to communicate and collaborate, contributes to economic growth across the world, supports a flourishing open source community, and changes the way we live our lives for the better.
GitHub stands in solidarity with our Internet peers in urging all our US-based users, customers, and fans to call, write, or tweet at your local Senator or Congressperson to let them know you oppose the FCC's proposed changes to the net neutrality landscape.
We believe a new Internet "fast lane" that only privileged businesses can buy into threatens freedom of choice for users, and could ultimately harm the efforts of developers building and shipping both open source and commercial software. Without net neutrality, your users could have a very different experience of your software depending on how much Internet providers are paid.
Congress has the power to take real action to ensure the Internet remains an open platform for speech and commerce. For example, when cable television called into question the traditional distinction between physical point-to-point telephone communication and airwave television broadcasts, Congress responded by adding Title VI to the Communications Act.
GitHub believes that with encouragement and education from the broader Internet community, Congress can be motivated to take action once again. In May of this year, we indicated our support of net neutrality by co-signing a letter to the FCC, but we're not there yet.
We think an open and free Internet is a better Internet, and today we’re asking you to join us by telling Congress you agree.
Commits, compare views, and pull requests now highlight individual changed words instead of the entire changed section, making it easier for you to see exactly what’s been added or removed.
And, of course, it works great with split diffs, too:
At GitHub we say, "it's not fully shipped until it's fast." We've talked before about some of the ways we keep our frontend experience speedy, but that's only part of the story. Our MySQL database infrastructure dramatically affects the performance of GitHub.com. Here's a look at how our infrastructure team seamlessly conducted a major MySQL improvement last August and made GitHub even faster.
Last year we moved the bulk of GitHub.com's infrastructure into a new datacenter with world-class hardware and networking. Since MySQL forms the foundation of our backend systems, we expected database performance to benefit tremendously from an improved setup. But creating a brand-new cluster with brand-new hardware in a new datacenter is no small task, so we had to plan and test carefully to ensure a smooth transition.
A major infrastructure change like this requires measurement and metrics gathering every step of the way. After installing base operating systems on our new machines, it was time to test out our new setup with various configurations. To get a realistic test workload, we used tcpdump to extract SELECT queries from the old cluster that was serving production and replayed them onto the new cluster.
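The post doesn't spell out the exact replay tooling, but a minimal sketch of the replay half might look like this, assuming the captured SELECT statements have already been extracted into a plain-text file (one query per line) and using the pymysql client; the host, credentials, and file name are placeholders:

```python
import time
import pymysql

# Placeholders: the candidate cluster's address, credentials, and the file of
# previously captured SELECT statements are assumptions for illustration.
conn = pymysql.connect(host="new-cluster.example.internal", user="replay",
                       password="secret", database="github")

timings = []
with conn.cursor() as cursor, open("captured_selects.sql") as queries:
    for query in queries:
        query = query.strip()
        if not query:
            continue
        start = time.monotonic()
        cursor.execute(query)
        cursor.fetchall()  # drain the result set so timing includes transfer
        timings.append(time.monotonic() - start)
conn.close()

timings.sort()
print("p99 response time: %.4fs" % timings[int(len(timings) * 0.99)])
```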
MySQL tuning is very workload specific, and well-known configuration settings like innodb_buffer_pool_size often make the most difference in MySQL's performance. But on a major change like this, we wanted to make sure we covered everything, so we took a look at settings like innodb_thread_concurrency, innodb_io_capacity, and innodb_buffer_pool_instances, among others.
We were careful to only make one test configuration change at a time, and to run tests for at least 12 hours. We looked for query response time changes, stalls in queries per second, and signs of reduced concurrency. We observed the output of SHOW ENGINE INNODB STATUS, particularly the SEMAPHORES section, which provides information on workload contention.
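As a rough illustration of that kind of check, the sketch below pulls the SEMAPHORES section out of the status report so it can be logged between runs; the connection details are placeholders and the parsing is deliberately simple:

```python
import re
import pymysql

# Placeholder host and credentials; this just captures the SEMAPHORES section
# of SHOW ENGINE INNODB STATUS so successive test runs can be compared.
conn = pymysql.connect(host="new-cluster.example.internal", user="monitor",
                       password="secret")
with conn.cursor() as cursor:
    cursor.execute("SHOW ENGINE INNODB STATUS")
    status_text = cursor.fetchone()[2]  # row is (Type, Name, Status)
conn.close()

match = re.search(r"SEMAPHORES\n-+\n(.*?)\n-+\n", status_text, re.S)
if match:
    print(match.group(1))
```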
Once we were relatively comfortable with configuration settings, we started migrating one of our largest tables onto an isolated cluster. This served as an early test of the process, gave us more space in the buffer pools of our core cluster and provided greater flexibility for failover and storage. This initial migration introduced an interesting application challenge, as we had to make sure we could maintain multiple connections and direct queries to the correct cluster.
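The post doesn't show the actual connection logic, but the shape of the problem resembles this hypothetical router, which maps each table to the cluster that owns it; the cluster names, hosts, and table list are invented for illustration:

```python
import pymysql

# Hypothetical topology: one large, mostly historic table lives on its own
# cluster; everything else stays on the core cluster.
CLUSTERS = {
    "core":    {"host": "mysql-core.example.internal"},
    "archive": {"host": "mysql-archive.example.internal"},
}
TABLE_TO_CLUSTER = {"events_archive": "archive"}

_connections = {}

def connection_for(table):
    """Return a connection to whichever cluster owns `table`."""
    cluster = TABLE_TO_CLUSTER.get(table, "core")
    if cluster not in _connections:
        cfg = CLUSTERS[cluster]
        _connections[cluster] = pymysql.connect(
            host=cfg["host"], user="app", password="secret", database="github")
    return _connections[cluster]

# Usage: a read against the archived table is routed to the archive cluster.
with connection_for("events_archive").cursor() as cursor:
    cursor.execute("SELECT COUNT(*) FROM events_archive")
    print(cursor.fetchone()[0])
```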
In addition to all our raw hardware improvements, we also made process and topology improvements: we added delayed replicas, faster and more frequent backups, and more read replica capacity. These were all built out and ready for go-live day.
With millions of people using GitHub.com on a daily basis, we did not want to take any chances with the actual switchover. We came up with a thorough checklist before the transition:
We also planned a maintenance window and announced it on our blog to give our users plenty of notice.
At 5am Pacific Time on a Saturday, the migration team assembled online in chat and the process began:
We put the site in maintenance mode, made an announcement on Twitter, and set out to work through the list above:
13 minutes later, we were able to confirm operations of the new cluster:
Then we flipped GitHub.com out of maintenance mode, and let the world know that we were in the clear.
Lots of up-front testing and preparation meant that we kept the work needed on go-live day to a minimum.
In the weeks following the migration, we closely monitored performance and response times on GitHub.com. We found that our cluster migration cut the average GitHub.com page load time by half and the 99th percentile by two-thirds:
During this process we decided that moving larger tables that mostly store historic data to a separate cluster was a good way to free up disk and buffer pool space. This left more resources for our "hot" data, at the cost of splitting some connection logic so the application could query multiple clusters. It proved to be a big win for us, and we are working to reuse this pattern.
You can never do too much acceptance and regression testing for your application. Replicating data from the old cluster to the new one while running acceptance tests and replaying queries was invaluable for tracking down issues and preventing surprises during the migration.
Large changes to infrastructure like this mean a lot of people need to be involved, so pull requests functioned as our primary point of coordination as a team. We had people all over the world jumping in to help.
Deploy day team map:
This created a workflow where we could open a pull request to try out changes, get real-time feedback, and see commits that fixed regressions or errors -- all without phone calls or face-to-face meetings. When everything has a URL that can provide context, it's easy to involve a diverse range of people and simple for them to give feedback.
A full year later, we are happy to call this migration a success — MySQL performance and reliability continue to meet our expectations. And as an added bonus, the new cluster enabled us to make further improvements towards greater availability and query response times. I'll be writing more about those improvements here soon.
We've just released some major improvements to our organization audit logs. As an organization admin, you can now see a running list of events as they're generated across your organization, or you can search for specific activities performed by the members of your org. This data provides you with better security insights and gives you the ability to audit account, team, and repository access over time.
The audit log exposes a number of events like repository deletes, billing updates, new member invites, and team creation. You can see the activities of individual team members, along with a map that highlights the location where events originated. Using the new query interface, you can then filter all these events by the action performed, the team member responsible, the date, repository, and location.
For more information on the audit log, check out the documentation.
You'll now start seeing expanded file listings on GitHub that look like this:
The grey text in the paths — in this case, java/com/netflix/ — means those three folders don't contain any other files. Click on the expanded path to save yourself extra clicks and jump directly to the first non-empty directory.
As a reminder, you can also type t to invoke the File Finder and jump directly to any file you like.
Sometimes, you just want to grab someone's attention when you're finished with some cool code. That's why we've added support for GitHub's @mention feature inside GitHub for Windows. You can now @mention repository collaborators, and when you publish your changes they'll be notified that you'd like them to have a look.
If you already have GitHub for Windows installed, you can update by selecting 'About GitHub for Windows' in the gear menu on the top right. Otherwise, download the latest version from the GitHub for Windows website.
In the summer of 2009, The New York Senate was the first government organization to post code to GitHub, and that fall, Washington DC quickly followed suit. By 2011, cities like Miami, Chicago, and New York; Australian, Canadian, and British government initiatives like GOV.UK; and US Federal agencies like the Federal Communications Commission, General Services Administration, NASA, and Consumer Financial Protection Bureau were all coding in the open as they began to reimagine government for the 21st century.
Fast forward to just last year: The White House Open Data Policy is published as a collaborative, living document, San Francisco laws are now forkable, and government agencies are accepting pull requests from everyday developers.
This is all part of a larger trend towards government adopting open source practices and workflows — a trend that spans not only software, but data, and policy as well — and the movement shows no signs of slowing, with government usage on GitHub nearly tripling in the past year, to exceed 10,000 active government users today.
When government works in the open, it acknowledges the idea that government is the world's largest and longest-running open source project. Open data efforts, like the City of Philadelphia's open flu shot spec, release machine-readable data in open, immediately consumable formats, inviting feedback (and corrections) from the general public and fundamentally exposing who made what change when, a necessary check in any democracy.
Unlike the private sector, however, where open sourcing the "secret sauce" may hurt the bottom line, with government, we're all on the same team. With the exception of, say, football, Illinois and Wisconsin don't compete with one another, nor are the types of challenges they face unique. Shared code prevents reinventing the wheel and helps taxpayer dollars go further, with efforts like the White House's recently released Digital Services Playbook inviting everyday citizens to play a role in making government better, one commit at a time.
However, not all government code is open source. We see that adopting these open source workflows for open collaboration within an agency (or with outside contractors) similarly breaks down bureaucratic walls, and gives like-minded teams the opportunity to work together on common challenges.
It's hard to believe that what started with a single repository just five years ago has blossomed into a movement in which more than 10,000 government employees use GitHub to collaborate on code, data, and policy each day.
Those 10,000 active users make up nearly 500 government organizations, from more than 50 countries:
Government code on GitHub spans more than 7,500 repositories with @alphagov, @NCIP, @GSA, and @ministryofjustice being the top open source contributors with more than 100 public repositories each:
You can learn more about GitHub in government at government.github.com, and if you're a government employee, be sure to join our semi-private peer group to learn best practices for collaborating on software, data, and policy in the open.
Happy collaborative governing!
So many of us here at GitHub have benefited from early exposure to science, technology, engineering, and mathematics that we're always looking for ways to help young people develop a genuine interest in technical fields.
We can't think of a better (or more fun) way to help inspire a life-long love of science than to encourage students to experiment with robotics. That's why we're proud to be a sponsor of FIRST (For Inspiration and Recognition of Science and Technology).
Every year, FIRST brings together coaches, industry mentors, and volunteers to help students from all over the world learn by building robots. Its oldest program, the FIRST Robotics Competition (FRC), is geared towards high school students. In 2014, over 50,000 students on more than 2,000 teams participated in FRC.
This year's competition had teams build robots that could transport balls and score goals, with the assistance of a human player. We traveled to St. Louis, MO, for the world championship and made a video about the competition:
GitHub supports FIRST to help give students and teachers first-hand experience with software development tools used in the industry. Individual teams host their sites with GitHub Pages, students collaborate on control and vision code across teams, mentors teach code review, teams release applications for scouting, and at least one team (@iLiteRobotics) has released 3D models of all of their robot parts.
Here are just a few examples of how FIRST teams are using GitHub:
GitHub Pages Sites
Code on GitHub
3D Models
FIRST is a team effort. Coaches, industry mentors, and volunteers are all essential to the continued success of the organization. Here are some ways that you can get involved:
Individuals
Organizations