Mozilla Nederland: The Dutch Mozilla community

Navrina Singh Joins the Mozilla Foundation Board of Directors

Mozilla Blog - Mon, 22/06/2020 - 18:16

Today, I’m excited to welcome Navrina Singh as a new member of the Mozilla Foundation Board of Directors. You can see comments from Navrina here.

Navrina is the Co-Founder of Credo AI, an AI Fund company focused on auditing and governing machine learning. She is the former Director of Product Development for Artificial Intelligence at Microsoft. Throughout her career she has focused on many aspects of business, including start-up ecosystems, diversity and inclusion, and the development of frontier technologies and products. This breadth of experience is part of the reason she’ll make a great addition to our board.

In early 2020, we began focusing in earnest on expanding Mozilla Foundation’s board. Our recruiting efforts have been geared towards building a diverse group of people who embody the values and mission that bring Mozilla to life and who have the programmatic expertise to help Mozilla, particularly in artificial intelligence.

Since January, we’ve received over 100 recommendations and self-nominations for possible board members. We ran all of the names we received through a desk review process to come up with a shortlist. After extensive conversations, it is clear that Navrina brings the experience, expertise and approach that we seek for the Mozilla Foundation Board.

Prior to working on AI at Microsoft, Navrina spent 12 years at Qualcomm, where she held roles across engineering, strategy and product management. In her last role, as the head of Qualcomm’s technology incubator ‘ImpaQt’, she worked with early start-ups in machine intelligence. Navrina is a Young Global Leader with the World Economic Forum and has previously served on the industry advisory board of the University of Wisconsin-Madison College of Engineering, as well as on the boards of Stella Labs, Alliance for Empowerment and the Technology Council for FIRST Robotics.

Navrina has been named one of Business Insider’s Top Americans changing the world, and her work in Responsible AI has been featured in FORTUNE, GeekWire and other publications. For the past decade she has been thinking critically about the way AI and other emerging technologies impact society. That work included a non-profit initiative called the Marketplace for Ethical and Responsible AI Tools (MERAT), focused on building, testing and deploying AI responsibly. It was through this work that Navrina was introduced to Mozilla. This experience will help inform Mozilla’s own work in trustworthy AI.

We also emphasized throughout this search a desire for more global representation. And while Navrina is currently based in the US, she has a depth of experience partnering with and building relationships across important markets – including China, India and Japan. I have no doubt that this experience will be an asset to the board. Navrina believes that technology can open doors, offering huge value to education, economies and communities in both the developed and developing worlds.

Please join me in welcoming Navrina Singh to the Mozilla Foundation Board of Directors.

P.S. You can read Navrina’s message about why she’s joining Mozilla here.

Background:

Twitter: @navrina_singh

LinkedIn: https://www.linkedin.com/in/navrina/

The post Navrina Singh Joins the Mozilla Foundation Board of Directors appeared first on The Mozilla Blog.

Categories: Mozilla-nl planet

Why I’m Joining the Mozilla Board

Mozilla Blog - Mon, 22/06/2020 - 17:34

Firefox was my window into Mozilla 15 years ago, and it’s through this window that I saw the power of an open and collaborative community driving lasting change. My admiration and excitement for Mozilla were further bolstered in 2018, when Mozilla made key additions to its Manifesto to be more explicit about its mission to guard the open nature of the internet. For me this addendum signalled an actionable commitment to promote equal access to the internet for ALL, irrespective of demographic characteristics. Growing up in a resource-constrained India in the nineties with limited access to global opportunities, this mission truly resonated with me.

Technology should always be in service of humanity – an ethos that has guided my life as a technologist, as a citizen and as a first-time co-founder of Credo.ai. Over the years, I have seen the connection between my values and Mozilla’s commitment deepen. I came to Mozilla as a user for its secure, fast and open product, but I stayed because of this alignment of missions. And today, I’m very honored to join Mozilla’s Board.

Having grown up in India, worked globally and lived in the United States for the past two decades, I have witnessed first-hand the power of informed communities and transparent technologies to drive innovation and change. It is my belief that true societal transformation happens when we empower people and give them the right tools and the agency to create. Since its infancy, Mozilla has enabled exactly that by creating an open internet that serves people first, where individuals can shape their own empowered experiences.

Though I am excited about all the areas of Mozilla’s impact, I joined the Mozilla board to strategically support its leaders on Mozilla’s next frontier: its theory of change for pursuing more trustworthy Artificial Intelligence.

Mozilla has, from the beginning, rejected the idea of the black box, creating a transparent and open ecosystem that makes visible the inner workings and decision making within its organizations and products. I am beyond excited to see that this same mindset of transparency and accountability is what Mozilla’s leaders are bringing to their initiatives in trustworthy Artificial Intelligence (AI).

AI is a defining technology of our times, one that will have a broad impact on every aspect of our lives. Mozilla is committed to mobilizing public awareness of, and demand for, more responsible AI technology, especially in consumer products. In my new role as a Mozilla Foundation Board Member, I am honored to support Mozilla’s AI mission and its partners and allies around the world in building momentum for a responsible and trustworthy digital world.

Today the world crumbles under the weight of multiple pandemics – racism, misinformation, coronavirus – each both fueled and fought by people and technology. Now more than ever, the internet and technology need to bring equal opportunity, verifiable facts, human dignity, individual expression and collaboration among diverse communities to serve humanity. Mozilla has championed these tenets and brought about change for decades. Now, with its frontier focus on trustworthy AI, I am excited to see the continued impact it brings to our world.

We are at a transformational intersection in our lives where we need to critically examine and explore our choices around technology to serve our communities. How can we build technology that is demonstrably worthy of trust? How can we empower people to design systems for transparency and accountability? How can we check the values and biases we are bringing to building this fabric of frontier technology? How can we build diverse communities to catalyze change? How might we build something better, a better world through responsible technology? These questions have shaped my journey. I hope to bring this learning mindset and informed action in service of the Mozilla board and its trustworthy AI mission.

The post Why I’m Joining the Mozilla Board appeared first on The Mozilla Blog.

Categories: Mozilla-nl planet

First Steps Toward Lasting Change

Mozilla Blog - Thu, 18/06/2020 - 21:39

In this moment of rapid change, we recognize that the legacy of racism endures. The events we have seen most recently are not isolated incidents. Racial injustice affects all aspects of life in our society; our collective progress has been insufficient, and Mozilla’s progress has been insufficient. As we said earlier this month, we have work to do.

Today, we are sharing a set of commitments that are a starting point for three areas where we will drive change across Mozilla:

1. Who we are: Our employee base and our communities

To begin, we are committed to significantly increasing Black and Latinx representation in Mozilla in the next two years. We will:

  • Double the percentage of Black and Latinx representation of our U.S. staff. This is a starting point for what Mozilla should look like, not an aspirational end point, and it applies to all levels of the organization.
  • Increase Black representation in the U.S. at the leadership level, aiming for 6% Black employees at the Director level and up, as well as representation on Mozilla Corporation and Mozilla Foundation boards.
  • Create dedicated and comprehensive recruiting, development and inclusion efforts that attract and retain Black and Latinx Mozillians.

These commitments are not just about numbers; they are about people, and that means having an environment that is diverse, inclusive and welcoming, and that addresses issues in people’s lives. Our work ahead is in hiring and retention, in providing the resources to mentor, develop and advance diverse employees, and in ongoing education and reflection for our full staff, so that we can create an environment that reflects our mission and our users.

2. What we build: Our outreach with our products

Educating ourselves is how we begin dismantling systemic racism, and to that end we started by surfacing content via Pocket through Firefox. These collections of works by Black writers and thought leaders are distributed through our Pocket product, with companion promotion through Firefox product messaging. Using our products in this way was new for us. We will continue to explore how we can leverage the functionality and reach of our products and services to advance change.

Our user research and our understanding of our users, their stories and their problems also need broadening. We see this as a journey, and there are undoubtedly other ways our products can contribute more.

3. What we do beyond products: Our broader engagement with the world

How Mozilla shows up in the world and engages to uplift and amplify Black voices in the broader effort to build a better internet, beyond just our own teams, is equally important. We have supported organizations working at the intersection of tech and racial justice, such as the ACLU, Color of Change and the Astraea Foundation. We have already committed to further work at the intersection of technology and racial justice in 2020, because it helps us build a bigger and stronger movement for a healthy internet.

Beyond those existing partnerships, we are also committing to:

  • Direct at least 40% of Mozilla Foundation grants in 2020 to Black-led projects or organizations, with specific targets to come for 2021 and beyond. We see this as critical to the transformation of our organization and the broader healthy internet movement we are part of.
  • Develop and invest in new college engagement programs with Historically Black Colleges and Universities (HBCUs) and Black student networks. We will work closely with professors and students on topics like open source and trustworthy AI, and connect them to the Mozilla community. Mozilla is committed to a culture shift in tech.
  • Focus Mozilla’s brand and social media efforts on lifting up people and organizations standing for Black lives and communities, especially where they’re working at the intersection of technology and racial justice.

By committing to change who we are, what we build and what we do beyond our products, we are talking about transforming how Mozilla shows up in the world in fundamental ways. Making this change will require us to support each other, to allow for mistakes and to embrace learning. But most of all it will require us to focus tenaciously on our values and lean into the idea that we’re creating an open internet for all. This isn’t just essential for this moment in time. It’s critical for the future of Mozilla, the future of the internet and the future of our society.

The post First Steps Toward Lasting Change appeared first on The Mozilla Blog.

Categories: Mozilla-nl planet

Introducing Firefox Private Network’s VPN – Mozilla VPN

Mozilla Futurereleases - Thu, 18/06/2020 - 18:00

We are now spending more time online than ever. At Mozilla, we are working hard to build products to help you take control of your privacy and stay safe online. To help us better understand your needs and challenges, we reached out to you — our users and supporters of Firefox Private Network.

We learned from you and our peers that many of you want to feel safer online without jumping through hoops, so we decided to start with the goal of providing device-level protection. This is why we built the Firefox Private Network’s Mozilla VPN: to help you control how your data is shared within your network. Although there are a lot of VPNs out there, we felt you deserved a VPN with the Mozilla name behind it.

To build the best VPN, we turned to you again. After all, who knows you better than you, right? We started recruiting Beta testers in 2019. It was amazing to see the recruitment attract potential testers from over 200 countries around the world.

We started working with a small group of you and learned a lot. With the VPN in your hands, we confirmed some of our initial hypotheses and identified important priorities for the future. For example, over 70% of early Beta-testers say that the VPN helps them feel empowered, safe and independent while online. In addition, 83% of early Beta-testers found the VPN easy to use.

We know that we are on the right path to building a VPN that makes your online experience safer and easier to manage. We’ll keep making the right decisions for you, guided by our Data Privacy Principles. This means we are actively forgoing revenue streams and additional profit-making opportunities by committing never to track your browsing activity and by avoiding any third-party in-app data analytics platforms.

Your feedback also helped us identify ways to make the VPN more impactful and privacy-centric, which includes building features like split tunneling and making it available on Mac clients. The VPN will exit Beta phase in the next few weeks, move out of the Firefox Private Network brand, and become a stand-alone product, Mozilla VPN, to serve a larger audience.

To our Beta-testers, we would like to thank you for working with us. Your feedback and support made it possible for us to launch Mozilla VPN.

We are working hard to make the official product, Mozilla VPN, available in selected regions this year. For a limited time we will continue to offer Mozilla VPN at the current pricing model, which allows you to protect up to five devices on Windows, Android and iOS at $4.99/month.

As we realize our vision of providing next-generation privacy and security solutions, we invite you to continue sharing your thoughts with us. Follow our journey through this blog, and stay connected via the waitlist here*. If you are interested in learning more about the product, the Mozilla VPN Beta is available for download in the U.S. now.

*For users outside the U.S.: we will only contact you with product updates when Mozilla VPN becomes available for your region and device.

From The Firefox Private Network Mozilla VPN Team

The post Introducing Firefox Private Network’s VPN – Mozilla VPN appeared first on Future Releases.

Categories: Mozilla-nl planet

Mozilla Addons Blog: Friend of Add-ons: Juraj Mäsiar

Mozilla planet - Mon, 15/06/2020 - 17:32

Our newest Friend of Add-ons is Juraj Mäsiar! Juraj is the developer of several extensions for Firefox, including Scroll Anywhere, which is part of our Recommended Extensions program. He is also a frequent contributor on our community forums, where he offers friendly advice and input for extension developers looking for help.

Juraj first started building extensions for Firefox in 2016 during a quiet weekend trip to his hometown. The transition to the WebExtensions API was less than a year away, and developers were starting to discuss their migration plans. After discovering that many of his favorite extensions weren’t going to be ported to the new API, Juraj decided to try the migration process himself and give a few of them a second life. “I was surprised to see it’s just normal JavaScript, HTML and CSS — things I already knew,” he says. “I put some code together and just a few moments later I had a working prototype of my ScrollAnywhere add-on. It was amazing!”

Juraj immersed himself in exploring the WebExtensions API and developing extensions for Firefox. It wasn’t always a smooth process, and he’s eager to share some tips and tricks to make the development experience easier and more efficient. “Split your code to ES6 modules. Share common code between your add-ons — you can use `git submodule` for that. Automate whatever can be automated. If you don’t know how, spend the time learning how to automate it instead of doing it manually,” he advises. Developers can also save energy by not reinventing the wheel. “If you need a build script, use webpack. Don’t build your own DOM handling library. If you need complex UI, use existing libraries like Vue.js.”
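Juraj’s `git submodule` tip for sharing common code between add-ons can be sketched end-to-end. Everything below is a hypothetical demo with invented repository names, run entirely in a throwaway temporary directory, not his actual setup:

```shell
# Hypothetical demo: one "shared-lib" repo reused by an add-on via git submodule.
set -e
demo="$(mktemp -d)"
cd "$demo"

# A repository holding the common code.
git init -q shared-lib
cd shared-lib
echo "export const clamp = (v, lo, hi) => Math.min(hi, Math.max(lo, v));" > utils.js
git add utils.js
git -c user.email=demo@example.com -c user.name=demo commit -qm "shared utils"
cd ..

# An add-on repository that vendors the shared code as a submodule.
git init -q my-addon
cd my-addon
# Recent git blocks local-path submodule URLs by default; allow it for this demo only.
git -c protocol.file.allow=always submodule --quiet add "$demo/shared-lib" vendor/shared
git -c user.email=demo@example.com -c user.name=demo commit -qm "vendor shared code via submodule"

cat vendor/shared/utils.js   # the shared module is now versioned inside the add-on
```

The superproject records a pointer to an exact commit of the shared repository, so each add-on picks up changes to the shared code deliberately (for example with `git submodule update --remote`) rather than implicitly.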

Juraj recommends staying active, saying, “Doing enough sport every day will keep your mind fresh and ready for new challenges.” He stays active by playing VR games and rollerblading.

Currently, Juraj is experimenting with the CryptoAPI and testing it with a new extension that will encrypt user notes and synchronize them with Firefox Sync. The goal is to create a secure extension that can be used to store sensitive material, like a server configuration or a home wifi password.

On behalf of the Add-ons Team, thank you for all of your wonderful contributions to our community, Juraj!

If you are interested in getting involved with the add-ons community, please take a look at our current contribution opportunities.

The post Friend of Add-ons: Juraj Mäsiar appeared first on Mozilla Add-ons Blog.

Categories: Mozilla-nl planet

Christopher Arnold: Money, friction and momentum on the web

Mozilla planet - Sun, 14/06/2020 - 05:21
Back when I first moved to Silicon Valley in 1998, I tried to understand how capital markets made the valley such a unique place for inventors and entrepreneurs. Corporate stocks, real estate, international currency and commodities markets were concepts I was well familiar with from my time working at a financial news service in the nation's capital in the mid-1990s. However, crowdfunding and angel investing were new concepts to me 20 years ago. Crowdfunding platforms seemed to favor the funding recipient more than the balanced, two-sided exchanges of the commercial financial system do. I often wondered what motivated these generosity-driven models, as distinct from reward-driven sponsorships.

When trying to grasp the way angel investors think about entrepreneurship, my friend Willy, a serial entrepreneur and investor, said: “If you want to see something succeed, throw money at it!” The idea behind the "angel" is that they provide the riskiest of risk capital. Angel investors join startup funding before banks and venture capital firms. They seldom get payback in kind from the companies they sponsor and invest in. Angels are the lenders of first resort for founders because they tend to be more generous, more flexible and more forgiving than commercial lenders. They value the potential success of the venture far more than they value the money they put forth. And the contributions of an angel investor can have an outsized benefit in the early stage of an initiative by sustaining the founder or creator at their most vulnerable stage. But what is this essence they get out of it that is worth more than money to them?

Over the course of the last couple of decades I've become a part of the crowdfunding throng of inventors and sponsors.  I have contributed to small business projects on Kiva in over 30 countries, and backed many small-scale projects across Kickstarter, Indiegogo and Appbackr.  I've also been on the receiving side, having the chance to pitch my company for funding on Sand Hill Road, the strip of financial lending firms that populate Palo Alto's hillsides.  As a funder, it has been very enlightening to know that I can be part of someone else's project by chipping in time, sharing insights and capital to get improbable projects off the ground.  And the excitement of following the path of the entrepreneurs has been the greatest reward.  As a founder, I remember framing the potential of a future that, if funded, would yield significant returns to the lenders and shareholders.  Of course, the majority of new ventures do not come to market in the form of their initial invention.  Some of the projects I participated in have launched commercially and I've been able to benefit.  (By getting shares in a growing venture or by getting nifty gadgets and software as part of the pre-release test audience.)  But those things aren't the reward I was seeking when I signed up.  It was the energy of participating in the innovation process and the excitement about making a difference.  After many years of working in the corporate world, I became hooked on the idea of working with engineers and developers who are bringing about the next generation of our expressive/experiential platforms of the web.

During the Augmented World Expo in May, I attended a conference session called "Web Monetization and Social Signaling," hosted by Anselm Hook, a researcher at the web development non-profit Mozilla, where I also work. He made an interesting assertion during his presentation: "Money appears to be a form of communication." His study observed platform-integrated social signals (such as up-voting, re-tweeting and applauding with hand-clapping emojis) used to draw attention to content users had discovered on the web, in this case within the content recommendation platform of the Firefox Reality VR web browser. There are multiple motivations and benefits for this kind of social signaling. It serves as a bookmarking method for the user, it increases the content's visibility to friends who might also like it, it signals affinity with the content as part of one's own identity, and it gives reinforcement to the content or comment provider. Anselm found in his research that participants actually reacted more strongly when they believed their action contributed financial benefit directly to the other participant. Meaning, we don't just want to use emojis to make each other feel good about their web artistry. In some cases, we want to generate profit for the artist or developer directly. Perhaps a gesture of a smiley face or a thumb is adequate to assuage our desire to give big-ups to an artist, and we can feel like our karmic balance book is settled. But what if we want to do more than foist colored pixels on each other? Could the web do more to allow us to financially sustain the artist wizards behind the curtain? Can we "tip" the way we do our favorite street musicians? Not conveniently, because the systems we have now mostly rely on the credit card. But in the offline context, do we interrupt a street busker to ask for their Venmo or PayPal account? We typically use cash, which as yet has only rough analogues in our digital lives.

When I lived in Washington DC, I had the privilege to see the great Qawwali master Nusrat Fateh Ali Khan in concert.  Qawwali is a style of inspired Sufi mystical chant combined with call-and-response singing with a backup ensemble.  Listening for hours as his incantations built from quiet mutterings accompanied by harmonium and slow paced drums to a crescendo of shouts and wails of devotion at the culmination of his songs was very transporting in spite of my dissimilar cultural upbringing and language.  What surprised me, beyond the amazing performance of course, was that as the concert progressed people in the audience would get up, dance and then hurl money at the stage.  "This is supposed to be a devotional setting isn't it?  Hurling cash at the musicians seems so profane," I thought.  But apparently this is something that one does at concerts in Pakistan.  The relinquishing of cash is devotional, like Thai Buddhists offering gold leaf by pressing it into the statues of their teachers and monks.  Money is a form of communication of the ineffable appreciation we feel toward those of greatness in the moment of connection or the moment of realization of our indebtedness.  Buying is a different form of expression that is personal but not expressive.  When we buy, it is disconnected from artistry of the moment.  No lesser appreciation for sure.  It's different because it isn't social signaling, it's coveting.  When in concerts or in real-time scenarios we transmit our bounty upon another, it is an act of making sacrifice and conferring benefit.  The underlying meaning of it may be akin to "I hope you succeed!" or, "I relinquish my having so that you might have."  I'm glossing over the cultural complexity of the gesture surely.  
Japanese verbs have subtle ways to distinguish the transfer and receipt of benefit according to seniority, social position and degree of humility: giving outward or upward is "ageru/sashiageru", giving toward oneself is "kureru/kudasaru", and receiving is "morau/itadaku". The psychological subtlety of the transfer of boons between individuals is scripted deeply within us, all the more accentuating how a plastic card or a piece of paper barely captures the breadth of expression we caring animals have.

The web of yesteryear has done a really good job of covering the coveting use case. Well done, web! Now, what do we build for an encore? How can we emulate the other expressions of human intent that coveting and credit cards don't cover?

In the panic surrounding the current Covid pandemic, I felt disconnected from the community I am usually rooted in. I sought information about those affected internationally in the countries I've visited and lived in, where my friends and favorite artists live. I sought out charitable organizations positioned there and gave them money, as it was the least I could do to reach those impacted by a crisis remote from me. Locally, my network banded together to find ways we could mobilize to help those affected in our community. We found that the metaphor of "gift cards" (a paper coupon) could be used to get cash quickly into the coffers of local businesses so they could meet short-term spending needs, keeping their employees paid and their businesses operational even while their shops were forced to close in the interest of public safety. I found the process very slow and cumbersome: I had to write checks, give out credit card numbers (to places with which I would never typically share sensitive financial data), find email addresses for people so I could send PayPal transfers, and in some cases resort to paper cash for those whom the web could not reach.

This experience made me keenly aware that the systems we have on the web don't replicate the ways we think and express our generosity in the modern world. As web developers, we need to enable a new kind of gesture akin to the act of tipping with cash in offline society. Discussing this with my friend Aneil, he asserted that both anonymous donor platforms like Patreon and blockchain currencies can fit the bill for the donor need, if the recipient is set up to receive them. He cautioned that online transactions are held to a different standard than cash in US society because of “Know Your Customer” regulation, which was put in place to stem the risk of money laundering through anonymous transactions. As we discussed the idea of peer-to-peer transactions in virtual environments, he pointed out, ”The way game companies get around that is to have consumers purchase in-game credits that cannot be converted back into money.” The government is fine with people putting money into something. It's the extraction from the flow of exchange, in the monetary sense, that is subject to the regulations designed for taxation and investment controls.

Patreon, like PayPal, is a cash-value paired system, while virtual currencies such as Bitcoin, BAT and Ethereum can be variable in the exchange value of their coin. Blockchain ledger transactions trace exactly who gave what to whom. So they are, in theory, able to comply with KYC restrictions even in situations where the exchange is relatively anonymous. Yet they differ wildly in how currency holders perceive their value. Aneil pointed out that Bitcoin is bad for online transactions because its scarcity model incentivizes people to hold onto it. It's like gold, a slow currency. A valuable crypto currency would therefore slow down rather than facilitate donation and tipping. You need a currency that people are comfortable holding for only short periods of time, like the funds in a Kiva or Patreon wallet. If people are always withdrawing from the currency for fear of its losing value, then the currency itself isn't stable enough to be the basis of a robust transaction system. For instance, when I was in Zimbabwe, where inflation in the paper currency is incredibly high, people wanted to get rid of it quickly for some other asset that lost value more slowly than the paper notes. Similarly, Aneil pointed out, any coin that you use to transact virtually could suffer the incentive to cash out quickly, which would drive the value of the asset in a fluid marketplace lower. Cash proxies don't have an inherent value unless they are underpinned by an artificial or perceived scarcity mechanism. The US government has an agency, the Federal Reserve, whose mission is to ensure that money depreciates slowly enough that the underlying credit of the government stays stable and encourages growth of its economy. Any other currency system would need the same. Bitcoin can't be it because of its exceedingly high scarcity, which leads to hoarding.
Until web developers solve this friction problem, web transactions, and therefore web authorship, will be starved of the support they need to grow.

Understanding this underlying problem of financial sustainability, my colleague Anselm is working with crypto-currency enabler Coil to try to apply crypto-currency sponsorship to peer and creator/recipient exchanges on the web. He envisions a future where users could casually exchange funds in a virtual, web-based or real-world "augmented reality" transaction without needing to exchange credit card or personal account data. This may sound mundane, because the use case and need are obvious; we're used to doing it with cash every day. The question it raises is: why can't the web do this? Why do I need to exchange credit cards (sensitive data) or email addresses (not sensitive, but not public) if I just want to send credits or tips to somebody? There was an early success for this kind of micropayments model when Anshe Chung became the world's first virtual millionaire by selling virtual goods to Second Life enthusiasts. The Linden Lab virtual platform let users pay money to other peer users inside the virtual environment. With a bit more development collaboration, this kind of model could benefit others outside of specific game environments.

Anselm's speech at AWE introduced the concept of a "tip jar," something we're familiar with from offline life, for the nascent ecosystem of virtual and augmented reality web developers. For most people used to high-end software being sold as apps in a marketplace like iTunes or the Android Play Store, the idea that we would pay web pages may seem peculiar. But it's not too far a leap from how we generally spend our money in society. Leaving tips in cash is common practice for Americans. Even when service fees are not required, Americans tend to tip generously. Lonely Planet dedicates sections of its guidebooks to money customs, and I've typically seen that Americans have a looser idea of tip amounts than people in other countries.

Anselm and the team managing the "Grant for the Web" hope to bring this kind of peer-to-peer mechanism to the broader web around us by using Coil's grant of $100 million in crypto-currency toward achieving this vision.
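The Web Monetization proposal that Grant for the Web builds on identifies recipients by Payment Pointers: short `$`-prefixed strings that resolve to HTTPS URLs. Here is a minimal sketch of that resolution rule as described in the public Payment Pointers spec; this is an illustrative function, not code from Coil or Mozilla:

```python
def resolve_payment_pointer(pointer: str) -> str:
    """Resolve a Payment Pointer like '$wallet.example.com/alice' to an HTTPS URL.

    Per the Payment Pointers spec: replace the leading '$' with 'https://',
    and if the pointer carries no path, default the path to '/.well-known/pay'.
    """
    if not pointer.startswith("$"):
        raise ValueError("payment pointers must start with '$'")
    host_and_path = pointer[1:]
    url = "https://" + host_and_path
    if "/" not in host_and_path:
        url += "/.well-known/pay"
    return url

print(resolve_payment_pointer("$wallet.example.com/alice"))  # → https://wallet.example.com/alice
print(resolve_payment_pointer("$example.com"))               # → https://example.com/.well-known/pay
```

The `$` shorthand is what lets a tip-jar recipient be published as casually as an email address, without exposing any account credentials.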

If you're interested in learning more about the Web Monetization initiative from Coil and Mozilla, please visit: https://www.grantfortheweb.org/

Categorieën: Mozilla-nl planet

Daniel Stenberg: curl meets gold level best practices

Mozilla planet - za, 13/06/2020 - 22:43

About four years ago I announced that curl was 100% compliant with the CII Best Practices criteria. curl was one of the first projects on that train to reach 100%, primarily of course because we were early joiners and participants in the Best Practices project.

The point of that was just to highlight and underscore that we do everything we can in the curl project to act as a responsible open source project and citizen of the larger ecosystem. You should be able to trust curl, in every aspect.

Going above and beyond basic

Subsequently, the Best Practices project added higher levels of compliance: if you want to grade yourself at silver or even gold level, there is a whole series of additional requirements to meet. At the time those were added, I felt they asked for quite a lot of specifics that we didn't provide in the curl project, and with a bit of a sigh I accepted that we would remain at "just" 100% basic compliance, reaching only part of the way toward Silver and Gold. A little disheartening, of course, because I always want curl to be at the top.

So maybe Silver?

I had left the awareness of that entry listing in a dusty corner of my brain and hadn’t considered it much lately, when I noticed the other day that it was announced that the Linux kernel project reached gold level best practice.

That’s a project with around 50 times more developers and commits than curl for an average release (and even a greater multiplier for amount of code) so I’m not suggesting the two projects are comparable in any sense. But it made me remember our entry on CII Best Practices web site.

I came back, updated a few fields that seemed to not be entirely correct anymore and all of a sudden curl quite unexpectedly had a 100% compliance at Silver level!

Further?

If Silver was achievable, what’s actually left for gold?

Sure enough, soon there were only a few remaining criteria left “unmet” and after some focused efforts in the project, we had created the final set of documents with information that were previously missing. When we now finally could fill in links to those docs in the final few entries, project curl found itself also scoring a 100% at gold level.

Best Practices: Gold Level

What does it mean for us? What does it mean for you, our users?

For us, it is a double-check and verification that we're doing the right things, that we are providing the right information in the project, and that we haven't forgotten anything major. We already knew that we were doing everything open source in a pretty good way, but a set of criteria that insisted on a number of things also made us go the extra mile and really provide information for everything in written form, some of which was previously only implied, discussed on IRC, or read between the lines in various pull requests.

I’m proud to lead the curl project and I’m proud of all our maintainers and contributors.

For users, having curl reach gold level makes it visible that we’re that kind of open source project. We’re part of this top clique of projects. We care about every little open source detail and this should instill trust and confidence in our users. You can trust curl. We’re a golden open source project. We’re with you all the way.

The final criterion we checked off

Which was the last criterion of them all for curl to fulfill to reach gold?

The project MUST document its code review requirements, including how code review is conducted, what must be checked, and what is required to be acceptable (link)

This criterion is now fulfilled by the brand new document CODE_REVIEW.md. What’s next?

We’re working on the next release. We always do. Stop the slacking now and get back to work!

Credits

Gold image by Erik Stein from Pixabay

Categorieën: Mozilla-nl planet

Patrick Cloke: Raspberry Pi File Server

Mozilla planet - vr, 12/06/2020 - 22:55

This is just some quick notes (for myself) of how I recently setup my Raspberry Pi as a file server. The goal was to have a shared folder so that a Sonos could play music from it. The data would be backed via a microSD card plugged into USB.

  1. Update …
Categorieën: Mozilla-nl planet

Data@Mozilla: This Week in Glean: Project FOG Update, end of H12020

Mozilla planet - vr, 12/06/2020 - 22:28

(“This Week in Glean” is a series of blog posts that the Glean Team at Mozilla is using to try to communicate better about our work. They could be release notes, documentation, hopes, dreams, or whatever: so long as it is inspired by Glean. You can find an index of all TWiG posts online.)

It’s been a while since last I wrote on Project FOG, so I figure I should update all of you on the progress we’ve made.

A reminder: Project FOG (Firefox on Glean) is the year-long effort to bring the Glean SDK to Firefox. This means answering such varied questions as “Where are the docs going to live?” (here) “How do we update the SDK when we need to?” (this way) “How are tests gonna work?” (with difficulty) and so forth. In a project this long you can expect updates from time-to-time. So where are we?

First, we’ve added the Glean SDK to Firefox Desktop and include it in Firefox Nightly. This is only a partial integration, though, so the only builtin ping it sends is the “deletion-request” ping when the user opts out of data collection in the Preferences. We don’t actually collect any data, so the ping doesn’t do anything, but we’re sending it and soon we’ll have a test ensuring that we keep sending it. So that’s nice.

Second, we’ve written a lot of Design Proposals. The Glean Team and all the other teams our work impacts are widely distributed across a non-trivial fragment of the globe. To work together and not step on each others’ toes we have a culture of putting most things larger than a bugfix into Proposal Documents which we then pass around asynchronously for ideation, feedback, review, and signoff. For something the size and scope of adding a data collection library to Firefox Desktop, we’ve needed more than one. These design proposals are Google Docs for now, but will evolve to in-tree documentation (like this) as the proposals become code. This way the docs live with the code and hopefully remain up-to-date for our users (product developers, data engineers, data scientists, and other data consumers), and are made open to anyone in the community who’s interested in learning how it all works.

Third, we have a Glean SDK Rust API! Sorta. To limit scope creep we haven’t added the Rust API to mozilla/glean and are testing its suitability in FOG itself. This allows us to move a little faster by mixing our IPC implementation directly into the API, at the expense of needing to extract the common foundation later. But when we do extract it, it will be fully-formed and ready for consumers since it’ll already have been serving the demanding needs of FOG.

Fourth, we have tests. This was a bit of a struggle as the build order of Firefox means that any Rust code we write that touches Firefox internals can’t be tested in Rust tests (they must be tested by higher-level integration tests instead). By damming off the Firefox-adjacent pieces of the code we’ve been able to write and run Rust tests of the metrics API after all. Our code coverage is still a little low, but it’s better than it was.

Fifth, we are using Firefox’s own network stack to send pings. In a stroke of good fortune the application-services team (responsible for fan-favourite Firefox features “Sync”, “Send Tab”, and “Firefox Accounts”) was bringing a straightforward Rust networking API called Viaduct to Firefox Desktop almost exactly when we found ourselves in need of one. Plugging into Viaduct was a breeze, and now our “deletion-request” pings can correctly work their way through all the various proxies and protocols to get to Mozilla’s servers.

Sixth, we have firm designs on how to implement both the C++ and JS APIs in Firefox. They won’t be fully-fledged language bindings the way that Kotlin, Python, and Swift are (( they’ll be built atop the Rust language binding so they’re really more like shims )), but they need to have every metric type and every metric instance that a full language binding would have, so it’s no small amount of work.

But where does that leave our data consumers? For now, sadly, there’s little to report on both the input and output sides: We have no way for product engineers to collect data in Firefox Desktop (and no pings to send the data on), and we have no support in the pipeline for receiving data, not that we have any to analyse. These will be coming soon, and when they do we’ll start cautiously reaching out to potential first customers to see whether their needs can be satisfied by the pieces we’ve built so far.

And after that? Well, we need to do some validation work to ensure we’re doing things properly. We need to implement the designs we proposed. We need to establish how tasks accomplished in Telemetry can now be accomplished in the Glean SDK. We need to start building and shipping FOG and the Glean SDK beyond Nightly to Beta and Release. We need to implement the builtin Glean SDK pings. We need to document the designs so others can understand them, best practices so our users can follow them, APIs so engineers can use them, test guarantees so QA can validate them, and grand processes for migration from Telemetry to Glean so that organizations can start roadmapping their conversions.

In short: plenty has been done, and there’s still plenty to do.

I guess we’d better be about it, then.

:chutten

(( this is a syndicated copy of the original post ))

Categorieën: Mozilla-nl planet


Daniel Stenberg: 800 authors and counting

Mozilla planet - vr, 12/06/2020 - 17:24

Today marks the day when we merged the commit authored by the 800th person in the curl project.

We turned 22 years ago this spring but it really wasn’t until 2010 when we switched to git when we started to properly keep track of every single author in the project. Since then we’ve seen a lot of new authors and a lot of new code.

The “explosion” is clearly visible in this graph generated with fresh data just this morning (while we were still just 799 authors). See how we’ve grown maybe 250 authors since 1 Jan 2018.

Author number 800 is named Nicolas Sterchele and he submitted an update of the TODO document. Appreciated!

As the graph above also shows, a majority of all authors only ever authored a single commit. If you did 10 commits in the curl project, you reach position #61 among all the committers while 100 commits takes you all the way up to position #13.
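Those positions come straight from ranking authors by descending commit count. A rough sketch of the calculation, using made-up names and numbers rather than real curl data (the real input would be something like `git shortlog -s -n` output):

```python
from collections import Counter

# Hypothetical (author -> commit count) data standing in for `git shortlog -s -n` output.
commits = Counter({"alice": 4000, "bob": 100, "carol": 10, "dave": 1, "erin": 1})

def position(counts: Counter, author: str) -> int:
    """1-based rank of an author when sorted by descending commit count."""
    ranked = sorted(counts, key=lambda a: counts[a], reverse=True)
    return ranked.index(author) + 1

print(position(commits, "alice"))  # → 1
print(position(commits, "carol"))  # → 3
```

With the real data, the same ranking puts a 10-commit author at #61 and a 100-commit author at #13, which shows how steeply the long tail of single-commit authors drops off.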

Become one!

If you too want to become one of the cool authors of curl, a fine starting point for that journey could be the Help Us document. If that’s not enough, you’re also welcome to contact me privately or maybe join the IRC channel for some socializing and “group mentoring”.

If we keep this up, we could reach 1,000 authors in 2022…

Categorieën: Mozilla-nl planet

Cameron Kaiser: TenFourFox FPR23 for Intel available

Mozilla planet - vr, 12/06/2020 - 06:20
Ken Cunningham figured out the build issues he was having with the Intel version and has updated TenFourFox for Intel systems to FPR23, now up to date with the Power Mac version. As always, there is no support for any Intel build of TenFourFox; do not report issues to Tenderapp. You can get it from SourceForge.

Ken's patches have also been incorporated into the tree, along with a workaround submitted by Raphaël Guay to deal with Twitch overflowing our JIT stack. This is probably due to something we don't support causing infinite function call recursion, since with the JIT disabled it correctly just runs out of stack and stops. There is no way to increase the stack further, since we are strictly 32-bit builds and the stack already consumes 1GB of our roughly 2.2GB available, so we need to (a) figure out why the stack overflow happens without being detected and (b) temporarily disable that script until we do. Part (b) is implemented as a second blacklist, enabled by default since other sites may do the same thing, until we find a better solution to part (a). This will be in FPR24, along with probably some work on MP3 compliance issues, since TenFourFox gets used as a simple little Internet radio a lot more than I realized, and a few other odds and ends.
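The "runs out of stack and stops" behaviour is the detected, well-behaved case. As a loose illustration of the difference (in Python rather than SpiderMonkey's JIT, and entirely unrelated to TenFourFox's actual code), an interpreter with a recursion guard notices runaway recursion and raises a catchable error instead of crashing:

```python
import sys

def runaway(depth: int = 0) -> int:
    # Infinite self-recursion, analogous to a script that never stops calling itself.
    return runaway(depth + 1)

def call_with_guard() -> str:
    """Run the runaway function under the interpreter's recursion guard."""
    sys.setrecursionlimit(1000)  # keep the demo fast
    try:
        runaway()
    except RecursionError:
        # The detected case: the runtime notices the overflow and stops cleanly,
        # like TenFourFox with the JIT disabled.
        return "stopped cleanly"
    return "never reached"

print(call_with_guard())  # → stopped cleanly
```

The JIT bug is the opposite situation: the overflow escapes the guard, which is why a per-site blacklist is needed until the root cause in part (a) is understood.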

In case you missed it, I am now posting content I used to post here as "And now for something completely different" over on a new separate blog christened Old Vintage Computing Research, or my Old VCR (previous posts will remain here indefinitely). Although it will necessarily have Power Mac content, it will also cover some of my other beloved older systems all in one place. Check out properly putting your old Mac to Terminal sleep (and waking it back up again), along with screenshots of the unscreenshotable, including grabs off the biggest computer Apple ever made, the Apple Network Server. REWIND a bit and PLAY.

Categorieën: Mozilla-nl planet

Mozilla Addons Blog: Recommended extensions — recent additions

Mozilla planet - do, 11/06/2020 - 22:10

When the Recommended Extensions program debuted last year, it listed about 60 extensions. Today the program has grown to just over a hundred as we continue to evaluate new nominations and carefully grow the list. The curated collection grows slowly because one of the program’s goals is to cultivate a fairly fixed list of content so users can feel confident the Recommended extensions they install will be monitored for safety and security for the foreseeable future.

Here are some of the more exciting recent additions to the program…

DuckDuckGo Privacy Essentials provides a slew of great privacy features, like advanced ad tracker and search protection, encryption enforcement, and more.

Read Aloud: Text to Speech converts any web page text (even PDF’s) to audio. This can be a very useful extension for everyone from folks with eyesight or reading issues to someone who just wants their web content narrated to them while their eyes roam elsewhere.

SponsorBlock for YouTube is one of the more original content blockers we’ve seen in a while. Leveraging crowdsourced data, the extension skips those interruptive sponsored content segments of YouTube clips, addressing the nuisance of this newer, more intrusive type of video advertising.

Metastream Remote has been extremely valuable to many of us during pandemic related home confinement. It allows you to host streaming video watch parties with friends. Metastream will work with any video streaming platform, so long as the video has a URL (in the case of paid platforms like Netflix, Hulu, or Disney+, they too will work provided all watch party participants have their own accounts).

Cookie AutoDelete summarizes its utility right in the title. This simple but powerful extension will automatically delete your cookies from closed tabs. Customization features include whitelist support and informative visibility into the number of cookies used on any given site.

AdGuard AdBlocker is a popular and highly respected content blocker that works to block all ads—banner, video, pop-ups, text ads—all of it. You may also notice the nice side benefit of faster page loads, since AdGuard prohibits so much content you didn’t want anyway.

If you’re the creator of an extension you feel would make a strong candidate for the Recommended program, or even if you’re just a huge fan of an extension you think merits consideration, please submit nominations to amo-featured [at] mozilla [dot] org. Due to the high volume of submissions we receive, please understand we’re unable to respond to every inquiry.

The post Recommended extensions — recent additions appeared first on Mozilla Add-ons Blog.

Categorieën: Mozilla-nl planet

Hacks.Mozilla.Org: Introducing the MDN Web Docs Front-end developer learning pathway

Mozilla planet - do, 11/06/2020 - 18:01

The MDN Web Docs Learning Area (LA) was first launched in 2015, with the aim of providing a useful counterpart to the regular MDN reference and guide material. MDN had traditionally been aimed at web professionals, but we were getting regular feedback that a lot of our audience found MDN too difficult to understand, and that it lacked coverage of basic topics.

Fast forward 5 years, and the Learning Area material is well-received. It boasts around 3.5–4 million page views per month; a little under 10% of MDN Web Docs’ monthly web traffic.

At this point, the Learning Area does its job pretty well. A lot of people use it to study client-side web technologies, and its loosely-structured, unopinionated, modular nature makes it easy to pick and choose subjects at your own pace. Teachers like it because it is easy to include in their own courses.

However, at the beginning of the year, this area had two shortcomings that we wanted to improve upon:

  1. We’d gotten significant feedback that our users wanted a more opinionated, structured approach to learning web development.
  2. We didn’t include any information on client-side tooling, such as JavaScript frameworks, transformation tools, and deployment tools widely used in the web developer’s workplace.

To remedy these issues, we created the Front-end developer learning pathway (FED learning pathway).

Structured learning

Take a look at the Front-end developer pathway linked above  — you’ll see that it provides a clear structure for learning front-end web development. This is our opinion on how you should get started if you want to become a front-end developer. For example, you should really learn vanilla HTML, CSS, and JavaScript before jumping into frameworks and other such tooling. Accessibility should be front and center in all you do. (All Learning Area sections try to follow accessibility best practices as much as possible).

While the included content isn’t completely exhaustive, it delivers the essentials you need, along with the confidence to look up other information on your own.

The pathway starts by clearly stating the subjects taught, prerequisite knowledge, and where to get help. After that, we provide some useful background reading on how to set up a minimal coding environment. This will allow you to work through all the examples you’ll encounter. We explain what web standards are and how web technologies work together, as well as how to learn and get help effectively.

The bulk of the pathway is dedicated to detailed guides covering:

  • HTML
  • CSS
  • JavaScript
  • Web forms
  • Testing and accessibility
  • Modern client-side tooling (which includes client-side JavaScript frameworks)

Throughout the pathway we aim to provide clear direction — where you are now, what you are learning next, and why. We offer enough assessments to provide you with a challenge, and an acknowledgement that you are ready to go on to the next section.

Tooling

MDN’s aim is to document native web technologies — those supported in browsers. We don’t tend to document tooling built on top of native web technologies because:

  • The creators of that tooling tend to produce their own documentation resources.  To repeat such content would be a waste of effort, and confusing for the community.
  • Libraries and frameworks tend to change much more often than native web technologies. Keeping the documentation up to date would require a lot of effort. Alas, we don’t have the bandwidth to perform regular large-scale testing and updates.
  • MDN is seen as a neutral documentation provider. Documenting tooling is seen by many as a departure from neutrality, especially for tooling created by major players such as Facebook or Google.

Therefore, it came as a surprise to some that we were looking to document such tooling. So why did we do it? Well, the word here is pragmatism. We want to provide the information people need to build sites and apps on the web. Client-side frameworks and other tools are an unmistakable part of that. It would look foolish to leave out that entire part of the ecosystem. So we opted to provide coverage of a subset of tooling “essentials” — enough information to understand the tools, and use them at a basic level. We aim to provide the confidence to look up more advanced information on your own.

New Tools and testing modules

In the Tools and testing Learning Area topic, we’ve provided the following new modules:

  1. Understanding client-side web development tools: An introduction to the different types of client-side tools that are available, how to use the command line to install and use tools. This section delivers a crash course in package managers. It includes a walkthrough of how to set up and use a typical toolchain, from enhancing your code writing experience to deploying your app.
  2. Understanding client-side JavaScript frameworks: A useful grounding in client-side frameworks, in which we aim to answer questions such as “why use a framework?”, “what problems do they solve?”, and “how do they relate to vanilla JavaScript?” We give the reader a basic tutorial series in some of the most popular frameworks. At the time of writing, this includes React, Ember, and Vue.
  3. Git and GitHub: Using links to GitHub’s guides, we’ve assembled a quickfire guide to Git and GitHub basics, with the intention of writing our own set of guides sometime later on.
Further work

The intention is not just to stop here and call the FED learning pathway done. We are always interested in improving our material to keep it up to date and make it as useful as possible to aspiring developers. And we are interested in expanding our coverage, if that is what our audience wants. For example, our frameworks tutorials are fairly generic to begin with, to allow us to use them as a test bed, while providing some immediate value to readers.

 

We don’t want to just copy the material provided by tooling vendors, for reasons given above. Instead we want to listen, to find out what the biggest pain points are in learning front-end web development. We’d like to see where you need more coverage, and expand our material to suit. We would like to cover more client-side JavaScript frameworks (we have already got a Svelte tutorial on the way), provide deeper coverage of other tool types (such as transformation tools, testing frameworks, and static site generators), and other things besides.

Your feedback please!

To enable us to make more intelligent choices, we would love your help. If you’ve got a strong idea about tools or web technologies we should cover on MDN Web Docs, or you think some existing learning material needs improvement, please let us know the details! The best ways to do this are:

  1. Leave a comment on this article.
  2. Fill in our questionnaire (it should only take 5–10 minutes).

So that draws us to a close. Thank you for reading, and for any feedback you choose to share.

We will use it to help improve our education resources, helping the next generation of web devs learn the skills they need to create a better web of tomorrow.

The post Introducing the MDN Web Docs Front-end developer learning pathway appeared first on Mozilla Hacks - the Web developer blog.

Categorieën: Mozilla-nl planet

Mozilla Addons Blog: Improvements to Statistics Processing on AMO

Mozilla planet - wo, 10/06/2020 - 21:22

We’re revamping the statistics we make available to add-on developers on addons.mozilla.org (AMO).

These stats are aggregated from add-on update logs and don’t include any personally identifiable user data. They give developers information about user adoption, general demographics, and other insights that might help them make changes and improvements.

The current system is costly to run, and glitches in the data have been a long-standing recurring issue. We are addressing these issues by changing the data source, which will improve reliability and reduce processing costs.

Usage Statistics

Until now, add-on usage statistics have been based on add-on updates. Firefox checks AMO daily for updates for add-ons that are hosted there (self-distributed add-ons generally check for updates on a server specified by the developer). The server logs for these update requests are aggregated and used to calculate the user counts shown on add-on pages on AMO. They also power a statistics dashboard for developers that breaks down the usage data by language, platform, application, etc.

Stats dashboard example

Stats dashboard showing new version adoption for uBlock Origin

In a few weeks, we will stop using the daily pings as the data source for usage statistics. The new statistics will be based on Firefox telemetry data. As with the current stats, all data is aggregated and no personally identifiable user data is shared with developers.

The data shown on AMO and shared with developers will be essentially the same, but the move to telemetry means that the numbers will change a little. Firefox users can opt out of sending telemetry data, and the way they are counted is different. Our current stats system counts distinct users by IP address, while telemetry uses a per-profile ID. For most add-ons you should expect usage totals to be lower, but usage trends and fluctuations should be nearly identical.
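The two dedup keys can produce different totals for the same real population. A toy sketch with hypothetical, simplified records (not AMO's real log schema or pipeline): two profiles behind one NAT collapse to one user under IP counting but remain two under per-profile counting, while telemetry opt-outs pull the profile-based number down in the other direction.

```python
# Hypothetical, simplified records; AMO's real pipelines are far more involved.
update_logs = [
    {"ip": "203.0.113.7"},    # two profiles sharing one NAT'd address...
    {"ip": "203.0.113.7"},    # ...appear as a single IP
    {"ip": "198.51.100.2"},
]
telemetry_pings = [
    {"client_id": "profile-a"},
    {"client_id": "profile-b"},
    {"client_id": "profile-c"},
]

def users_by_ip(logs: list) -> int:
    """Old method: one user per distinct IP address in the update logs."""
    return len({rec["ip"] for rec in logs})

def users_by_profile(pings: list) -> int:
    """New method: one user per distinct telemetry profile ID."""
    return len({rec["client_id"] for rec in pings})

print(users_by_ip(update_logs))         # → 2
print(users_by_profile(telemetry_pings))  # → 3
```

Which effect dominates varies by add-on, which is why totals shift while trends and fluctuations stay nearly identical.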

Telemetry data will enable us to show data for add-on versions that are not listed on AMO, so all developers will now be able to analyze their add-on usage stats, regardless of how the add-on is distributed. This also means some add-ons will have higher usage numbers, since the average will be calculated including both AMO-hosted and self-hosted versions.

Other changes that will happen due to this update:

  • The dashboards will only show data for enabled installs. There won’t be a breakdown of usage by add-on status anymore.
  • A breakdown of usage by country will be added.
  • Usage data for our current Firefox for Android browser (also known as Fennec) isn’t included. We’re working on adding data for our next mobile browser (Fenix), currently in development.
  • It won’t be possible to make your statistics dashboard publicly available anymore. Dashboards will only be accessible to add-on developers and admins, starting on June 11. If you are a member of a team that maintains an add-on and you need to access its stats dashboard, please ask your team to add you as an author in the Manage Authors & License page on AMO. The Listed property can be checked off so you don’t show up in the add-on’s public listing page.

We will begin gradually rolling out the new dashboard on June 11. During the rollout, a fraction of add-on dashboards will default to show the new data, but they will also have a link to access the old data. We expect to complete the rollout and discontinue the old dashboards on July 9. If you want to export any of your old stats, make sure you do it before then.

Download Statistics

We plan to make a similar overhaul to download statistics in the coming months. For now they will remain the same. You should expect an announcement around August, when we are closer to switching over to the new download data.

The post Improvements to Statistics Processing on AMO appeared first on Mozilla Add-ons Blog.

Categorieën: Mozilla-nl planet

Pagina's