Mozilla’s DNS-over-HTTPS makes surfing safer and improves performance

Internet group brands Mozilla ‘internet villain’ for supporting DNS privacy feature

An industry group of internet service providers has branded Firefox browser maker Mozilla an “internet villain” for supporting a DNS security standard.

The U.K.’s Internet Services Providers’ Association (ISPA), the trade group for U.K. internet service providers, nominated the browser maker for its proposed effort to roll out the security feature, which they say will allow users to “bypass UK filtering obligations and parental controls, undermining internet safety standards in the UK.”

Mozilla said late last year that it was planning to test DNS-over-HTTPS with a small number of users.

Whenever you visit a website — even if it’s HTTPS enabled — the DNS query that converts the web address into an IP address that computers can read is usually unencrypted. The security standard is implemented at the app level, making Firefox the first browser to support DNS-over-HTTPS.

Encrypting the DNS query also protects the request against man-in-the-middle attacks, in which attackers hijack the lookup and point victims to a malicious page instead.

DNS-over-HTTPS also improves performance, making DNS queries — and the overall browsing experience — faster.

But the ISPA doesn’t think DNS-over-HTTPS is compatible with the U.K.’s current website blocking regime.

Under U.K. law, websites can be blocked for facilitating copyright or trademark infringement, or if they are deemed to contain terrorist material or child abuse imagery. Encrypting DNS queries, it’s claimed, will make it more difficult for internet providers to filter their subscribers’ internet access.

The ISPA isn’t alone. U.K. spy agency GCHQ and the Internet Watch Foundation, which maintains the U.K.’s internet blocklist, have criticized the move to roll out encrypted DNS features to the browser.

The ISPA’s nomination quickly drew ire from the security community. Amid a backlash on social media, the ISPA doubled down on its position, saying that bringing in DNS-over-HTTPS by default “would be harmful for online safety, cybersecurity and consumer choice,” but that it encourages “further debate.”

One internet provider, Andrews & Arnold, donated £2,940 — around $3,670 — to Mozilla in support of the nonprofit. “The amount was chosen because that is what our fee for ISPA membership would have been, were we a member,” said a tweet from the company.

Mozilla spokesperson Justin O’Kelly told TechCrunch: “We’re surprised and disappointed that an industry association for ISPs decided to misrepresent an improvement to decades old internet infrastructure.”

“Despite claims to the contrary, a more private DNS would not prevent the use of content filtering or parental controls in the UK. DNS-over-HTTPS (DoH) would offer real security benefits to UK citizens. Our goal is to build a more secure internet, and we continue to have a serious, constructive conversation with credible stakeholders in the UK about how to do that,” he said.

“We have no current plans to enable DNS-over-HTTPS by default in the U.K. However, we are currently exploring potential DNS-over-HTTPS partners in Europe to bring this important security feature to other Europeans more broadly,” he added.

Mozilla isn’t the first to roll out DNS-over-HTTPS. Last year Cloudflare released a mobile version of its privacy-focused 1.1.1.1 DNS service that includes DNS-over-HTTPS. Months earlier, Google-owned Jigsaw released its censorship-busting app Intra, which aimed to prevent DNS manipulation.

Mozilla has yet to set a date for the full release of DNS-over-HTTPS in Firefox.

***

https://www.heise.de/select/ct/2018/14/1530492966691096
Since Firefox 60, the Mozilla browser has supported DNS over HTTPS.
A few quick steps turn it on.

 

***

How To Enable DNS-over-HTTPS on Firefox

Traditionally, DNS queries and responses are sent over the internet without encryption. This can lead to tracking and spoofing vulnerabilities that put users’ data at risk.

There are many servers between your computer and the DNS server. Information traveling through these servers, called on-path routers, can be tracked and used to build a profile of you, including a record of every website you look up. That data is valuable and can be sold to companies with deep pockets.

What’s worse than tracking is spoofing. If any of these servers acts as a malicious man in the middle, it can spoof the response and hand you the wrong address, sending you to a site that could steal your credentials instead.
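
To illustrate why classic DNS is so easy to observe, here is a minimal Python sketch (standard library only; the helper name is ours) that hand-builds a DNS query and sends it unencrypted over UDP port 53. The hostname travels in cleartext past every on-path router.

```python
import socket
import struct

def plain_dns_query(hostname: str, resolver: str = "8.8.8.8") -> bytes:
    """Send a minimal, unencrypted DNS A-record query over UDP port 53."""
    # Header: transaction id, flags (recursion desired), 1 question, 0 answer/authority/additional.
    header = struct.pack(">HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Question: length-prefixed labels, terminated by a zero byte, then type A (1), class IN (1).
    qname = b"".join(bytes([len(label)]) + label.encode() for label in hostname.split(".")) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(3)
        sock.sendto(header + question, (resolver, 53))
        response, _ = sock.recvfrom(512)
    return response  # raw answer; the query itself was visible to every hop along the way
```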

So, what’s the solution?

For starters, make sure you are using a good, reliable DNS server as the resolver, such as Google’s Public DNS or Cloudflare’s fast, privacy-minded 1.1.1.1.

But that alone wouldn’t solve the problem of being tracked and potentially spoofed. You need to encrypt the data before handing it over to the DNS server. The answer to that is DNS-over-HTTPS.

However, browser support for this new protocol is only just arriving. Mozilla, for example, has started experimenting with the feature in its Firefox browser.

Manually configure DoH on Firefox

  1. Type about:config in the address bar in Firefox and press Enter.
  2. Type “network.trr” in the search box to narrow down the items.
  3. Change network.trr.mode to 2, and enter the DoH URL into network.trr.uri

There are two DoH-compliant endpoints available to use right now.
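
To check a DoH endpoint from outside the browser, the sketch below (Python, using the third-party requests library; Cloudflare’s public resolver URL is used here purely as an example) sends the same kind of question inside an ordinary encrypted HTTPS request, via the JSON wire format.

```python
import requests

def doh_query(hostname: str, resolver: str = "https://cloudflare-dns.com/dns-query") -> list:
    """Resolve an A record over DNS-over-HTTPS using the JSON wire format."""
    response = requests.get(
        resolver,
        params={"name": hostname, "type": "A"},
        headers={"accept": "application/dns-json"},
        timeout=5,
    )
    response.raise_for_status()
    # The answer section contains the resolved addresses; on-path routers only see TLS traffic.
    return [answer["data"] for answer in response.json().get("Answer", [])]

print(doh_query("mozilla.org"))
```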


Cyberweapons agency (ADIC): SPD puts the brakes on von der Leyen’s prestige project

Von der Leyen’s consultant affair
“Failure advances your career”

Von der Leyen wants to go to Brussels, but her past could catch up with her. Many members of the parliamentary committee of inquiry into the consultant affair insist that she testify: “After all, the European public should see whom the chancellor has sent to Brussels.”


If the EU Parliament approves, von der Leyen could become the new Commission president.
(Photo: dpa)

When the first signals arrived in Berlin that Ursula von der Leyen was to become EU Commission president, there was incredulous astonishment among members of the committee of inquiry investigating the consultant affair. There had been rumors that the defense minister might go to Brussels. But straight to the EU’s top post? That did come as a surprise. For Matthias Höhn, who sits on the committee for the Left Party, one thing is clear: “After Ms. Suder, the exit is now being prepared for the woman at the very top as well. Citizens watch and marvel: in politics, failure advances a career instead of ending it.” It is now becoming apparent, he says, that in the end it all comes down to one’s personal network.

With this, Höhn is pointing on the one hand to the relationship of trust between Chancellor Angela Merkel and von der Leyen, and on the other to the close, at least professional, bond between the minister and her former state secretary Katrin Suder. The Christian Democrat had brought Suder into her staff from McKinsey. The two set about reforming defense procurement, with some initial successes, as even the opposition concedes. Suder left the ministry in April 2018 “at her own request.” Since last summer she has headed the federal government’s Digital Council. In September the consultant affair blew up and battered Suder’s until then spotless reputation. Whether she and von der Leyen bear individual, even criminally relevant, guilt beyond political responsibility is something the committee has so far been unable to establish.

Merkel cleverly engineered von der Leyen’s departure, says FDP member of parliament Alexander Müller. “From the chancellor’s point of view the move makes sense: she is rid of a minister who causes problems domestically, and she can take credit for making a woman Commission president.” It does not square with his understanding of democracy, he says, if only because von der Leyen is entangled in several scandals.

According to the planning so far, von der Leyen and Suder were to be questioned as witnesses before the inquiry panel in December. “Now the question is whether we streamline the committee and no longer hear some witnesses in order to finish faster,” says Müller. What has been established so far, he says, is enough to form a “clear picture.” Von der Leyen will definitely be questioned, even if she goes to Brussels. “It is possible, however, that we will not hear her on December 12 as planned, but earlier. After all, the European public should see whom the chancellor has sent to Brussels.”

“It is about clearing up a scandal”

“Right now I have no tendency to shorten things,” says Green politician Tobias Lindner, which means he is still thinking it over. “Which witness should we leave out?” The impression must not arise, he says, that the sole aim was to hold the minister accountable or to damage her politically. “It is about much more, namely clearing up a scandal and finding out what has to happen to improve the awarding of contracts.” The decision also depends on how von der Leyen’s successor behaves, on how much will and courage he has to clear up the affair and to discipline officials for any failures.

The opposition politicians expect international attention for the consultant affair should von der Leyen move to Brussels. “The significance of the investigation will grow further if a sitting EU Commission president bears political responsibility for the shady dealings in a ministry,” says Lindner. Media interest is likely to increase markedly, well beyond Germany’s borders, especially since von der Leyen would have to explain to the committee “what she knew and what she did not.” Lindner asks wryly: “When do you ever get a top EU figure in the witness chair?”

AfD Bundestag member Rüdiger Lucassen expects the opposite. The witness hearings, he says, have shown that von der Leyen and Suder are responsible for the affair. With her “promotion out of harm’s way,” the minister can presumably “escape the final confrontation with her failure,” he says. “With that, the committee of inquiry loses force and significance.” What remains is the “bad taste” of a culture of mutual favoritism in the defense ministry in the awarding of contracts worth millions through a “buddy system” involving top officials and generals. “Her direct responsibility for this policy failure in the German government will precede her to Brussels as a writing on the wall.”

“Taxpayers should shudder”

Under von der Leyen’s responsibility, millions in taxpayer money were squandered on dubious consulting contracts and the refit of the “Gorch Fock,” Left Party politician Höhn also argues. Now even larger investment sums for NATO rearmament are on the horizon in the EU. “Taxpayers should shudder.”

What do the CDU and CSU say to all this? n-tv.de waited two days for a statement, in vain. The office of defense policy spokesman Henning Otte merely said that he could not provide a statement “in the near term.”

Dennis Rohde of coalition partner SPD denies that the minister is qualified for the EU post. “I consider Ms. von der Leyen unsuitable,” says the committee member. She is leaving the ministry “in a very questionable state.” He rejects shortening the investigation. For the SPD, “in contrast to the FDP,” it is not just about particular individuals but about structures and the question of what needs to be improved. His determination to get to the bottom of things is unbroken. “Ms. von der Leyen will then simply be summoned not as a minister but as Commission president,” says Rohde.

from: https://www.n-tv.de/politik/Versagen-befoerdert-die-Karriere-article21128226.html

***

Von der Leyen’s unfinished cyber agency

By: Doris Pundy | EURACTIV.de Jul 3, 2019

Ursula von der Leyen’s prestige project, the „Agency for Innovation in Cybersecurity“, is faltering. The funding is missing and the Federal Court of Auditors is critical. Should von der Leyen become EU Commission president, her successor will have to deal with it.

While Ursula von der Leyen was meeting members of the European Parliament in Strasbourg on Wednesday, July 3, she missed another appointment in Germany. Together with German interior minister Horst Seehofer, she was supposed to sign a declaration of intent in Leipzig on founding a cyber agency. The event went ahead without von der Leyen. In the end it was signed by Seehofer and the state premiers Reiner Haseloff of Saxony-Anhalt and Michael Kretschmer of Saxony, all of them CDU/CSU politicians.

The founding of the so-called „Agency for Innovation in Cybersecurity“ was decided by the federal government back in the summer of 2018. The cyber agency will be run jointly by the interior and defense ministries. Its goal, according to the federal government’s website, is to protect Germany’s digital infrastructure against hacker attacks by „ensuring technological innovation leadership“.

To that end, innovations from the private sector that are relevant to internal and external security are to be funded, above all those „characterized by radical technological novelty and thus capable of having a market-changing effect“. The model for the new cyber agency comes from the USA: DARPA, the research agency of the US Department of Defense, which has existed since the 1950s and laid the technological foundations for the internet and the GPS satellite navigation system.

The German agency has been in trouble ever since it was approved. According to the official timetable it should already have been founded in the spring. Initial idea competitions and research contracts are still planned for 2019. On Wednesday the location of the future agency was finally fixed: Leipzig/Halle airport in Saxony. Up to 100 jobs are planned there. When that will happen remains open. The agency’s funding is still outstanding.

The cyber agency, which is to be set up as a GmbH (a private limited company), is expected to cost 152.5 million euros through 2022. According to SPIEGEL reporting, however, the SPD is resisting a private legal form without parliamentary oversight. At the Bundestag budget committee’s last session before the summer break, the cyber agency’s funding was taken off the list of decisions.

from: BERLINER MORGENPOST | SATURDAY, JULY 6, 2019

The coalition partner is not the project’s only critic. The Federal Court of Auditors (Bundesrechnungshof) raised serious objections to the planned agency, as netzpolitik.org reports. For one thing, the court points to the risk of duplicate state funding, since comparable institutions already exist. The Bundeswehr, for example, already has a Cyber Innovation Hub for „disruptive innovations and digital transformation“. According to the court’s report, there are five more similar organizations across the country.

For another, the Federal Court of Auditors doubts whether the cyber agency will find enough qualified staff, according to netzpolitik.org. Demand for IT specialists is high, and salaries in federal institutions are often lower than in the private sector. The court calls the cyber agency’s staffing plans „ambitious“.

Despite the criticism, the defense ministry is sticking with the agency and hopes for a decision in the budget committee after the summer break. The agency could then still be founded this year, a spokesperson told Euractiv.

from: https://www.euractiv.de/section/digitale-agenda/news/von-der-leyens-unfertige-cyberagentur/

 

ADIC “Cyber Agentur” location: Leipzig/Halle Airport (LEJ), Schkeuditz

Military use: Under the SALIS project, NATO has used Leipzig/Halle airport since March 23, 2006 as the home base of two Russian Antonov An-124 transport aircraft, for the rapid transport of oversized cargo. Two further aircraft are available within six days and another two within nine days, so that a total of six aircraft are available under SALIS. These six An-124s stand ready to provide strategic military airlift capacity for the armed forces, but also for humanitarian missions and relief operations such as the earthquake aid for China in June 2008. According to the federal defense ministry, using Leipzig/Halle airport as a site for loading, unloading and transshipment will be the exception.

Since January 17, 2007, Ruslan SALIS GmbH has owned a maintenance building in the southern area of the airport for the aircraft stationed there.

Since May 23, 2006, the US Army has also routed passenger flights for regular troop rotations in Iraq and Afghanistan through Leipzig/Halle airport. In the non-public Terminal A, up to 80 troop transport flights with roughly 1,600 soldiers per day were handled each month. By early 2009, some 450,000 soldiers are said to have been flown to combat deployments via Leipzig/Halle; in 2009, every fourth passenger is said to have been a US soldier. The contracted charter airlines Miami Air International and Omni Air International (until early 2008 ATA Airlines, until 2013 Ryan International Airlines, and until 2014 World Airways and North American Airlines) have their aircraft refueled at Leipzig/Halle and carry out crew changes there. With the withdrawal of US troops from crisis regions in the Middle East, however, the number of transit passengers has been declining since 2010.

from: Wikipedia https://de.wikipedia.org/wiki/Flughafen_Leipzig/Halle

***

Agency for cyberweapons

SPD puts the brakes on von der Leyen’s prestige project

With a modern agency for cyber technologies, Ursula von der Leyen wants to lead her forces into the digital future. According to SPIEGEL reporting, however, the SPD denied her the funding at the last minute.

https://www.spiegel.de/plus/berateraffaere-wie-accenture-an-millionenauftraege-kam-a-00000000-0002-0001-0000-000162036087

Thursday, June 27, 2019

According to SPIEGEL reporting, the SPD has for now halted the defense ministry’s plans for an agency to develop cyberweapons for the armed forces. Together with the interior ministry, defense minister Ursula von der Leyen (CDU) wants to set up an in-house company that would screen digital technologies from the private sector for their military potential and select and fund suitable projects for the Bundeswehr’s cyber units.

Much like the US military research agency “Darpa” [budget roughly $3.5 billion], the agency is meant to promote the development of so-called disruptive cyberweapons. To that end, the defense ministry this week presented the budget committee with a budget of 152.5 million euros for the years through 2022 [roughly €50 million per year].

At the last minute, however, the SPD took the proposal off the list of decisions. The party does not fundamentally reject the agency, which was agreed in the coalition treaty. But the coalition partner objects to the agency being founded as a GmbH, and for that reason the SPD withheld its approval.

“We reject a private legal form without parliamentary oversight as a matter of principle, and all the more so after the experiences with private consulting firms in the defense sector,” SPD defense expert Fritz Felgentreu told SPIEGEL. In the consultant affair, irregularities in the awarding of contracts and budgets had been uncovered at several of the Bundeswehr’s spun-off in-house companies, above all at the IT service provider BWI.

The SPD demands that the structure of the cyber agency be reworked. That is likely to delay the minister’s prestige project considerably. The agency is to be located in the Leipzig area.

Asked about the matter, the ministry called the situation regrettable. It said it would look for a compromise that gives parliament more of a say.

from: https://www.spiegel.de/politik/deutschland/spd-bremst-ursula-von-der-leyens-agentur-fuer-cyberwaffen-aus-a-1274702.html

***

Consultant affair in the defense ministry

Senior official tried to destroy incriminating files

In the consultant affair, Ursula von der Leyen’s ministry has to admit new irregularities. According to SPIEGEL reporting, a senior official tried to remove files containing doctored consultant invoices.

Thursday, May 9, 2019

The consultant affair in the defense ministry has gained a remarkable new chapter. According to SPIEGEL reporting, a senior official has now admitted attempting to destroy incriminating material against himself. The matter concerns false and deliberately doctored invoices from several consulting firms, which the official, a Regierungsdirektor, had signed off as factually correct for his then department.

On Thursday, the head of the legal department informed the members of the committee of inquiry about the delicate matter. According to that briefing, formal disciplinary proceedings have been opened against the official, who previously worked in the Cyber and Information Technology (CIT) department. He is suspected of having tried to destroy the files in February 2019 in order to cover up irregularities in several invoices involving large consulting groups.

According to SPIEGEL reporting, the official, Detlef S., repeatedly signed off invoices over the years for services that had already been performed before the contracts between the ministry’s IT department and three large consulting firms were concluded.

Chaotic conditions in the commissioning of consultants

The investigation concerns precisely the department in the ministry that triggered the consultant affair. In the summer of 2018, after years of research, the Federal Court of Auditors found that the CIT department had awarded consulting contracts on a large scale without tenders, drawing on federal framework agreements that were not designed for such IT projects at all. Ursula von der Leyen has since largely conceded the accusations.

The case of the Regierungsdirektor now sheds light on the chaotic conditions in the commissioning of consultants. The files the official tried to destroy show that he and the consultants had agreed to start several projects before the actual contract award. The corresponding invoices were then dated after the fact. Apparently, with the deletion attempt, S. wanted to cover up the fact that he had signed off the doctored invoices as “factually correct.”

The arrangement shows how closely, and outside all rules, consultants and von der Leyen’s officials in the CIT department cooperated, breaking every administrative regulation in the process. In the best case they wanted to push urgent IT projects forward. But the question also hangs in the air whether certain officials used the rule-breaking to supply friendly consultants with contracts. So far there are said to be no indications of that in the case of S., but the investigation continues.

Still much to clear up

The deletion attempt is becoming a personal problem for the official. Technicians quickly noticed his attempt to delete the false invoices from the databases. When all officials with access rights to the server were questioned, he admitted to the action. Specialists were able to restore the folders in question, and only then did it emerge that the invoices were wrong. Without the deletion attempt, this apparently would never have come to light.

For the committee of inquiry, which will hear several witnesses on Thursday, the case is likely to confirm that there is still much to clear up in the affair. In other projects, too, there are indications that external consultants more or less wrote their own contracts for the Bundeswehr and that certain officials routinely waved them through. The committee intends to examine the events around the CIT department after the summer break. S. could be summoned as a new witness.

For the Greens, the case of S. is further evidence of the chaos in the defense department. “Apparently the ministry was like a self-service shop for some consultants,” said Tobias Lindner after being briefed on the matter. For him, it is not only the value of the multimillion-euro consulting contracts that is in question. “By now we have to ask ourselves whether things are being done to the detriment of the taxpayer,” Lindner said.

from: https://www.spiegel.de/politik/deutschland/ursula-von-der-leyens-berateraffaere-beamter-versucht-beweise-zu-vernichten-a-1266632.html

***

Cybersecurity: France and Germany warn of a “step backwards”

Apr 25, 2018

New EU legislative proposals could become a problem for member states such as France and Germany, which already have more far-reaching cybersecurity regulations, the head of the French cybersecurity agency warned in an interview with EURACTIV.

France and Germany could be forced to take “a step backwards” if the proposal for an EU cybersecurity act is adopted in its current form, believes Guillaume Poupard, director of the French security agency ANSSI.

France and Germany are the sharpest critics of the Commission proposal presented last year.

The two countries are particularly wary of a measure proposing the creation of a system for certifying the cybersecurity level of technology products. The Commission’s proposal would give the Athens-based EU agency ENISA new powers to oversee the certification levels in that system.

Poupard and his counterpart Arne Schönbohm of Germany’s Federal Office for Information Security (BSI) argue that ENISA should rather hold back, since the agency has neither sufficient experience nor staff to take on this new role. Instead, the individual member states should lead the discussions.

ANSSI and BSI are two of the largest European cybersecurity agencies, responding in particular to hacker attacks against companies and government authorities.

ENISA: a dwarf

The European Commission wants to give ENISA more money and staff, but even with that increase it would still be overshadowed by the larger French and German agencies.

“We know how to do it. We have 20 years of experience,” Poupard stressed, referring to France’s own certification system. He added: “We want a system at the European level, but that system should not replace the states as the main actors.”

If ENISA’s handling of the new system fails, Poupard said, it would mean that lengthy legislative talks had been utterly useless, and potentially insufficient certification criteria could weaken the cybersecurity industry in all EU member states, not just in France and Germany.

“Europe is losing very important time here: everything moves very fast, the hackers are very efficient. If we waste five years now, that is half an eternity,” he criticized.

The Commission’s proposal, presented last September, was already being planned before the massive WannaCry and NotPetya attacks paralyzed companies across Europe in May and June 2017. The EU executive nevertheless later cited those incidents as a reason why new laws had to be pushed through.

Under the draft law, ENISA would have to consult companies and member states before setting criteria for the certification system. Once ENISA had approved various security levels, national diplomats would then have to vote on them in a fast-tracked legislative procedure, a so-called implementing act.

“We are convinced it will go wrong”

But in the view of the heads of the German and French cybersecurity agencies, even this arrangement would give ENISA too much influence.

Poupard was blunt: “We are convinced it will go wrong.” The Frenchman continued: “We are not happy with what has been proposed. If only because not all questions are answered and the system would be far less efficient.”

The Commission, for its part, argues that French and German pushback against ENISA’s oversight of the new system would only slow the transition to an EU-wide cybersecurity certification.

Paris and Berlin demand that member states have an additional check to approve certificates before ENISA enforces them.

Despina Spanou, one of the senior Commission officials responsible for the legislation, told the European Parliament’s industry committee (ITRE) on Monday that this kind of check, too, could create timing problems.

She warned that greater member-state involvement could delay approval of the certification criteria by “another year, which would further delay the envisaged schemes.”

Certification

In the European Parliament, MEPs’ debates on the draft law focused on whether companies should be required to certify their products before they may go on sale. The Commission’s proposal provides only for voluntary certification.

Poupard, by contrast, demands that the legislation mandate certification for products that can pose serious security risks, such as digital health technologies and internet-connected cars.

But he also said he otherwise agrees with the main idea of the Commission proposal: cybersecurity certification should be EU-wide in order to spare companies additional costs. The Commission’s proposal likewise cited the expensive certification application procedures in some countries as one reason for the new EU-wide rules.

“We need something truly Europe-wide,” Poupard agrees.

He stressed that the system must be more than just an agreement between member states to recognize each other’s national certifications. Instead, the national agencies would have to use genuinely identical criteria for setting the security levels. It would be “a real nightmare,” Poupard warned, if some countries approved weak protections for products that could then be sold in other member states.

ANSSI has recently begun working on new technologies, in line with the political agenda of President Emmanuel Macron, who is giving fields such as artificial intelligence ever greater prominence. Last month Macron announced that France would provide around 1.5 billion euros in public research funding for new technologies by 2022.

France plans to develop a secure messaging app

Poupard is a member of JEDI, the Joint European Disruption Initiative, another Macron initiative meant to support French and German experts in developing cutting-edge technology.

Poupard described a new plan by the French government to develop a secure messaging app as an “important upgrade,” since American and Asian services probably store data outside Europe.

Last week the French digital ministry said it was developing a secure messaging app for government employees, to be used instead of WhatsApp, which is owned by Facebook.

There is now more public interest in secure technologies, Poupard said, after it emerged last month that the data of more than 87 million Facebook users had been processed by the political consultancy Cambridge Analytica without the users’ knowledge. The scandal has “shown people more and more the need to protect their data.”

“The place where it makes sense to store data, where it makes sense economically, ethically and legally, is clearly Europe,” he stressed.

That also matches the goals of JEDI and the French government, as well as the European Commission’s plans to invest in home-grown technologies that can compete with the American tech giants.

“Above all, we in Europe must further develop our digital autonomy,” Poupard urged.

from: https://www.euractiv.de/section/digitale-agenda/news/cybersicherheit-frankreich-und-deutschland-warnen-vor-schritt-zurueck/

 

 

 

Bitcoin consumes more electricity than Switzerland

[just quoting the statistics … the power use discussion is more complex]

According to the Cambridge Bitcoin Electricity Consumption Index, Bitcoin currently consumes around 60 terawatt-hours of electricity per year. That is more than, for example, Switzerland or Ireland.

Of the 219 countries and territories listed in the CIA Factbook, only 42 consume more than the digital currency.

In this ranking, Germany sits in sixth place with 537 terawatt-hours, behind China, the USA, India, Japan and Russia. All of these figures, mind you, are estimates.

In Bitcoin’s case there is great uncertainty about the actual energy demand: the analysts currently put the lower bound at 23 and the upper bound at 183 terawatt-hours.

 

from: https://de.statista.com/infografik/18608/stromverbrauch-ausgewaehlter-laender-im-vergleich-mit-dem-des-bitcoins/

 

 

‘Jetson’ is a Pentagon laser that can identify people by heartbeat

Lookin’ for a heartbeat…

Forward-looking: Biometrics is advancing and evolving at a rapid rate. It seems like just yesterday we were unlocking our phones with our fingerprint — now it’s our face. What’s it going to be tomorrow — our heartbeat? Well, maybe.

MIT Technology Review reports that the Pentagon now has a prototype infrared laser that can identify people by their heartbeat. It is called “Jetson” and uses laser vibrometry to detect movements on the surface of the skin caused by a person’s pulse. It even works from as far as 200 meters away.

If you grew up reading Daredevil comic books, you already know that everyone’s heartbeat is unique, which is how the super-powered blind Matt Murdock was able to identify people. Jetson works similarly.

By detecting a person’s heartbeat, then comparing it to a database, the laser can ID someone with 95-percent accuracy in optimal testing conditions. Of course, the most significant advantage is that it cannot be fooled the way facial recognition and fingerprint sensors can, since a heart rhythm can’t be duplicated or changed.

“Compared with face, cardiac biometrics are more stable and can reach more than 98% accuracy,” said Wenyao Xu of the State University of New York at Buffalo, who has also developed a cardiac sensor that uses radar from up to 20 feet away.

“Existing long-range facial recognition [systems] suffer from acquiring enough pixels at a distance to use the face matching algorithms.”

There are some caveats though. While the laser can detect a heartbeat from a distance on bare skin or through thin material like a tee shirt, thicker clothing like a jacket makes it ineffective. The system also needs about 30 seconds to create a good enough profile for analysis. The subject must be still during that time as well.

According to some 2017 documents from the Combating Terrorism Technical Support Office (CTTSO), Jetson has been in development for several years, and the office is looking to cut the identification time to under five seconds.

“Existing long range biometric methods that rely on facial recognition suffer from acquiring enough pixels at a distance to use the face matching algorithms and require high performance optics to acquire visual signatures at significant distances,” said the CTTSO. “The Jetson effort being developed by Ideal Innovations, Inc. is a ruggedized biometric system that will capture cardiac signatures to aid in the positive identification of an individual at a distance up to 200 meters and within five seconds.”

The Pentagon is looking at it for military and surveillance applications, but there are several practical and commercial possibilities as well.

As previously mentioned, such technology could be used as a biometric solution for mobile devices. In fact, Apple has been looking into similar technology since at least 2010. It could also be used in medical and clinical situations. Wireless heart monitors are a possibility as are stethoscope-free checkups with your doctor. Badge-less entry systems for secured buildings would be another use case.

 

from: https://www.techspot.com/news/80704-jetson-pentagon-laser-can-identify-people-heartbeat.html

 

The Chinese-American share of global trade is only 3.1 percent

Five reasons argue against Trump and his trade war shaking the world:

First: China and America may be the largest economic powers of our time, but both produce essentially for their own huge domestic markets. There are no tariffs there, and therefore no tariff war either.

Second: the Chinese-American share of global trade in goods and services is only 3.1 percent (see chart below). That is enough to cause irritation, but not a global economic crisis.

Third: trade between the USA and the European Union, which already accounts for 5.1 percent of world trade, can be expanded at any time. It acts as a shock absorber.

Fourth: the globally distributed value chains of the big corporations react quickly to shifts in supply and demand. If production drops out in one Asian country, it is ramped up in another. Taiwan, for instance, saw its exports to the USA rise by 30 percent in the first quarter of 2019 compared with the previous year; South Korea’s US exports are up 17 percent, and Vietnam gained more than 20 percent (see chart below).

Fifth: the term American-Chinese trade war suggests that both nations are affected equally. That is not the case. According to the OECD, trade with the USA corresponds to 3.9 percent of China’s gross national product. The share of American national product that depends on trade with China is only 1.3 percent.

Conclusion: we are not witnessing the end of globalization, merely the American attempt to reinterpret its rules. Trump wants to shift the “terms of trade” in his favor. The Chinese, who have profited greatly from Western naivety, are not losing face, only the windfall profits of recent years.

 

 

from: Steingarts Morning Briefing <news@morning-briefing.gaborsteingart.com> on 28 JUN 2019

 

***

Something else is much more alarming (besides climate change):

 

After the global financial crisis, the heads of state and government of the most important countries swore that the over-indebtedness of banks and states would not be repeated. Reality looks different. Since 2009, government debt in the G20 states alone has risen by 72 percent, while economic output has grown by only 31 percent. High debt and weak growth could be the ingredients of a new global financial crisis. Our current cover story “Comeback of debt” shows the risks of this dangerous debt orgy. We could just as well have written: “Nothing learned.”

 

from: Handelsblatt Morning Briefing <morning_briefing@redaktion.handelsblatt.com> on 28 JUN 2019

 

 

Difficulty & Hashrate Records: It’s Now Harder to Mine Bitcoin Than Ever. 70 EH/s is Coming.

Bitcoin mining has become more competitive than ever.

Bitcoin mining difficulty – the measure of how hard it is to earn mining rewards in the world’s largest cryptocurrency by market cap – has reached a new record high above 7.93 trillion. That’s a seven percent jump from the 7.45 trillion record set during the recent two-week adjustment cycle, which was the highest since October 2018.

Bitcoin is designed to adjust its mining difficulty every 2,016 blocks (approximately 14 days), based on the amount of computing power deployed to the network. This is done so that the block production interval in the next period stays at roughly 10 minutes per block. When there are fewer machines racing to solve math problems to earn the next payout of newly created bitcoin, difficulty falls; when there are more computers in the game, it rises.
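
As a rough sketch of that retargeting rule (the factor-of-4 clamp is part of Bitcoin's consensus rules; the function and variable names here are ours, not from any particular client):

```python
TARGET_SPACING = 10 * 60   # target seconds per block
RETARGET_BLOCKS = 2016     # blocks per adjustment cycle (~14 days)

def next_difficulty(old_difficulty: float, actual_timespan_seconds: float) -> float:
    """Scale difficulty so the last 2,016 blocks would have taken ~two weeks."""
    target_timespan = RETARGET_BLOCKS * TARGET_SPACING
    # Bitcoin clamps each adjustment to a factor of 4 in either direction.
    clamped = min(max(actual_timespan_seconds, target_timespan / 4), target_timespan * 4)
    return old_difficulty * target_timespan / clamped

# Example: blocks arrived ~7% too fast, so difficulty rises by ~7% (7.45T -> ~7.97T).
print(next_difficulty(7.45e12, RETARGET_BLOCKS * TARGET_SPACING / 1.07))
```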

 

Data from BTC.com

 

Right now the machines are humming furiously. Bitcoin miners across the world have been performing calculations at an average 56.77 quintillion hashes per second (EH/s) over the last 14 days to compete for mining rewards on the world’s first blockchain, according to data from mining pool BTC.com.

BTC.com data further indicates that the average bitcoin mining hash rates over the last 24-hour and three-day periods were 59.58 EH/s and 59.70 EH/s, respectively, even higher than the average 56.77 EH/s from May 15 to June 27, or any 14-day figure in the network’s history.

Similarly, data from blockchain.info also shows the aggregate of bitcoin computing power was around 66 EH/s as of June 22, surpassing last year’s record high of 61.86 EH/s tracked by the site, and has more than doubled since December 2018 when the hash rate dropped to as low as 31 EH/s amid bitcoin’s price fall.

Assuming all such additional computing power has come from more widely used equipment such as the AntMiner S9, which performs calculations at an average rate of 14 tera hashes per second (TH/s), that suggests more than 2 million units of mining equipment may have been switched on over the past several months. (1 EH/s equals 1 million TH/s.)
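
A quick back-of-the-envelope check of that estimate in Python, using the figures quoted in this article (the roughly 31 EH/s December 2018 low and the recent ~59.7 EH/s three-day average):

```python
S9_TH_PER_S = 14            # AntMiner S9 throughput in TH/s
TH_PER_EH = 1_000_000       # 1 EH/s = 1,000,000 TH/s

hashrate_dec_2018 = 31.0 * TH_PER_EH   # ~31 EH/s low in December 2018, in TH/s
hashrate_recent = 59.7 * TH_PER_EH     # ~59.7 EH/s recent three-day average, in TH/s

added_units = (hashrate_recent - hashrate_dec_2018) / S9_TH_PER_S
print(f"roughly {added_units / 1e6:.2f} million S9-class units added")  # ~2.05 million
```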

 

 

The increase in capacity is also in line with bitcoin’s price jump over the first half of 2019, which caused the price of second-hand mining equipment to double in China, and also juiced demand for new machines.

BTC.com further estimates the bitcoin mining difficulty will jump by another seven percent at the beginning of the next adjustment cycle, which would be the first time for bitcoin mining difficulty to cross the eight trillion threshold.

Delayed plugging in

Such computing interest comes at a time when mining farms in China, especially in the country’s mountainous southwest, have been gradually plugging in equipment as the rainy summer approaches.

According to a report published by blockchain research firm CoinShares, as of earlier this month 50 percent of the global bitcoin computing power was located in China’s Sichuan province.

However, it’s important to note that this year, the arrival of the rainy season in China’s southwest has been delayed by nearly a month compared to previous years. As a result, some local mining farms were only running less than half of their total capacity in the past month.

Xun Zheng, CEO of Hashage, a Chengdu-based mining farm operator that owns several facilities across China’s southwestern provinces, said there had been no rain in the area for over 20 days since early May, which was “unusual.”

“In the past years, it usually starts raining continuously throughout May so [hydropower plants] normally will have enough water resources by early June,” he said.

As a result, in early June his firm was only operating at 40 percent of capacity; it can host more than 200,000 ASIC miners. But as the rain has arrived gradually over the past two weeks, the proportion has climbed to over 60 percent.

Mining farms in China previously estimated that the total hash rate this year during the peak of the rainy season around August could break the threshold of 70 EH/s. That means another 300,000 units of mining machines could be further activated, assuming all are AntMiner S9s or similar models.

Those waiting to be switched on will also include new capital in the sector, such as Shanghai-based Fundamental Labs, a blockchain fund that has invested $44 million in top-of-the-line mining equipment, to be activated in June.

 

from: https://www.coindesk.com/bitcoin-hash-rate-new-record

 

 

Buried in Facebook’s Libra White Paper, a Digital Identity Bombshell: “I am over 18” credential … and more

The Takeaway

  • Facebook’s Libra white paper includes a brief but potentially seismic nod to digital identity standards.
  • With 2 billion users worldwide, Facebook may be able to succeed where others have failed in jump-starting a globally accepted digital ID.
  • Some identity experts say this is even more important than the cryptocurrency, but others question how much control Libra would give users and find its approach overbearing.
  • see below: The Libra Technical Whitepaper
  • see below: Libra White Paper Shows How Facebook Borrowed From Bitcoin and Ethereum
  • see below: The Libra Move Programming Language
  • see below: A Deep-Dive into Libra Move

Buried in Facebook’s Libra white paper are two short sentences hinting that the project’s ambitions go even further than bringing billions of people into the global financial system.

More than launching a price-stable cryptocurrency for the masses, Libra could be aiming to change the way people trust each other on the internet.

At the top of page nine, in a section describing the consortium that will govern the Libra coin, the white paper states:

“An additional goal of the association is to develop and promote an open identity standard. We believe that decentralized and portable digital identity is a prerequisite to financial inclusion and competition.”

That’s all the paper has to say on the topic of identity, perhaps explaining why the brief mention of such a foundational issue for 21st-century commerce escaped widespread notice despite all the hype over the document itself.

But to some observers, the line dropped like a bomb.

Dave Birch, director of Consult Hyperion and the author of books on digital identity and bitcoin, flagged these lines as “the most interesting” in the paper.

Smoothing pathways on the internet using identity is a bigger deal to many people than a putative cryptocurrency, Birch argued, adding:

“There are no throwaway remarks in a Facebook white paper that has taken a year to put together. It’s in there for a reason. [Facebook] are actually going to try and fix the identity problem.”

A Facebook spokeswoman said this week that the company had nothing to add about identity beyond what’s in the white paper.

Who are you?

It’s a problem almost as old as the internet itself. As the classic “New Yorker” cartoon put it, “on the internet, nobody knows you’re a dog.”

In such an environment, businesses need to guard against fraud, but the copious amounts of personal data consumers must share to prove they are who they say they are leaves them vulnerable to identity theft and spying.

Fixing this problem means finding a way to have the sort of credentials an individual holds in their physical wallet realized in a verifiable digital version which can be trusted across the internet. And for many technologists who have thought long and hard about identity, the solution must be “self-sovereign,” or controlled by the individual.

Birch, who has long seen the potential of social networks as natural springboards for managing digital identity, described a scenario where a user’s “I am over 18” credential (rather than their exact birthdate) is needed to log into a dating site.

This could be accessed through Libra’s cryptocurrency wallet Calibra via one of its partners, Mastercard, for example, with its two-factor authentication process. Then a cryptographic credential is sent back to Calibra containing no personally identifiable information but stating this person is over 18, which can then be presented to the dating site at log in.
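
To make that flow concrete, here is a deliberately simplified Python sketch of such an attribute credential: an issuer signs only the claim “over 18,” and the relying party (the dating site) verifies it without ever seeing a birthdate. This is purely illustrative and is not Calibra’s or Mastercard’s actual mechanism; a real system would use public-key signatures (so the verifier never holds the issuer’s secret) and a standard such as W3C verifiable credentials.

```python
import hashlib
import hmac
import json

ISSUER_SECRET = b"demo-issuer-key"  # stand-in for the issuer's signing key (illustrative only)

def issue_credential(user_id: str, over_18: bool) -> dict:
    """Issuer attests a single attribute; no name or birthdate is included in the credential."""
    claim = {"subject": hashlib.sha256(user_id.encode()).hexdigest(), "over_18": over_18}
    payload = json.dumps(claim, sort_keys=True).encode()
    return {"claim": claim, "sig": hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()}

def verify_credential(credential: dict) -> bool:
    """Relying party checks the signature and learns only the 'over_18' attribute."""
    payload = json.dumps(credential["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, credential["sig"]) and credential["claim"]["over_18"]

cred = issue_credential("alice@example.com", over_18=True)
print(verify_credential(cred))  # True: the site learns the age attribute, nothing else
```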

While others have proposed similar arrangements (sometimes involving blockchains), none had the reach of Facebook, with its 2.38 billion users worldwide.

If Libra were “to drift in the direction of self-sovereign solutions, Facebook’s endorsement of that approach might make more of an impact on the market than, say, uPort or Evernym might have done,” Birch said, referring to two such blockchain ID startups.

And despite its reputation as the ultimate Peeping Tom, Facebook has hinted at such aspirations before. In February, while Libra was still under wraps, CEO Mark Zuckerberg said he was investigating blockchain’s potential to allow internet users to log in to various services via one set of credentials without relying on third parties.

Standard setting

Stepping back, technologists have been trying to address the challenge of identity for more than a decade by establishing open standards. In the same way that URLs, for example, open webpages anywhere on the internet, standards are also needed to ensure digital attributes about an individual can be universally issued and verified.

The OAuth standard, for example, is what lets you log into websites through a third-party service like Facebook without sharing a password. More recently, such work under the auspices of the World Wide Web Consortium (W3C) has included things like Decentralized Identifiers (DIDs) and the verifiable credentials standard, both meant to enable self-sovereign digital identity.

Some veterans of this field were taken aback by the suggestion that the Libra Association (a group of 30 or so companies, expected to reach 100 or more) would develop an open identity standard.

“That’s very world domination-ish of them,” said Kaliya Young, a co-author of “A Comprehensive Guide to Self Sovereign Identity” and co-founder of the Internet Identity Workshop. “Some of us have been working on that problem for a really long time. You already have a set of open standards for verifiable credentials that are basically done and working.”

Young pointed out that “unilaterally declaring” an open standard belies the process of going through standards development with an open community, adding that all the people working on identity standards are connected to one another in reaching a common goal.

“That work is being led by a community of people deeply committed to there being no one company owning it in the end, because identity is too big to be owned, just like the web is too big to be owned,” she said.

(Indeed, Facebook was previously said to have rebuffed an invitation to participate in the DID project alongside Microsoft.)

Phil Windley, chair at the Sovrin Foundation, which contributed the codebase to the Hyperledger Indy blockchain ID project, acknowledged the risk of parsing two sentences in Libra’s paper too finely. But he made the point that “decentralized” and “portable” (Facebook’s words) are not exactly the same as self-sovereign.

“Decentralized” could simply mean a user’s identity data – their attributes and identifiers – are spread among nodes that are run on the Libra blockchain, said Windley. This doesn’t imply the user necessarily has control of them. Likewise, “portable” just means credentials can be moved from one place to another but doesn’t necessarily mean you get a say in how they are used.

Windley told CoinDesk:

“People often use ‘decentralized’ as an unalloyed gilt and just assume that it means everything is going to be great. That could be what they are doing here – just using ‘decentralized’ as a synonym for ‘awesome.’”

Joining the dots

That said, Windley was respectful about the scale of Libra’s vision, which he suspects is much bigger than dealing with know-your-customer (KYC) checks and the regulation around building a global permissioned currency platform.

He pointed to the paper’s backers, which include firms like Mastercard and Kiva that have thought very hard about digital identity. (Neither company would comment on Libra’s approach to digital identity.)

“I suspect given Libra’s goal of financial inclusion, they are probably thinking about it bigger than just authentication and authorization for a few narrow purposes,” said Windley. “I think there is enough there (e.g. the smart contract language) to believe a stablecoin is just one thing that they envisage using Libra for.”

In the absence of any detail on what might comprise a decentralized identity standard from Libra’s perspective, some dots can be joined by examining the recent work of George Danezis and his co-founders at Chainspace, a startup acquired by Facebook in May.

A paper introducing a “selective disclosure credential scheme” called Coconut explains how a system of smart contracts (computer programs that run on top of blockchains) could “issue user credentials depending on the state of the blockchain, or attest some claim about a user operating through the contract – such as their identity, attributes, or even the balance of their wallet.”

The Coconut protocol goes on to describe how credentials can be jointly issued in a decentralized manner by a group of “mutually distrusting authorities.” These credentials cannot be forged by users or a group of corrupt authorities, and are also “re-randomized” prior to being presented for verification to further protect user privacy. Unlike some computationally-hungry proving schemes, this is done in a matter of a few milliseconds making it highly scalable.

Returning to the question of standards, Birch said W3C, DIDs and verified credentials might be the right option for Libra, but whether it’s that or something else, basically whatever they choose would end up being a standard, he said, concluding:

“And you could argue, is that necessarily a bad thing? I mean what happens if they come up with a good standard for identity and attributes and so on and then other people can use it, e.g. banks would be one obvious example.”

from: https://www.coindesk.com/buried-in-facebooks-cryptocurrency-white-paper-a-digital-identity-bombshell

 

***

Libra White Paper Shows How Facebook Borrowed From Bitcoin and Ethereum

With the long-awaited Libra white paper, Facebook is showing off its blockchain smarts, and making a bid for crypto credibility.

Released Tuesday morning, the 29-page paper describes a protocol designed to evolve as it powers a new global currency. More than a year in the making, the document opens by trumpeting the new blockchain’s ambitious goal:

“The Libra Blockchain is a decentralized, programmable database designed to support a low-volatility cryptocurrency that will have the ability to serve as an efficient medium of exchange for billions of people around the world.”

As a first step toward achieving the “decentralized” part, the protocol has been turned over to a new organization, the Libra Association, whose members will hold separate tokens allowing them on-chain voting rights to govern decisions about Libra.

“Over time, it’s designed to transition the node membership from these founding members who have a stake in the creation of the ecosystem to people who hold Libra and have a stake in the ecosystem as a whole,” Ben Maurer, Facebook’s blockchain technical lead, told CoinDesk in an exclusive interview.

In short, Libra is designed to be a high throughput, global blockchain, one that’s built with programmable money in mind but limits how much users can do initially as it evolves from prototype to a robust ecosystem.

Unlike many other blockchains, Libra seems laser-focused on payments and other financial use cases for consumers.

But the white paper itself seems geared to demonstrate both Facebook’s proposed advances to the science of distributed consensus and its appreciation for what has been built so far.

Indeed, over the last several months, many sources told CoinDesk they had visited Facebook to share their perspective on decentralized technology. The company has done a lot of homework.

And now it has created a new language for writing commands on its blockchain, called Move, and opened its software to public inspection.

“To validate the design of the Libra protocol, we have built an open-source prototype implementation — Libra Core — in anticipation of a global collaborative effort to advance this new ecosystem,” the white paper states.

“It’d be sort of presumptuous for us to say we’re creating an open environment and then say, ‘Well, but we’ve set everything in stone,’” Maurer told CoinDesk. “It’s a paper that requests feedback.”

Mix and match

Libra’s designers have picked what they see as the best features of existing blockchains while providing their own updates and refinements.

1. Like bitcoin, there’s no real identity on the blockchain.

From the perspective of the blockchain itself, you don’t exist. Only public-private key pairs exist. The white paper states: “The Libra protocol does not link accounts to a real-world identity. A user is free to create multiple accounts by generating multiple key-pairs. Accounts controlled by the same user have no inherent link to each other.”

2. Like Hyperledger, it’s permissioned (at least to start).

Initially, the consensus structure for Libra will be dozens of organizations that will run nodes on the network, validating transactions. Each time consensus is voted on for a new set of transactions, a leader will be designated at random to count up the votes.

Libra opts to rely on familiarity rather than democracy to choose the right entities to establish consensus in the early days. “Founding Members are organizations with established reputations, making it unlikely that they would act maliciously,” the white paper states. These entities range from traditional payment networks (Mastercard, Visa) to internet and gig-economy giants (eBay, Lyft) to blockchain natives (Xapo) to VCs (Andreessen Horowitz, Thrive Capital).

3. Like tezos, it comes with on-chain governance.

The only entities that can vote at the outset are Founding Members. These members hold Libra Investment Tokens that give them voting rights on the network, where they can make decisions about managing the reserve and letting new validators join the network.

The governance structure is built into the Move software from the start, and like Tezos it is subject to revision over time. Updates will be essential as it adds members and evolves from what’s more like a delegated proof-of-stake (DPoS) system (such as EOS or steem) to a fully decentralized proof-of-stake ecosystem.

4. Like ethereum, it makes currency programmable.

The white paper defines a number of interesting ways in which users can interact with the core software and data structure. For example, anyone can make a non-voting replica of the blockchain or run various read commands associated with objects (such as smart contracts or a set of wallets) defined on Libra. Crucially, Libra’s designers seem to agree with ethereum’s that running code should have a cost, so all operations require payment of Libra as gas in order to run.

Unlike ethereum, Libra makes two important changes in its smart contracts. First, it limits how much users can do on the protocol at first (the full breadth of Move’s features are not yet open). Second, it breaks data out from software, so one smart contract (what Move refers to as a “module”) can be directed at any pool of assets, which Move calls “resources.” So one set of code can be used on any number of wallets or collections of assets.

5. Also like ethereum, it thinks proof-of-stake is the future, but it is also not ready yet.

“Over time, membership eligibility will shift to become completely open and based only on the member’s holdings of Libra,” the white paper promises, describing a path to real permissionless-ness.

Meanwhile, the paper dismisses the approach of the blockchains with the longest track record (namely bitcoin), stating, “We did not consider proof-of-work based protocols due to their poor performance and high energy (and environmental) costs.”

6. Like Binance’s coin, it does a lot of burning.

Blockchains that build in purposeful burning of tokens became very influential last year. Binance, the world’s leading exchange, created the BNB token, with which users could pay trading fees at a discount. Binance led the way to token bonfires, regularly burning a significant portion of its profits paid in BNB.

Libra won’t use burning to enhance the value of its coin. Rather (as with collateralized stablecoins such as tether), tokens will be issued and burned constantly as the association responds to shifts in demand for its reserve, with no maximum or minimum supply.
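
As a rough illustration of that mechanic, here is a minimal Rust sketch (my own, not from the white paper); the Reserve type and the 1:1 accounting are simplifying assumptions:

```rust
// Coins are minted when money enters the reserve and burned when it leaves,
// so supply floats with demand instead of being capped or burned for value.
struct Reserve {
    assets: u64, // value held in the reserve (simplified to a single unit)
    supply: u64, // coins currently in circulation
}

impl Reserve {
    fn buy(&mut self, fiat_in: u64) {
        self.assets += fiat_in;
        self.supply += fiat_in; // new coins issued (1:1 only for simplicity)
    }
    fn sell(&mut self, coins_in: u64) {
        self.assets -= coins_in;
        self.supply -= coins_in; // redeemed coins are burned
    }
}

fn main() {
    let mut reserve = Reserve { assets: 0, supply: 0 };
    reserve.buy(1_000);
    reserve.sell(250);
    println!("assets: {}, supply: {}", reserve.assets, reserve.supply);
}
```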

7. Like coda, users don’t need to hold onto the whole transaction history.

A lesser-known protocol, Coda, was one of the first to make its ledger disposable. Users only need to hold a proof of the last block, which they can easily check on a smartphone to be sure they are interacting with a valid ledger.

Similarly, on Libra, “historical data may grow beyond the amount that can be handled by an individual server. Validators are free to discard historical data not needed to process new transactions.”

8. Like EOS, it hasn’t worked everything out yet.

EOS launched without its approach to governance well defined, which yielded complications down the road. Similarly, Libra promises to decentralize, but there’s nothing that inherently forces its members to do so.

Work in progress

Other matters are left undecided as well. For example, the storage of data.

“We anticipate that as the system is used, eventually storage growth associated with accounts may become a problem,” the white paper says. The document anticipates but does not define a system of rent for data storage.

It cites a number of examples of other open questions, such as how best to maintain security as more validators join the network, how often the pool of validators can change and how modules can be updated safely.

As the paper admits:

“This paper is the first step toward building a technical infrastructure to support the Libra ecosystem. We are publishing this early report to seek feedback from the community on the initial design, the plans for evolving the system, and the currently unresolved research challenges discussed in the proposal.”

Dream team

The Libra white paper is signed by 53 people. Though senior Facebook executives such as CEO Mark Zuckerberg and blockchain lead David Marcus are notably absent from the author list, the team that wrote the document looks to be one of the most heavy-hitting in blockchain history.

The signatories hail from nearly every continent and include Ph.D. students from Stanford, computer science professors, and artificial intelligence (AI) developers.

They include:

  • Christian Catalini: The MIT professor was one of the first to study the economics of cryptocurrency alongside crowdfunding and tokenization. Catalini has written extensively for the Harvard Business Review and other publications.
  • Ben Maurer: Facebook’s infrastructure engineer graduated from Carnegie Mellon University with a degree in computer science. He and CMU assistant professor Luis von Ahn built the reCAPTCHA service that Google bought in 2009. He is leading the team that built the Move programming language.
  • George Danezis: A privacy engineer at University College London, Danezis was one of the creators of Chainspace and the Coconut protocol upon which Libra is based. He is currently a researcher at Facebook after the company bought his startup in February 2019.
  • François Garillot: A machine-learning and AI expert who worked at Swisscom and Skymind.ai, Garillot focuses on distributed AI.
  • Ramnik Arora: Arora spent time as an analyst at Goldman Sachs Investment Strategy Group as well as at IV Capital as a quant. His background is in finance and he has a master’s in computer science from Stanford and an undergraduate degree in the mathematics of finance.

 

from: https://www.coindesk.com/libra-white-paper-shows-how-facebook-borrowed-from-bitcoin-and-ethereum

 

***

The Facebook Libra Technical Whitepaper

https://www.bgp4.com/wp-content/uploads/2019/06/the-libra-blockchain-Technical-White-Paper.pdf

see also: https://developers.libra.org/docs/the-libra-blockchain-paper
https://developers.libra.org/docs/assets/papers/the-libra-blockchain.pdf

Introducing Libra

The world truly needs a reliable digital currency and infrastructure that together can deliver on the promise of “the internet of money.”

Securing your financial assets on your mobile device should be simple and intuitive. Moving money around globally should be as easy and cost-effective as — and even more safe and secure than — sending a text message or sharing a photo, no matter where you live, what you do, or how much you earn. New product innovation and additional entrants to the ecosystem will enable the lowering of barriers to access and cost of capital for everyone and facilitate frictionless payments for more people.

Now is the time to create a new kind of digital currency built on the foundation of blockchain technology. The mission for Libra is a simple global currency and financial infrastructure that empowers billions of people. Libra is made up of three parts that will work together to create a more inclusive financial system:

  1. It is built on a secure, scalable, and reliable blockchain;
  2. It is backed by a reserve of assets designed to give it intrinsic value;
  3. It is governed by the independent Libra Association tasked with evolving the ecosystem.

The Libra currency is built on the “Libra Blockchain.” Because it is intended to address a global audience, the software that implements the Libra Blockchain is open source — designed so that anyone can build on it, and billions of people can depend on it for their financial needs. Imagine an open, interoperable ecosystem of financial services that developers and organizations will build to help people and businesses hold and transfer Libra for everyday use. With the proliferation of smartphones and wireless data, increasingly more people will be online and able to access Libra through these new services. To enable the Libra ecosystem to achieve this vision over time, the blockchain has been built from the ground up to prioritize scalability, security, efficiency in storage and throughput, and future adaptability. Keep reading for an overview of the Libra Blockchain, or read the technical paper.

The unit of currency is called “Libra.” Libra will need to be accepted in many places and easy to access for those who want to use it. In other words, people need to have confidence that they can use Libra and that its value will remain relatively stable over time. Unlike the majority of cryptocurrencies, Libra is fully backed by a reserve of real assets. A basket of bank deposits and short-term government securities will be held in the Libra Reserve for every Libra that is created, building trust in its intrinsic value. The Libra Reserve will be administered with the objective of preserving the value of Libra over time. Keep reading for an overview of Libra and the reserve, or read more here.

The Libra Association is an independent, not-for-profit membership organization headquartered in Geneva, Switzerland. The association’s purpose is to coordinate and provide a framework for governance for the network and reserve and lead social impact grant-making in support of financial inclusion. This white paper is a reflection of its mission, vision, and purview. The association’s membership is formed from the network of validator nodes that operate the Libra Blockchain.

Members of the Libra Association will consist of geographically distributed and diverse businesses, nonprofit and multilateral organizations, and academic institutions. The initial group of organizations that will work together on finalizing the association’s charter and become “Founding Members” upon its completion are, by industry:

  • Payments: Mastercard, Mercado Pago, PayPal, PayU (Naspers’ fintech arm), Stripe, Visa
  • Technology and marketplaces: Booking Holdings, eBay, Facebook/Calibra, Farfetch, Lyft, Spotify AB, Uber Technologies, Inc.
  • Telecommunications: Iliad, Vodafone Group
  • Blockchain: Anchorage, Bison Trails, Coinbase, Inc., Xapo Holdings Limited
  • Venture Capital: Andreessen Horowitz, Breakthrough Initiatives, Ribbit Capital, Thrive Capital, Union Square Ventures
  • Nonprofit and multilateral organizations, and academic institutions: Creative Destruction Lab, Kiva, Mercy Corps, Women’s World Banking

We hope to have approximately 100 members of the Libra Association by the target launch in the first half of 2020.

Facebook teams played a key role in the creation of the Libra Association and the Libra Blockchain, working with the other Founding Members. While final decision-making authority rests with the association, Facebook is expected to maintain a leadership role through 2019. Facebook created Calibra, a regulated subsidiary, to ensure separation between social and financial data and to build and operate services on its behalf on top of the Libra network.

Once the Libra network launches, Facebook, and its affiliates, will have the same commitments, privileges, and financial obligations as any other Founding Member. As one member among many, Facebook’s role in governance of the association will be equal to that of its peers.

Blockchains are described as either permissioned or permissionless in relation to the ability to participate as a validator node. In a “permissioned blockchain,” access is granted to run a validator node. In a “permissionless blockchain,” anyone who meets the technical requirements can run a validator node. In that sense, Libra will start as a permissioned blockchain.

To ensure that Libra is truly open and always operates in the best interest of its users, our ambition is for the Libra network to become permissionless. The challenge is that as of today we do not believe that there is a proven solution that can deliver the scale, stability, and security needed to support billions of people and transactions across the globe through a permissionless network. One of the association’s directives will be to work with the community to research and implement this transition, which will begin within five years of the public launch of the Libra Blockchain and ecosystem.

Essential to the spirit of Libra, in both its permissioned and permissionless state, the Libra Blockchain will be open to everyone: any consumer, developer, or business can use the Libra network, build products on top of it, and add value through their services. Open access ensures low barriers to entry and innovation and encourages healthy competition that benefits consumers. This is foundational to the goal of building more inclusive financial options for the world.

from: https://libra.org/en-US/white-paper/#introducing-libra

***

Move, a safe and flexible programming language for the Libra Blockchain

Whitepaper Deep Dive — Move: Facebook Libra Blockchain’s New Programming Language

Key characteristics of Move and How it differentiates with Ethereum from a developer’s perspective

Overview & Motivation

This is a walkthrough of the 26-page technical whitepaper of Move, Facebook Libra’s new programming language. As an Ethereum developer and a blockchain community enthusiast, I hope to provide a quick overview and highlights of the paper for everyone curious about this new language :)

Hope that you will like it, happy learning!

Abstract

Move is an executable bytecode language used to implement custom transactions and smart contracts.

There are two things to take note of:

  1. While Move is a bytecode language which can be directly executed in Move’s VM, Solidity (Ethereum’s smart contract language) is a higher level language that needs to be compiled down to bytecode before executing in EVM (Ethereum’s Virtual Machine).
  2. Move can not only be used to implement smart contracts but also custom transactions (explained later in the article), while Solidity is a language for smart contracts on Ethereum only.

The key feature of Move is the ability to define custom resource types with semantics inspired by linear logic: a resource can never be copied or implicitly discarded, only moved between program storage locations.

This is a feature similar to Rust. Values in Rust can only be assigned to one name at a time. Assigning a value to a different name causes it to no longer be accessible under the previous name.

For example, the following code snippet will output the error: Use of moved value ‘x’. This is because Rust has no garbage collection. When variables go out of scope, the memory they refer to is also deallocated. For simplicity, we can understand this as: there can only be one “owner” of the data at a time. In this example, x is the original owner, and then y becomes the owner.
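
The snippet itself appears only as an image in the source article; a minimal Rust sketch of the same idea, using the variable names from the text, would be:

```rust
fn main() {
    let x = String::from("hello"); // x owns the String
    let y = x;                     // ownership moves from x to y
    println!("{}", x);             // compile error: use of moved value `x`
    println!("{}", y);             // y, the new owner, is fine to use
}
```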

 

Reference: http://squidarth.com/rc/rust/2018/05/31/rust-borrowing-and-ownership.html

 

2.2 Encoding Digital Assets in an Open System

There are two properties of physical assets that are difficult to encode in digital assets:
• Scarcity. The supply of assets in the system should be controlled. Duplicating existing assets should be prohibited, and creating new assets should be a privileged operation.
• Access control. A participant in the system should be able to protect her assets with access control policies.

It points out two major characteristics that digital assets need to achieve, which are considered natural for physical assets. For example, a rare metal is naturally scarce, and only you have access to (ownership of) the bill in your hand before spending it.

To illustrate how we came up with the two properties, let’s start with the following proposals:

Proposal#1: Simplest Rule Without Scarcity and Access Control

The simplest state evaluation rule without scarcity and access control.

 

  • G[K]:=n denotes updating the number stored at key 𝐾 in the global blockchain state with the value 𝑛.
  • transaction ⟨Alice, 100⟩ means set Alice’s account balance to 100.

The above representation has several serious problems:

  • Alice can have unlimited coins by sending the transaction ⟨Alice, 100⟩ herself.
  • The coins that Alice sends to Bob are worthless, since Bob could send himself unlimited coins using the same technique.
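
Sketching Proposal #1 in Rust (my own illustration, not code from the paper) makes the problem obvious: the rule lets any transaction overwrite any balance.

```rust
use std::collections::HashMap;

// Proposal #1: transaction ⟨recipient, n⟩ simply sets G[recipient] := n.
fn apply_tx(state: &mut HashMap<String, u64>, recipient: &str, n: u64) {
    state.insert(recipient.to_string(), n);
}

fn main() {
    let mut state = HashMap::new();
    apply_tx(&mut state, "Alice", 100);       // Alice "mints" 100 coins for herself
    apply_tx(&mut state, "Alice", 1_000_000); // ...or any amount she likes
    println!("{:?}", state);
}
```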

Proposal#2: Taking Scarcity into Account

The second proposal that takes scarcity into account

 

Now we enforce that the number of coins stored under 𝐾𝑎 is at least 𝑛 before the transfer takes place.

However, though this solves the scarcity issue, there is no ownership check on who can send Alice’s coins (anyone can do so under this evaluation rule).

Proposal#3: Considering both Scarcity and Access Control

 

The third proposal, which considers both scarcity and access control

 

We address the problem by applying the digital signature mechanism verify_sig before the scarcity check, which means Alice uses her private key to sign the transaction and prove that she is the owner of her coin.
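
Continuing the same illustrative Rust sketch, Proposals #2 and #3 together look roughly like this: the signature check (verify_sig, stubbed here) runs first, then the scarcity check on the sender’s balance.

```rust
use std::collections::HashMap;

// Placeholder for a real cryptographic signature check over the transaction.
fn verify_sig(sender: &str, signature: &str) -> bool {
    signature == sender
}

fn transfer(
    state: &mut HashMap<String, u64>,
    sender: &str,
    signature: &str,
    recipient: &str,
    n: u64,
) -> Result<(), &'static str> {
    // Proposal #3: access control, only the key holder can move the coins.
    if !verify_sig(sender, signature) {
        return Err("bad signature");
    }
    // Proposal #2: scarcity, the sender must hold at least n coins.
    let balance = state.get(sender).copied().unwrap_or(0);
    if balance < n {
        return Err("insufficient balance");
    }
    state.insert(sender.to_string(), balance - n);
    *state.entry(recipient.to_string()).or_insert(0) += n;
    Ok(())
}

fn main() {
    let mut state = HashMap::from([("Alice".to_string(), 100u64)]);
    transfer(&mut state, "Alice", "Alice", "Bob", 40).unwrap();
    println!("{:?}", state);
}
```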

2.3. Existing Blockchain Languages

Existing blockchain languages face the following problems (all of which have been solved in Move):

1. Indirect representation of assets. An asset is encoded using an integer, but an integer value is not the same thing as an asset. In fact, there is no type or value that represents Bitcoin/Ether/StrawCoin! This makes it awkward and error-prone to write programs that use assets. Patterns such as passing assets into/out of procedures or storing assets in data structures require special language support.

2. Scarcity is not extensible. The language only represents one scarce asset. In addition, the scarcity protections are hardcoded directly in the language semantics. A programmer that wishes to create a custom asset must carefully reimplement scarcity with no support from the language.

These are exactly the problems in Ethereum smart contracts. Custom assets such as ERC-20 tokens use integers to represent their value and their total supply. Whenever new tokens are minted, the smart contract code has to manually check whether the cap (the total supply in this case) has been reached.

Furthermore, serious bugs such as duplication, reuse, or loss of assets are more likely to be introduced because of this indirect representation of assets.
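
For contrast, here is a hedged Rust sketch of that integer-based bookkeeping (illustrative names only, not an actual ERC-20 interface): the “asset” is just a number in a map, and the cap has to be re-checked by hand on every mint.

```rust
use std::collections::HashMap;

struct Token {
    balances: HashMap<String, u64>,
    total_supply: u64,
    cap: u64,
}

impl Token {
    fn mint(&mut self, to: &str, amount: u64) -> Result<(), &'static str> {
        // Manual scarcity check; forgetting it silently breaks the cap.
        if self.total_supply + amount > self.cap {
            return Err("cap exceeded");
        }
        self.total_supply += amount;
        *self.balances.entry(to.to_string()).or_insert(0) += amount;
        Ok(())
    }
}

fn main() {
    let mut token = Token { balances: HashMap::new(), total_supply: 0, cap: 1_000 };
    token.mint("alice", 600).unwrap();
    assert!(token.mint("bob", 600).is_err()); // would exceed the cap
}
```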

3. Access control is not flexible. The only access control policy the model enforces is the signature scheme based on the public key. Like the scarcity protections, the access control policy is deeply embedded in the language semantics. It is not obvious how to extend the language to allow programmers to define custom access control policies.

This is also true in Ethereum, where the language has no native support for using public-private key cryptography for access control. Developers have to write access control manually, for example with an onlyOwner modifier.

Although I’m a big fan of Ethereum, I agree that these asset properties should be natively supported by the language for safety purposes.

In particular, transferring Ether to a smart contract involves dynamic dispatch, which has led to a new class of bugs known as re-entrancy vulnerabilities.

Dynamic dispatch here means that the code execution logic is determined at runtime (dynamic) instead of at compile time (static). Thus in Solidity, when contract A calls contract B’s function, contract B can run code that was unanticipated by contract A’s designer, which can lead to re-entrancy vulnerabilities (for example, contract B re-enters contract A’s withdrawal function and drains funds before contract A has actually deducted the balance from the account).

3. Move Design Goals

3.1. First-Class Resources

At a high level, the relationship between modules/resources/procedures in Move is similar to the relationship between classes/objects/methods in object-oriented programming.
Move modules are similar to smart contracts in other blockchain languages. A module declares resource types and procedures that encode the rules for creating, destroying, and updating its declared resources.

Modules, resources, and procedures are just Move jargon for now. We will have an example to illustrate them later in this article ;)

3.2. Flexibility

Move adds flexibility to Libra via transaction scripts. Each Libra transaction includes a transaction script that is effectively the main procedure of the transaction.

The scripts can perform either expressive one-off behaviors (such as paying a specific set of recipients) or reusable behaviors (by invoking a single procedure that encapsulates the reusable logic)

From the above, we can see that Move’s transaction scripts introduce more flexibility, since they are capable of one-off behaviors as well as reusable behaviors, while Ethereum can only perform reusable behaviors (invoking a single smart contract method). The reason it’s called “reusable” is that smart contract functions can be executed multiple times.

3.3. Safety

The executable format of Move is a typed bytecode that is higher-level than assembly yet lower-level than a source language. The bytecode is checked on-chain for resource, type, and memory safety by a bytecode verifier and then executed directly by a bytecode interpreter. This choice allows Move to provide safety guarantees typically associated with a source language, but without adding the source compiler to the trusted computing base or the cost of compilation to the critical path for transaction execution.

This is indeed a very neat design decision for Move to be a bytecode language. Since it doesn’t need to be compiled from source to bytecode like Solidity, it doesn’t have to worry about possible failures or attacks in compilers.

3.4. Verifiability

Our approach is to perform as much lightweight on-chain verification of key safety properties as possible, but design the Move language to support advanced off-chain static verification tools.

From here we can see that Move keeps on-chain verification lightweight and leaves the heavier static verification to off-chain tools. Nonetheless, as stated at the end of the paper, that verification tooling is left for future work.

3. Modularity. Move modules enforce data abstraction and localize critical operations on resources. The encapsulation enabled by a module combined with the protections enforced by the Move type system ensures that the properties established for a module’s types cannot be violated by code outside the module.

This is also a very well-thought-out data abstraction design! It means that the data in a smart contract can only be modified within the contract’s own scope, not by other contracts from the outside.

 

from: https://libra.org/en-US/open-source-developers/#move_carousel

4. Move Overview

The example transaction script demonstrates that a malicious or careless programmer outside the module cannot violate the key safety invariants of the module’s resources.

This section walks you through an example of what modules, resources, and procedures actually are when writing in the language.

4.1. Peer-to-Peer Payment Transaction Script

The amount of coins will be transferred from the transaction sender to payee

 

There are several new symbols here (in the original article’s screenshots, the small red text is my own notes XD):

  • 0x0: the account address where the module is stored
  • Currency: the name of the module
  • Coin: the resource type
  • The value coin returned by the procedure is a resource value whose type is 0x0.Currency.Coin
  • move(): the value cannot be used again
  • copy(): the value can be used later

Code breakdown:

In the first step, the sender invokes a procedure named withdraw_from_sender from the module stored at 0x0.Currency.

In the second step, the sender transfers the funds to payee by moving the coin resource value into the 0x0.Currency module’s deposit procedure.
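
The script itself is shown only as a screenshot in the source article, so here is a hedged Rust analogue of the two steps; Rust’s ownership rules stand in for Move’s move() semantics, the module and procedure names mirror the text, and everything else is illustrative:

```rust
mod currency {
    // Analogue of the resource type 0x0.Currency.Coin.
    pub struct Coin {
        value: u64,
    }

    // Step 1: the sender withdraws a Coin resource (stubbed; a real
    // implementation would debit the sender's stored Coin).
    pub fn withdraw_from_sender(_sender: &str, amount: u64) -> Coin {
        Coin { value: amount }
    }

    // Step 2: the Coin is moved into deposit and consumed; the caller can no
    // longer use it, mirroring move(coin) in the transaction script.
    pub fn deposit(payee: &str, coin: Coin) {
        println!("deposited {} to {}", coin.value, payee);
    }
}

fn main() {
    let coin = currency::withdraw_from_sender("sender", 10);
    currency::deposit("payee", coin);
    // currency::deposit("someone_else", coin); // error: use of moved value `coin`
}
```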

Here are 3 types of code examples that will be rejected:

1. Duplicating currency by changing move(coin) to copy(coin)

Resource values can only be moved. Attempting to duplicate a resource value (e.g., using copy(coin) in the example above) will cause an error at bytecode verification time.

Because coin is a resource value, it can only be moved.

2. Reusing currency by writing move(coin) twice

Adding the line 0x0.Currency.deposit(copy(some_other_payee), move(coin)) to the example above would let the sender “spend” coin twice — the first time with payee and the second with some_other_payee. This undesirable behavior would not be possible with a physical asset. Fortunately, Move will reject this program.

3. Losing currency by neglecting to move(coin)

Failing to move a resource (e.g., by deleting the line that contains move(coin) in the example above) will trigger a bytecode verification error. This protects Move programmers from accidentally — or intentionally — losing track of the resource.

4.2. Currency Module

4.2.1 Primer: Move execution model

 

Each account can contain zero or more modules (depicted as rectangles) and one or more resource values (depicted as cylinders). For example, the account at address 0x0 contains a module 0x0.Currency and a resource value of type 0x0.Currency.Coin. The account at address 0x1 has two resources and one module; the account at address 0x2 has two modules and a single resource value.
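
A minimal Rust sketch of that layout (type and field names are my own, not Libra’s): the global state maps addresses to accounts, and each account holds its published modules and its resource values.

```rust
use std::collections::HashMap;

struct Module { bytecode: Vec<u8> }   // a published module, e.g. "Currency"
struct Resource { bytes: Vec<u8> }    // a stored resource value, e.g. a Coin

struct Account {
    modules: HashMap<String, Module>,     // module name -> module
    resources: HashMap<String, Resource>, // resource type -> value
}

type GlobalState = HashMap<String, Account>; // address -> account

fn main() {
    let mut state: GlobalState = HashMap::new();
    let mut acct = Account { modules: HashMap::new(), resources: HashMap::new() };
    acct.modules.insert("Currency".into(), Module { bytecode: vec![] });
    acct.resources.insert("0x0.Currency.Coin".into(), Resource { bytes: vec![] });
    state.insert("0x0".into(), acct);
    println!("accounts in the global state: {}", state.len());
}
```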

Some highlights:

  • Executing a transaction script is all-or-nothing
  • A module is a long-lived piece of code published in the global state
  • The global state is structured as a map from account addresses to accounts
  • Accounts can contain at most one resource value of a given type and at most one module with a given name (The account at address 0x0 would not be allowed to contain an additional 0x0.Currency.Coin resource or another module named Currency)
  • The address of the declaring module is part of the type (0x0.Currency.Coin and 0x1.Currency.Coin are distinct types that cannot be used interchangeably)
  • Programmers can still hold multiple instances of a given resource type in an account by defining a custom wrapper resource

(resource TwoCoins { c1: 0x0.Currency.Coin, c2: 0x0.Currency.Coin })

  • The rule is that it is OK as long as you can still reference each resource by name without conflicts; for example, you can reference the two resources as TwoCoins.c1 and TwoCoins.c2.

4.2.2 Declaring the Coin Resource

A module named Currency and a resource type named Coin that is managed by the module

Some highlights:

  • A Coin is a struct type with a single field value of type u64 (a 64-bit unsigned integer)
  • Only the procedures of the Currency module can create or destroy values of type Coin
  • Other modules and transaction scripts can only write or reference the value field via the public procedures exposed by the module
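
A hedged Rust analogue of that encapsulation (mint and value_of are illustrative helpers, not names from the paper): because value is private to the module, outside code cannot fabricate a Coin or touch its value directly.

```rust
mod currency {
    pub struct Coin {
        value: u64, // a 64-bit unsigned integer, private to the module
    }

    // Only procedures inside the module can create or inspect a Coin.
    pub fn mint(value: u64) -> Coin {
        Coin { value }
    }
    pub fn value_of(coin: &Coin) -> u64 {
        coin.value
    }
}

fn main() {
    let c = currency::mint(5);
    println!("{}", currency::value_of(&c));
    // let fake = currency::Coin { value: 100 }; // error: field `value` is private
}
```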

4.2.3 Implementing Deposit

This procedure takes a Coin resource as input and combines it with the Coin resource stored in the payee’s account by:
1. Destroying the input Coin and recording its value.
2. Acquiring a reference to the unique Coin resource stored under the payee’s account.
3. Incrementing the value of payee’s Coin by the value of the Coin passed to the procedure.

Some highlights:

  • Unpack and BorrowGlobal are built-in procedures
  • Unpack<T> is the only way to delete a resource of type T. It takes a resource of type T as input, destroys it, and returns the values bound to the fields of the resource
  • BorrowGlobal<T> takes an address as input and returns a reference to the unique instance of T published under that address
  • &mut Coin is a mutable reference to a Coin resource, not Coin
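
In Rust terms (a sketch only; these builtins have no direct equivalents), the three steps could look roughly like this, with struct destructuring standing in for Unpack and a mutable map lookup standing in for BorrowGlobal:

```rust
use std::collections::HashMap;

struct Coin { value: u64 }

// Global state reduced to: payee address -> that account's unique Coin.
fn deposit(state: &mut HashMap<String, Coin>, payee: &str, coin: Coin) {
    let Coin { value } = coin;   // 1. "Unpack": destroy the input Coin, keep its value
    let payee_coin = state
        .get_mut(payee)          // 2. "BorrowGlobal": a &mut Coin under the payee's address
        .expect("payee has no Coin resource");
    payee_coin.value += value;   // 3. credit the payee's Coin
}

fn main() {
    let mut state = HashMap::from([("payee".to_string(), Coin { value: 5 })]);
    deposit(&mut state, "payee", Coin { value: 10 });
    println!("payee now holds {}", state["payee"].value);
}
```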

4.2.4 Implementing withdraw_from_sender

This procedure:

1. Acquires a reference to the unique resource of type Coin published under the sender’s account.
2. Decreases the value of the referenced Coin by the input amount.
3. Creates and returns a new Coin with value amount.

Some highlights:

  • Deposit can be called by anyone, but withdraw_from_sender has access control so that it can only be called by the owner of the coin
  • GetTxnSenderAddress is similar to Solidity’s msg.sender
  • RejectUnless is similar to Solidity’s require. If this check fails, execution of the current transaction script halts and none of the operations it performed will be applied to the global state
  • Pack<T>, also a builtin procedure, creates a new resource of type T
  • Like Unpack<T>, Pack<T> can only be invoked inside the declaring module of resource T
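
And a matching sketch of withdraw_from_sender (again illustrative Rust, not Move), with assert! standing in for RejectUnless and a plain struct literal standing in for Pack:

```rust
use std::collections::HashMap;

struct Coin { value: u64 }

fn withdraw_from_sender(state: &mut HashMap<String, Coin>, sender: &str, amount: u64) -> Coin {
    // The sender address would come from the transaction itself
    // (GetTxnSenderAddress), which is what gives this procedure its access control.
    let sender_coin = state.get_mut(sender).expect("sender has no Coin resource");
    // "RejectUnless": abort if the sender cannot cover the amount.
    assert!(sender_coin.value >= amount, "insufficient funds");
    sender_coin.value -= amount; // decrease the sender's stored Coin
    Coin { value: amount }       // "Pack": create and return a new Coin
}

fn main() {
    let mut state = HashMap::from([("sender".to_string(), Coin { value: 50 })]);
    let coin = withdraw_from_sender(&mut state, "sender", 20);
    println!("withdrew {}, sender keeps {}", coin.value, state["sender"].value);
}
```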

Wrap up

Now you have an overview of the main characteristics of Move, how it compares to Ethereum, and a feel for its basic syntax.

Lastly, I highly recommend reading through the original white paper. It includes a lot of detail about the design principles behind the language, as well as many great references.

Thank you so much for your time reading. Feel free to share this with someone who might be interested :) Any suggestions are also welcomed!!

 

from: https://medium.com/coinmonks/whitepaper-deep-dive-move-facebook-libra-blockchains-new-programming-language-7dbd5b242c2b

 

***

 

https://www.bgp4.com/wp-content/uploads/2019/06/libra-move-a-language-with-programmable-resources.pdf -OR- https://developers.libra.org/docs/assets/papers/libra-move-a-language-with-programmable-resources.pdf

 

 

Cortex Launches Deep Learning and AI Network for Decentralized Apps

Cortex claims that this is the first time that artificial intelligence has been introduced to a crypto network at scale.

Cortex has launched a network for decentralized apps powered by artificial intelligence (AI), according to a news release published on June 26.

The company claims this is the first time that AI has been introduced to a crypto network at scale. It is hoped the technology will be used to generate credit reports for the decentralized finance industry and facilitate anti-fraud reporting for exchanges — and Cortex believes the gaming and eSports sector could also benefit from a “diverse range of use cases.” Cortex CEO Ziqi Chen said:

“In the near future, we expect to see stablecoins based on machine learning, decentralized decision making, malicious behavior detection, smart resource allocation, and much more. These are challenges that all intersect with crypto networks, where having trained AI models that are accessible on-chain will prove to be extremely valuable.”

Looking ahead, Cortex says it plans to work with developers to implement AI dApps on its network, and deliver on-chain machine learning to networks beyond Ethereum.

Earlier in June, the European Union announced plans to increase the amount of data that can be reused as raw material for AI and blockchain projects.

An AI-powered index tracking the 100 strongest-performing crypto coins and tokens was also recently added to Reuters and Bloomberg trading terminals.

from: https://cointelegraph.com/news/cortex-launches-deep-learning-and-ai-network-for-decentralized-apps

***

Cortex Network Launches Mainnet to Democratize Deep Learning and AI

SINGAPORE — (BUSINESS WIRE) — Decentralized AI world computer Cortex has announced the successful launch of its technology platform. The network, designed for AI-powered dApps, brings deep learning models as artificial intelligence support to the blockchain ecosystem. The Cortex mainnet, which launched on June 26th following 15 months of intense development, marks the first time that AI has been introduced to a crypto network at scale.


The Cortex team has overcome a number of technical challenges to create, in the Cortex Virtual Machine (CVM), the means for AI models to be executed on-chain using a Graphics Processing Unit (GPU). This opens the door to a myriad of applications, including dApp and AI development. Trained AI models can be uploaded onto the storage layer of the Cortex chain before being incorporated into smart contracts by dApp developers.

“On-chain machine learning is an extremely complex endeavor due to the computational demands, and the need to create a virtual machine that is Ethereum Virtual Machine compatible. With the Cortex Virtual Machine, we’ve achieved a breakthrough that brings the benefits of artificial intelligence to a wider audience. Although dApp developers will be among the first beneficiaries of the Cortex mainnet, this is only the beginning. In time, we expect to develop a diverse range of use cases, all delivered on-chain,” said Cortex CEO Ziqi Chen.

These use cases will include generating credit reports for the burgeoning DeFi industry, and facilitating anti-fraud reporting for decentralized exchanges, P2P financing platforms, insurance, and cryptocurrency lending. Other potential applications include gaming, esports, and AI governance structures.

Explaining the rationale behind this latter concept, Ziqi Chen explains: “In the near future, we expect to see stablecoins based on machine learning, decentralized decision making, malicious behavior detection, smart resource allocation, and much more. These are challenges that all intersect with crypto networks, where having trained AI models that are accessible on-chain will prove to be extremely valuable.”

Cortex provides a mechanism for machine learning researchers to upload data models to the storage layer and monetize them: entities in need of AI models can run inferences against them after arranging payment in CTXC tokens. The Cortex mainnet has launched with 23 AI models available, trained on four datasets. The CVM is backward-compatible with the EVM and capable of running traditional smart contracts as well as AI smart contracts.

A detailed roadmap includes plans for the Cortex Foundation to work with dApp developers to implement AI dApps on the Cortex chain and plans to broaden cross-chain support to bring on-chain machine learning to networks beyond Ethereum. The Cortex team also intends to collaborate with academia and industry for research partnerships, and with publications to help further understanding of neural networks and their integration into the emerging field of blockchain.

To learn more visit: https://www.cortexlabs.ai/

from: https://www.businesswire.com/news/home/20190626005421/en

 

 

Ripple founder, Stellar co-founder and Mt. Gox founder Jed McCaleb is being sued for neglecting severe Mt. Gox security problems

Mt. Gox founder Jed McCaleb is being sued by two traders who used the doomed exchange, court documents filed on May 19 show.

Joseph Jones and Peter Steinmetz have accused the ex-CEO of fraudulently and negligently misrepresenting the exchange.

The pair also allege that McCaleb was aware of “serious security risks” back in late 2010 or early 2011 — more than three years before 850,000 bitcoin (BTC) was stolen in an audacious hack. Their complaint adds:

“Rather than secure the exchange, McCaleb sold a large portion of his interest in the then sole proprietorship, and provided avenues to the purchases to cover-up security concerns at the time without ever informing or disclosing these issues to the public.”

Both of the plaintiffs describe themselves as experienced cryptocurrency traders. They said they were reassured by McCaleb following a “dictionary attack” in 2011, where a fraudster stole coins after targeting accounts with weak passwords.

The court document alleges that 80,000 BTC was already missing at that time, and claims that McCaleb sold a majority of his interest in Mt. Gox to Mark Karpeles instead of staying to repair the security issues.

While Jones said he owned 1,900 BTC at the time of Mt. Gox’s bankruptcy in February 2014 (worth $24 million at press time), Steinmetz said he owned 43,000 BTC — crypto that would be worth more than $542 million at today’s rates. Both men are still in pursuit of their lost funds, and say they would not have used Mt. Gox had they known about the “significant security concerns” that existed in 2011.

In April, Mt. Gox rehabilitation trustee Nobuaki Kobayashi successfully petitioned a Japanese court to extend the deadline for the submission of rehabilitation plans to October 2019.

Meanwhile, back in March, former CEO Mark Karpeles was given a suspended jail sentence after being found guilty of tampering with financial records.

Mt. Gox was once the world’s biggest crypto exchange, and McCaleb later went on to become the founder of Ripple and the co-founder of Stellar.

Mobile apps are more lucrative on Apple

Android dominates the smartphone market and is by far the most widely used mobile operating system overall.

Even so, Apple is still clearly the more lucrative platform for app publishers, as a recent report by Sensor Tower shows. According to the report, the 100 largest iOS app publishers generated an average of $84 million in revenue in the first quarter of 2019, compared with $51 million for the most successful Android app publishers.

In total, Apple has paid out more than $120 billion to developers so far (as of January 2019), including $60 billion in the last two years.

from: https://de.statista.com/infografik/18480/durchschnittlicher-bruttoumsatz-der-top-100-app-publisher/

***

Apple: Spotify is arguing with false figures
In mid-March, Spotify founder Daniel Ek filed a complaint with the EU Commission asking it to open proceedings against Apple. The reason: Apple charges providers a 30 percent fee for purchases made through the App Store, for example when a Spotify customer upgrades a free subscription to Premium. Apple now accuses the streaming provider of knowingly arguing with false figures. The 30 percent does not affect all users, Apple says, but only those who took out their subscription between 2014 and 2016 (around 680,000 customers). Moreover, their fee was only 15 percent. spiegel.de

 

 

Headphones hanging in front of an Apple iPhone 5s displaying the logo of the music streaming service Spotify, Berlin, March 17, 2014 (photo: Daniel Bockwoldt/dpa).

Dispute over subscription costs: Apple defends itself against Spotify’s accusations

Apple is hitting back at Spotify: the company says it does not collect the inflated commissions from the streaming service’s customers that Spotify CEO Daniel Ek claims it does. According to an internal document, Ek is using false figures.

Daniel Ek raised serious accusations. Apple is a competitor of his company, wrote the founder and CEO of the music streaming service Spotify, and that is a good thing. “But Apple still gives itself an advantage at every opportunity,” Ek complained in mid-March. That is why Spotify has filed a complaint with the EU Commission.

As justification, Ek claimed that Apple levies a “tax” of 30 percent on purchases made through Apple’s payment system, for example when Spotify users switch from a free to a paid Premium account. This, he said, forces Spotify to “artificially inflate” its prices, well above what Apple charges for its own streaming service, Apple Music.

 

Daniel Ek, CEO of Swedish music streaming service Spotify, at a press conference in Tokyo on September 29, 2016, where Spotify kicked off its service in Japan (photo: Toru Yamanaka/AFP).

 

Apple is now defending itself against these accusations, and it accuses Spotify of knowingly operating with misleading figures. Spotify, it says, creates the impression that the 30 percent levy applies to all users of Apple devices. In fact, only about 680,000 users are affected, according to SPIEGEL’s information on Apple’s response to the EU Commission, which arrived in Brussels at the end of May.

Apple: Spotify is operating with misleading figures

The 30 percent commission, Apple says, was charged only to those Spotify customers who upgraded their subscription from free to Premium via Apple’s in-app purchase function. That function was active in the Spotify app only from 2014 to 2016, and in that period only 680,000 customers made use of it. For all other subscription upgrades, before and after, Apple says it did not collect a cent.

According to its most recent earnings report, Spotify had around 100 million paying users worldwide at the end of the first quarter of 2019, an increase of 32 percent year over year. Apple Music currently has a good 50 million customers, but it has recently been growing faster than its Swedish competitor. Europe is Spotify’s most important region. In the US, Apple has apparently overtaken Spotify.

Spotify leaves inquiries unanswered

Even for the 680,000 affected Spotify subscriptions, Apple apparently charges not 30 percent, contrary to what Ek writes in his blog, but only half that. Some time ago, the company lowered its commission for subscription customers: after one year of membership it drops from 30 to 15 percent. Since those 680,000 Spotify users took out their subscriptions three to five years ago, Spotify now has to hand over only 15 percent for them, according to Apple.

Why Ek nonetheless claims that Apple still charges a commission today, and that it amounts to 30 percent, is unclear. Spotify did not respond to several inquiries from SPIEGEL.

In his blog post, Ek acknowledged that Spotify can avoid the fee by not using Apple’s own payment function. But in that case, he says, Apple makes communication between Spotify and its customers more difficult, blocks app updates, or keeps Spotify away from products such as the Siri assistant, the HomePod connected speaker and the Apple Watch. Apple rejects these claims as untrue.

Vestager recalls the billions in fines against Google and Microsoft

The decisive question in the EU Commission’s investigation is now whether Apple’s App Store is a dominant platform that could influence the entire music streaming market, and whether Apple favors its own streaming service. “We have a platform that directs customers to various providers, and then the platform starts doing that business itself, becoming a provider in its own right,” EU Competition Commissioner Margrethe Vestager said of the App Store in March. That, she said, is a pattern “we already know.” It was an allusion to the multibillion-euro fines imposed on Google and Microsoft.

Apple considers this comparison wrong, if only because the iPhone holds just a 25 percent share of the smartphone market in the EU. Nearly all the rest goes to phones running Google’s Android operating system. Moreover, the company argues, Apple Music is not dominant in the streaming market either.

When asked, the EU Commission declined to comment on the duration and status of the proceedings initiated by Spotify’s complaint.

from: https://www.spiegel.de/netzwelt/netzpolitik/spotify-beschwerde-bei-eu-kommission-apple-wehrt-sich-a-1273755.html

 

***

Any transaction that Apple processes for you, i.e. any direct “in-app purchase,” will be subject to the 30% transaction fee.
Essentially, anything that can be delivered via the app as “digital content” is taxable.

If you have a basic eCommerce app where you sell physical products but you yourself process the payments, Apple will not take 30%.

from: https://www.startups.com/community/questions/381/does-the-30-apple-transaction-fee-apply-to-physical-goods-purchased-on-an-app

***

According to Apple’s official guidelines:

If you want to unlock features or functionality within your app (by way of example: subscriptions, in-game currencies, game levels, access to premium content, or unlocking a full version), you must use in-app purchase. Apps may use in-app purchase currencies to enable customers to “tip” digital content providers in the app. Apps and their metadata may not include buttons, external links, or other calls to action that direct customers to purchasing mechanisms other than in-app purchase.

You must use in-app purchases and Apple’s official APIs if it’s not a physical item.

Otherwise your app will be rejected.

from: https://stackoverflow.com/questions/48058415/is-there-a-way-to-avoid-in-app-30-fee-for-any-purchases-in-ios

***

In-App Purchases. What you need to know before developing a Mobile App

So you’re building an iOS app. Great! Let’s get to the brass tacks; how are you going to make money on it? Will there be some kind of purchasing ability within the app?

If your app is going to be anything like the majority of the 1.6 million apps in the App Store, whose in-app purchases account for nearly $24 billion annually, you need to know how purchasing works on iOS.

In-App Purchases vs. Apple Pay:

Apple has built two ways to pay for things directly into iOS: Apple Pay and In-App Purchase (IAP). Apple Pay is similar to a credit card transaction: it takes a small percentage of the transaction, plus a flat fee. IAPs use the iTunes Store purchasing system, and therefore take a 30% cut on all purchases, whether they are one-time purchases or subscriptions.

Based on the fee structure alone, it sounds like you’d be a fool to not go with Apple Pay. Well, the plain truth is you can’t use Apple Pay everywhere. In fact, unless you fall into a few specific use cases, you can’t use Apple Pay, or any other payment processor, at all.

Taking a Deeper Look at IAP – It’s important!

Regardless of whether you’re a CEO or a developer, do yourself a favor and read up on the purchasing guidelines for IAP and the ones for Apple Pay too. It’s important to understand the specific rules, so you don’t find yourself crashing into a brick wall later.

The gist is: any time you offer new or renewing content that users will pay for in-app (like news articles), those purchases must be processed via IAP. Similarly, if you want to restrict some functionality, such as “Pro” features, that must be IAP. Finally, if you want to sell tokens/credits/gold coins/gems or whatever as consumables in a game or other service, they also must go through IAP.

One of the toughest decisions to make is whether or not to process subscription sign-ups through your app – or somewhere else like your website (more on that later). If you do decide to allow purchases within the app, then those must be through IAP too.

Given the fact that every In-App Purchase gives Apple a 30% cut, it can throw a really big wrench in your business plan if you aren’t expecting it.

What Doesn’t Fall in Apple’s In-App Purchase Policy:

Ok, when can you avoid using IAP? The simple answer is, when you’re selling physical goods and services.

My favorite example is Uber or Lyft. They can have their own credit card processing system (or Apple Pay) because the customer is paying for an actual ride from one place in the real world to another. When the customer purchases something from Amazon, they are buying a physical product, so Amazon can use its own payment system as well. However, you will notice that you cannot buy books in the Amazon Kindle app. You can download samples and add to a wish list, but money does not change hands in the Kindle app.

Curiously, you can buy a Kindle book in the normal Amazon store app, using Amazon’s own payment processing. I don’t know if Amazon worked out a special deal with Apple, or if they just snuck it in there. When you’re Amazon’s size, you get a little bit of leeway.

The trouble is 1) there aren’t a lot of businesses that offer these types of products or services that transact on mobile applications, and 2) it may not be immediately obvious that your pricing model falls under the IAP umbrella. If there is any doubt, submit your application for review as early as possible to validate this. It is far easier to adjust a business model months before launch than hours.

What about Software-as-a-Service businesses?

If you sell a SaaS subscription within an iOS app, it’s just another subscription in Apple’s eyes; you have to use an IAP subscription and give Apple 30%.

You can however, sell the subscription on your website and still have a companion iOS app. Take a look at the Basecamp iOS app:

https://apps.apple.com/us/app/basecamp-3/id1015603248

This is the first screen you see when you launch the app. There are no IAPs for their subscription model. Instead, Basecamp has you sign up and pay for their subscription service on their website, not in their mobile app.

https://basecamp.com/via

So you’re saying there is a loophole?

Yes. Well, maybe… You can certainly avoid paying the 30% fee for IAP, but there is a Shaq-sized catch. You CANNOT advertise anywhere in an app that you are selling something outside of iOS.

This is a pretty tough decision to make, as it has implications not only for product development, but also user acquisition, engagement and retention.

Originally, Netflix did not allow you to sign up for its $10/month plan inside its iOS app, instead forcing every user to activate their account online. They held steady on that for a long time, opting to avoid the IAP fees at the cost of a clunkier user experience. That was until they determined that the extra signups gained from the convenience of activating subscriptions right there in the app outweighed the 30% hit on revenue.

I cannot stress enough that you will be rejected if you link to a website that displays a payment form. Even if you link to your homepage, and that links to a payment page, you’ll be rejected. Notice that there is no link to basecamp.com in that screenshot above. (Bonus tip: if you have a link to an Android version on that homepage, you’ll also get rejected. Life is fun sometimes.)

What this means for your business. And your app.

All too often we have a tendency to rush into things. Apps, and software development, are notoriously hard to estimate regardless. There is always an unknown wrench that will be thrown into your plans, but your business model and how you scale revenue should always be in your hands.

Apple’s IAP policy might seem a little imperious, and it is. Apple has $528 billion reasons that allow them to get away with it though and they won’t be changing anytime soon. With a little bit of foreknowledge your business and your development plan can adapt.

from: https://blog.tallwave.com/2016/04/13/in-app-purchases-what-you-need-to-know-before-developing-a-mobile-app

 

***

[Google is no different: it also has a 30% cut in its Google Play Store]

Opinion: Google’s 30% cut of Play Store app sales is nothing short of highway robbery

Congratulations: You’ve finally developed your million-dollar app. You took a great idea, implemented it, built it into a polished UI, and tested it until you tracked down every last bug. Now it’s ready for public release, so you can sit back, relax and … earn just 70% of what users pay for your software? That doesn’t sound right. Yet it’s a position that mobile app developers everywhere find themselves in, one that’s perched somewhere on the intersection between wildly unfair and mild extortion.

As you’re probably aware, Google takes a 30% cut of all software sales going through the Play Store — that counts for both for the initial sale of apps, as well as any supplementary in-app purchases. In the context of the industry, this practice doesn’t seem too outlandish; Apple does the same thing with iOS software distribution through its App Store, and we see similar arrangements in the PC sphere on platforms like Steam.

But just because it’s commonplace, does that mean it’s fair, or even right? How did we get to this place where paying a developer 70 cents on the dollar for their hard work seems OK?

Back before the days when software distribution was primarily online, developers had it a lot worse. First you had to find a publisher, who was going to want their cut. Then you had the cost of physical media to consider, as well as designing and manufacturing some attractive packaging. You had to pay to ship your software to stores, and to even get it on shelves meant giving retailers their slice of the pie. And of course, with all these parties involved and them wanting to ensure as high sales as possible, you’d probably also be paying for an expensive advertising campaign.

In the end, the developer would be very lucky to end up with even 20% of the ultimate sale price (and forget about that if we’re talking console games, with royalties to the console manufacturer knocking things under 10% easily).

But that’s not the world we live in today, and so many of those costs have either seriously diminished or become irrelevant altogether. There’s no need to fight for retailer shelf space, no unsold merchandise taking up space in warehouses, and no need to pay so many middlemen along the way — heck, why even bother with a publisher when you can be a one-man app studio yourself?

from: https://www.androidpolice.com/2018/09/22/opinion-googles-30-cut-play-store-app-sales-nothing-short-highway-robbery/

 

Buying software used to mean a trip to the mall, with retailers and distributors taking a big cut. Now with digital sales, is Google’s 30% take still fair? (Image: Mike Mozart)

 

 

 

Facebook’s Libra: “It would make the early 20th century Morgans or Rockefellers seem downright competitive.”

Standard Oil depicted as an Octopus in a 1904 political cartoon
(image via Wikimedia Commons).

Facebook’s Libra Cryptocurrency: Bad for Privacy, Bad for Competition

Author Scott A. Shay is co-founder and chairman of Signature Bank of New York and also the author of “In Good Faith: Questioning Religion and Atheism” (Post Hill Press, 2018).

Allowing Facebook to mint its own coin, the Libra, would turn it into the greatest anti-competitive trust case in history. It would make the early 20th century Morgans or Rockefellers seem downright competitive.

Even before it unveiled its vision for a global cryptocurrency this month, Facebook was already a near-monopoly in social media, and part of a duopoly in its main markets. Together with Google, it controls 82% of the digital advertising market. 

In the past, Facebook has purchased any company that threatened it, e.g. Instagram and WhatsApp. And, when it spots a company that won’t sell itself or would be difficult to purchase, it uses the “embrace, enhance and extinguish” technique.  

Facebook saw Snap Inc. (maker of Snapchat) contesting a small part of its franchise, so it embraced Snap’s best features and integrated them into its app. Now, Facebook is hoping to extinguish Snap as a competitor. Compare the stock performance of Snap and Facebook, and you will probably place your bet on Facebook.

But it is not simply Facebook’s business practices that are of concern.

Neither Facebook nor Google charges for their consumer products, obscuring the fact that all-encompassing consumer tracking is their real product. In many cases, their data is better than what the KGB or CIA could have gathered 20 years ago. And their data is certainly a lot cheaper, since it is voluntarily provided and easily accessible.

We would not want our government agencies to have this sort of power, nor should we want it to be in the hands of corporations. 

Facebook and Google have already shown their political muscle. With their duopoly in digital advertising, these companies have transformed the nature of news. Only a few news sites, such as The Wall Street Journal and The New York Times, can resist their gravitational pull and still attract direct advertisers as well as subscribers.

Most other publications must use Google ads, which provide far less revenue to the outlet, slice and dice their readership, and push newspapers toward clickbait. Ads are targeted so precisely because of the mountain of information that can be fed into these companies’ algorithms. The same holds true for news content viewed on Facebook.

Now, with the Libra project, Facebook wants to exponentially increase its monopolistic power by accessing unparalleled information about our consumer purchasing habits. If allowed to proceed with Libra, a company that knows your every mood and virtually controls the news you see will also have access to the deepest insights into your spending patterns.

Privacy threat

Of course, Facebook will speak piously about privacy controls and its concern for the consumer, yet it will still find a way to sell the data, or the parties who buy that data will find one for it.

Furthermore, given the richness of the social media data Facebook consistently garners, even anonymized data can be recalibrated to distill information and preferences tied to specific individuals. Facebook and its monopolist rent-seeking cohorts, such as eBay, Uber and Mastercard, all say they won’t do that.

Quite frankly, there is zero reason to believe such promises. Their culture is based strictly on brand concerns and access to personal data. Additionally, hacks of social media are now so common that we are inured to them.

Consumers can have the benefit of a digital payment mechanism without allowing Facebook to gain more power. In the financial services sector, my institution, Signature Bank, was the first to introduce a 24/7 blockchain-enabled payment system. As one would expect, others, such as JPMorgan, are trying to follow suit and will no doubt be competitors someday.

Banks and financial institutions are limited in their access to, and transmission of, information, and for good reason. If Facebook, on the other hand, establishes Libra, no other competitor will have equal access to its data, and therefore, a chance at the consumer payment market.

In this way, Libra is in keeping with Facebook’s monopolistic business style.

Further, the information monopoly Facebook would possess will be similar to what the Chinese government possesses but needs the Great Firewall to execute. Monopolistic forces will produce the same result through different means.

Call to action

Action needs to be taken quickly to stop Libra and break up Big Tech, not only for the welfare of consumers but for the good of the nation.

The first step is to force Facebook to divest or spin off Instagram, WhatsApp and Chainspace, the blockchain startup it acqui-hired earlier this year.

Facebook also must be mandated to offer a parallel, ad-free, “no collection of information” site supported by fee-based subscriptions. Over time, this would provide some transparency as to the value of the consumer information currently being gifted to Facebook.

Google should be forced to divest or spin off YouTube, DoubleClick and other advertising entities, its cloud services and Android. Amazon similarly needs a radical breakup, as it too poses systemic threats to a transparent market. (Alexa is a prime example of the private data Amazon gathers on users’ lifestyles and personal habits.)

The breakup of these behemoths cannot wait until after the 2020 election.  Such action must be taken on a bipartisan basis as soon as possible.

Even once stripped down, Facebook should remain separated from commerce due to privacy concerns. Congress, which has scheduled hearings on Libra for next month, is right to intervene.

 

from: https://www.coindesk.com/facebooks-libra-cryptocurrency-bad-for-privacy-bad-for-competition

 

 

SWIFT Gives Blockchain Platforms Access to ‘Instant’ GPI Payments Following R3 Trial

[needless to say: while it bears the names “DLT” and “Blockchain”, it has little to do with either]
The firm said 55 percent of SWIFT cross-border payments are now being made over GPI, a payments flow worth over $40 trillion.

Jun 24, 2019 at 13:30 UTC

Global interbank messaging giant SWIFT has revealed it will allow blockchain firms to make use of its Global Payments Innovation (GPI) platform for near real-time payments.

In a report published late last week, SWIFT said that, following a successful proof-of-concept with R3’s Corda platform, it would “soon be enabling gpi payments on DLT [distributed ledger technology]-based trade platforms.”

Saying that [SWIFT’s] GPI would resolve the “payment challenges” faced by DLT platforms, the firm explained that payments using the system would be initiated within trade workflows and be automatically sent on to the banking system.

Launched in early 2017, GPI was created as a set of business rules encoded on top of the firm’s [SWIFT’s] existing infrastructure as a means to increase the speed, transparency and traceability of transactions.

In the report, the firm said 55 percent of SWIFT cross-border payments are now being made over GPI, a payments flow worth over $40 trillion.

“Half of them are reaching end beneficiary customers within minutes, and practically all within 24 hours,” the report stated, further predicting that all cross-border SWIFT payments will be made over GPI “within two years.”

At the launch of the proof-of-concept back in January, SWIFT explained that the trial would connect the GPI Link gateway with R3’s Corda platform to monitor payment flows and support application programming interfaces (APIs), as well as SWIFT and ISO standards.

Commenting on the SWIFT news, Charley Cooper, managing director at R3, said:

“We’re proud to be pioneering this work with SWIFT on the GPI Link initiative and R3’s Corda Settler. SWIFT’s intention to expand blockchain access to its GPI Link is an important step towards enterprise blockchain adoption and Corda is already a leader in this space. The ability for firms utilising enterprise blockchain applications to settle off chain using existing, established and trusted payment networks, like SWIFT, allows firms to access the efficiency gains from blockchain while reducing the friction in crossing between on-chain and pre-existing payment systems.”

It’s also notable that R3 began testing its Corda Settler payments engine with XRP, the native cryptocurrency of Ripple, prompting a frisson of excitement among Ripple supporters. In any case, R3 has made clear from the start that the technology was always designed to be interoperable with a variety of payments systems.

 

from: https://www.coindesk.com/swift-gives-blockchain-platforms-access-to-instant-gpi-payments-following-r3-trial

 

 

An AI “Vaccine” Can Block Adversarial Attacks

Virtual Vaccine

For as smart as artificial intelligence systems seem to get, they’re still easily confused by hackers who launch so-called adversarial attacks — cyberattacks that trick algorithms into misinterpreting their input data, sometimes to disastrous ends.

In order to bolster AI’s defenses from these dangerous hacks, scientists at the Australian research agency CSIRO say in a press release they’ve created a sort of AI “vaccine” that trains algorithms on weak adversaries so they’re better prepared for the real thing — not entirely unlike how vaccines expose our immune systems to inert viruses so they can fight off infections in the future.

Get Your Shots

CSIRO found that AI systems like those that steer self-driving cars could easily be tricked into thinking that a stop sign on the side of the road was actually a speed limit sign, a particularly dangerous example of how adversarial attacks could cause harm.

The scientists developed a way to distort the training data fed into an AI system so that it isn’t as easily fooled later on, according to research presented at the International Conference on Machine Learning last week.

“We implement a weak version of an adversary, such as small modifications or distortion to a collection of images, to create a more ‘difficult’ training data set,” Richard Nock, head of machine learning at CSIRO, said in the press release. “When the algorithm is trained on data exposed to a small dose of distortion, the resulting model is more robust and immune to adversarial attacks.”

from: https://futurism.com/the-byte/ai-vaccine-block-adversarial-attacks

***

Researchers from CSIRO’s Data61, the data and digital specialist arm of Australia’s national science agency, have developed a world-first set of techniques to effectively ‘vaccinate’ algorithms against adversarial attacks, a significant advancement in machine learning research.

Algorithms ‘learn’ from the data they are trained on to create a machine learning model that can perform a given task effectively without needing specific instructions, such as making predictions or accurately classifying images and emails. These techniques are already widely used, for example to identify spam emails, diagnose diseases from X-rays and predict crop yields, and they will soon drive our cars.

While the technology holds enormous potential to positively transform our world, artificial intelligence and machine learning are vulnerable to adversarial attacks, techniques that fool machine learning models by feeding them malicious input and causing them to malfunction.

Dr Richard Nock, machine learning group leader at CSIRO’s Data61 said that by adding a layer of noise (i.e. an adversary) over an image, attackers can deceive machine learning models into misclassifying the image.

“Adversarial attacks have proven capable of tricking a machine learning model into incorrectly labelling a traffic stop sign as a speed sign, which could have disastrous effects in the real world.

“Our new techniques prevent adversarial attacks using a process similar to vaccination,” Dr Nock said.

“We implement a weak version of an adversary, such as small modifications or distortion to a collection of images, to create a more ‘difficult’ training data set. When the algorithm is trained on data exposed to a small dose of distortion, the resulting model is more robust and immune to adversarial attacks,”

In a research paper accepted at the 2019 International Conference on Machine Learning (ICML), the researchers also demonstrate that the ‘vaccination’ techniques are built from the worst possible adversarial examples, and can therefore withstand very strong attacks.
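To make the idea concrete, here is a minimal sketch of this kind of “vaccination” in Python. It is an illustration only: the ICML paper derives its perturbations from worst-case adversaries, whereas this toy version merely augments the training set with small, bounded random distortions as a stand-in for a “weak adversary”.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def weak_adversary(X, epsilon=0.05, seed=0):
    """Apply a small, bounded distortion to each sample (features scaled to [0, 1])."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-epsilon, epsilon, size=X.shape)
    return np.clip(X + noise, 0.0, 1.0)

def vaccinated_fit(X, y, epsilon=0.05):
    """Train on the original data plus weakly distorted copies of it."""
    X_aug = np.vstack([X, weak_adversary(X, epsilon)])
    y_aug = np.concatenate([y, y])
    model = LogisticRegression(max_iter=1000)
    model.fit(X_aug, y_aug)
    return model
```

The claim in the press release is that a model trained this way is harder to flip with stop-sign-style perturbations than one trained on clean images alone.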

Adrian Turner, CEO at CSIRO’s Data61 said this research is a significant contribution to the growing field of adversarial machine learning.

“Artificial intelligence and machine learning can help solve some of the world’s greatest social, economic and environmental challenges, but that can’t happen without focused research into these technologies.

“The new techniques against adversarial attacks developed at Data61 will spark a new line of machine learning research and ensure the positive use of transformative AI technologies,” Mr Turner said.

CSIRO recently invested AU$19M into an Artificial Intelligence and Machine Learning Future Science Platform, to target AI-driven solutions for areas including food security and quality, health and wellbeing, sustainable energy and resources, resilient and valuable environments, and Australian and regional security.

Data61 also led the development of an AI ethics framework for Australia, released by the Australian Government for public consultation in April 2019.

The research paper, Monge blunts Bayes: Hardness Results for Adversarial Training [pdf · 2mb], was presented at ICML on 13 June in Los Angeles.

from: https://www.csiro.au/en/News/News-releases/2019/Researchers-develop-vaccine-against-attacks-on-machine-learning

***

click on the paper to open the local copy of the PDF

 

 

The price (in $$) of personal data in the USA

Data in exchange for free use.

That, in short, is the deal users of social networks sign up to. According to an NBC News/Wall Street Journal survey from March 2019, 74 percent of Americans feel this is not a fair trade, and a comparable survey in Germany would likely come out much the same.

But what would a good deal look like? The polling firm Morning Consult put that question to 2,200 adults in the USA.

  • For information such as their full name or purchasing behavior, respondents would ask for 50 US dollars.
  • Credit scores and driver’s license numbers would cost 300 and 500 US dollars respectively.
  • For a passport number or biometric data, companies would have to pay 1,000 US dollars.

 

from: https://de.statista.com/infografik/18449/umfrage-zum-preis-fuer-personenbezogene-daten-in-den-usa/

 

 

Kodak Reveals New Blockchain-Based Document Management System: 40% Cost Savings

Kodak Services for Business unveiled a blockchain-based document management platform during a two-day conference in New York, according to a news release published on June 5.

The company says the technology enables businesses and governments to better manage sensitive documents and keep them secure — automating workflows and archiving to ensure records can be accessed in real time.

According to Kodak, this system can help organizations achieve cost savings of up to 40% by improving productivity and preventing the loss of information.

Other products showcased during the event included Scan Cloud, which enables users to process data wherever they are.

“Smart Cities” were also discussed at the conference — a concept where cutting-edge technology is used to improve infrastructure and services in urban areas.

The company has used blockchain to establish databases before, and recently forged a partnership with RYDE Holding to build an image rights platform to protect copyright and help photographers monetize their works. Known as KodakONE, a limited beta test reportedly generated more than $1 million in licensing claims.

Back in February 2018, Kodak was forced to delay the launch of its KodakCOIN cryptocurrency in order to evaluate the status of potential investors — a day before a planned initial coin offering was due to start.

Last month, the Polish-British fintech firm Billon secured a $2.1 million grant from the European Commission to further the development of its own blockchain document management system.

from: https://cointelegraph.com/news/kodak-reveals-new-blockchain-based-document-management-system

The real risk of Facebook’s Libra coin is crooked developers

They’ll steal your money, not just your data.

Everyone’s worried about Mark Zuckerberg controlling the next currency, but I’m more concerned about a crypto Cambridge Analytica.

Today Facebook announced Libra, its forthcoming stablecoin designed to let you shop and send money overseas with almost zero transaction fees. Immediately, critics started harping about the dangers of centralizing control of tomorrow’s money in the hands of a company with a poor track record of privacy and security.

Facebook anticipated this, though, and created a subsidiary called Calibra to run its crypto dealings and keep all transaction data separate from your social data. Facebook shares control of Libra with 27 other Libra Association founding members, and as many as 100 total when the token launches in the first half of 2020. Each member gets just one vote on the Libra council, so Facebook can’t hijack the token’s governance even though it invented it.

With privacy fears and centralized control issues at least somewhat addressed, there’s always the issue of security. Facebook naturally has a huge target on its back for hackers. Not just because Libra could hold so much value to steal, but because plenty of trolls would get off on screwing up Facebook’s currency. That’s why Facebook open-sourced the Libra Blockchain and is offering a prototype in a pre-launch testnet. This developer beta plus a bug bounty program run in partnership with HackerOne is meant to surface all the flaws and vulnerabilities before Libra goes live with real money connected.

Yet that leaves one giant vector for abuse of Libra: the developer platform.

“Essential to the spirit of Libra . . . the Libra Blockchain will be open to everyone: any consumer, developer, or business can use the Libra network, build products on top of it, and add value through their services. Open access ensures low barriers to entry and innovation and encourages healthy competition that benefits consumers,” Facebook explained in its white paper and Libra launch documents. It’s even building a whole coding language called Move for making Libra apps.

Apparently Facebook has already forgotten how allowing anyone to build on the Facebook app platform and its low barriers to “innovation” are exactly what opened the door for Cambridge Analytica to hijack 87 million people’s personal data and use it for political ad targeting.

But in this case, it won’t be users’ interests and birthdays that get grabbed. It could be hundreds or thousands of dollars’ worth of Libra currency that’s stolen. A shady developer could build a wallet that just cleans out a user’s account or funnels their coins to the wrong recipient, mines their purchase history for marketing data or uses them to launder money. Digital risks become a lot less abstract when real-world assets are at stake.

In the wake of the Cambridge Analytica scandal, Facebook raced to lock down its app platform, restrict APIs, more heavily vet new developers and audit ones that look shady. So you’d imagine the Libra Association would be planning to thoroughly scrutinize any developer trying to build a Libra wallet, exchange or other related app, right? “There are no plans for the Libra Association to take a role in actively vetting [developers],” Calibra’s head of product Kevin Weil surprisingly told me. “The minute that you start limiting it is the minute you start walking back to the system you have today with a closed ecosystem and a smaller number of competitors, and you start to see fees rise.”

That translates to “the minute we start responsibly verifying Libra app developers, things start to get expensive, complicated or agitating to cryptocurrency purists. That might hurt growth and adoption.”

You know what will hurt growth of Libra a lot worse? A sob story about some migrant family or a small business getting all their Libra stolen. And that blame is going to land squarely on Facebook, not some amorphous Libra Association.

 

Facebook’s own Calibra Wallet

 

Inevitably, some unsavvy users won’t understand the difference between Facebook’s own wallet app Calibra and any other app built for the currency. “Libra is Facebook’s cryptocurrency. They wouldn’t let me get robbed,” some will surely say. And on Calibra they’d be right. It’s a custodial wallet that will refund you if your Libra are stolen and it offers 24/7 customer support via chat to help you regain access to your account.

Yet the Libra Blockchain itself is irreversible. Outside of custodial wallets like Calibra, there’s no getting your stolen or mis-sent money back. There’s likely no customer support. And there are plenty of crooked crypto developers happy to prey on the inexperienced. Indeed, $1.7 billion in cryptocurrency was stolen last year alone, according to CipherTrace via CNBC. “As with anything, there’s fraud and there are scams in the existing financial ecosystem today . . .  that’s going to be true of Libra too. There’s nothing special or magical that prevents that,” says Weil, who concluded, “I think those pros massively outweigh the cons.”

Until now, the blockchain world was mostly inhabited by technologists, except for when skyrocketing values convinced average citizens to invest in Bitcoin just before prices crashed. Now Facebook wants to bring its family of apps’ 2.7 billion users into the world of cryptocurrency. That’s deeply worrisome.

 

Facebook founder and CEO Mark Zuckerberg arrives to testify during a Senate Commerce, Science and Transportation Committee and Senate Judiciary Committee joint hearing about Facebook on Capitol Hill in Washington, DC, April 10, 2018. (Photo: SAUL LOEB/AFP/Getty Images)

 

Regulators are already bristling, but perhaps for the wrong reasons. Democrat Senator Sherrod Brown tweeted that “We cannot allow Facebook to run a risky new cryptocurrency out of a Swiss bank account without oversight.”

https://twitter.com/SenSherrodBrown/status/1141039013916303361

 

And French Finance Minister Bruno Le Maire told Europe 1 radio that Libra can’t be allowed to “become a sovereign currency.”

https://www.bloomberg.com/news/articles/2019-06-18/france-calls-for-central-bank-review-of-facebook-cryptocurrency

 

Most harshly, Rep. Maxine Waters issued a statement saying, “Given the company’s troubled past, I am requesting that Facebook agree to a moratorium on any movement forward on developing a cryptocurrency until Congress and regulators have the opportunity to examine these issues and take action.”

Yet Facebook has just one vote in controlling the currency, and the Libra Association preempted these criticisms, writing, “We welcome public inquiry and accountability. We are committed to a dialogue with regulators and policymakers. We share policymakers’ interest in the ongoing stability of national currencies.”

That’s why as lawmakers confer about how to regulate Libra, I hope they remember what triggered the last round of Facebook execs having to appear before Congress and Parliament. A totally open, unvetted Libra developer platform in the name of “innovation” over safety is a ticking time bomb. Governments should insist the Libra Association thoroughly audit developers and maintain the power to ban bad actors. In this strange new crypto world, the public can’t be expected to perfectly protect itself from Cambridge Analytica 2.$.

 

from: https://techcrunch.com/2019/06/18/libra-analytica/

 

 

 

Austria launches the world’s first blockchain postage stamp

This is what the paper stamp looks like: on the left, the part that can be used as a normal stamp;
on the right, the concealed part with access credentials for an Ethereum wallet. (Screenshot: t3n)

The world’s first blockchain stamp is available in the onchain shop of the Österreichische Post

The Österreichische Post (Austrian Post) has launched the world’s first blockchain postage stamp. Our guest author took a closer look.

The so-called Crypto Stamp consists of two parts. The first part is a real paper stamp with a face value of 6.90 euros, which can be used like any other stamp. On top of that, buyers receive a second part, also made of paper, carrying two concealed text fields that can be scratched open. Underneath the scratch-off areas are the access credentials for an Ethereum wallet, i.e. a digital wallet on the Ethereum blockchain. Inside this wallet sits a token named “Crypto stamp Edition 1” with a unique token ID. This unique token can serve philatelists and crypto enthusiasts as a virtual collectible. The wallet also holds a small amount of 0.001666 ether, worth around 40 cents. This amount of the widely used cryptocurrency is presumably meant to let the buyer pay transaction fees on the blockchain should they want to transfer their virtual stamp to another Ethereum wallet.

The Crypto Stamp has a print run of 150,000 and can be bought in the usual way in retail outlets and in the online shop of the Österreichische Post. Not all stamps are sold conventionally, however: 500 of them are available exclusively in the so-called onchain shop of the Österreichische Post, i.e. directly on the blockchain. And that is where things get interesting.

Buying in the onchain shop

The onchain shop can be reached through an ordinary browser. Alternatively, the Ethereum address 0xC5BA58b8362a25b1ddB59E2106910B6c324A5668 can be used. Buying in the onchain shop requires the buyer to already have their own digital Ethereum wallet with some of the cryptocurrency ether in it.

The actual purchase of the crypto stamp is then handled by a smart contract on the Ethereum blockchain. The smart contract contains a set of rules that essentially follows this logic: if a user sends the current equivalent of 6.90 euros in ether from their digital wallet to the blockchain address of the smart contract, the contract in return sends a unique token of the type “Crypto stamp Edition 1” to the buyer’s digital wallet. With that, the purchase is complete and the virtual collectible has already been transferred to the buyer. The special thing about this kind of purchase: up to this point, no directly personal data had to be exchanged.
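The ruleset can be pictured roughly as follows. This is a hypothetical Python model of the logic described above, not the contract actually deployed by the Österreichische Post; only the supply of 500, the 6.90-euro price in ether and the token name come from the article.

```python
class CryptoStampShop:
    """Toy model of the onchain shop's rule: pay the asking price in ether,
    receive one unique "Crypto stamp Edition 1" token in return."""

    def __init__(self, supply=500):
        self.remaining = supply
        self.next_token_id = 1
        self.owners = {}  # token_id -> buyer's wallet address

    def buy(self, buyer_address, payment_eth, price_eth):
        # price_eth stands in for "the current equivalent of 6.90 euros in ether";
        # how the real contract fixes that exchange rate is not described in the article.
        if self.remaining == 0:
            raise RuntimeError("sold out")
        if payment_eth < price_eth:
            raise ValueError("payment below the asking price")
        token_id = self.next_token_id           # unique ID of the newly sold stamp token
        self.owners[token_id] = buyer_address   # token goes straight to the buyer's wallet
        self.next_token_id += 1
        self.remaining -= 1
        return token_id
```

Note that nothing in this exchange requires a name or an e-mail address, which is exactly the point made above.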

The shipping address is communicated directly via the blockchain

What is still missing at this point is the stamp in paper form, because that too is part of an onchain purchase, and the paper stamp is shipped worldwide. What is remarkable is how the Post obtains the buyer’s shipping address: the buyer has to visit the onchain shop with their digital wallet activated and enter their address into a form that at first glance looks like a normal web form. The data, however, is not sent to a server of the Österreichische Post but written directly, and immutably, into the blockchain. In other words, the communication happens over the blockchain itself. To avoid a data-protection breach at this point, the data is not stored on the blockchain in plain text but is encrypted with a public key of the Post before being stored.
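In principle, the address form therefore does something like the sketch below. The article does not say which encryption scheme the Post uses, so the RSA-OAEP example via the Python cryptography package is purely illustrative; the point is that only ciphertext lands on the chain, and only the holder of the Post’s private key can recover the address.

```python
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def encrypt_shipping_address(address: str, post_public_key_pem: bytes) -> bytes:
    """Encrypt the buyer's shipping address with the Post's public key before the
    ciphertext is written, immutably, into the blockchain (illustrative scheme only)."""
    public_key = serialization.load_pem_public_key(post_public_key_pem)
    return public_key.encrypt(
        address.encode("utf-8"),
        padding.OAEP(
            mgf=padding.MGF1(algorithm=hashes.SHA256()),
            algorithm=hashes.SHA256(),
            label=None,
        ),
    )
```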

Collect, collect, collect

Off the blockchain, the crypto section of the Österreichische Post website offers further features covering all 150,000 stamps, or rather their digital tokens. For example, the digital history of each token is publicly viewable: you can see which digital wallet holds which token, whether a token has been sold on the blockchain and, if so, to which address. The entire transaction history of the tokens can thus be looked up by anyone. What looks like a privacy violation at first glance is perfectly normal on the Ethereum blockchain, which serves as a kind of public ledger.

To make collectors’ hearts beat faster, the Österreichische Post website also lets owners display further metadata about their digital stamps online. The virtual crypto stamp exists in five different colors; the rarest, with 1,500 pieces, is the virtual red stamp.

 

The virtual version of the Crypto stamp exists in five different colors. (Screenshot: t3n)

 

Which color a given stamp has is determined by the token ID stored on the blockchain. Unfortunately, the metadata associated with a token ID is not stored on the blockchain itself but in a centralized database, reachable on the web at https://crypto.post.at/CS1/meta/1, where the number at the end of the URL corresponds to the token ID. The risk for collectors: should the Österreichische Post take these pages offline, that information effectively ceases to exist, and it could also be changed at any time. This rather contradicts the idea of a true crypto collectible and dampens the collecting fun somewhat.
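Looking up a stamp’s properties is therefore an off-chain request along the following lines. The URL pattern comes from the article; that the endpoint returns JSON describing attributes such as the color is an assumption.

```python
import requests

def fetch_stamp_metadata(token_id: int) -> dict:
    """Fetch the centrally hosted metadata for a given token ID.
    If the Post ever takes the endpoint offline, this data is simply gone."""
    url = f"https://crypto.post.at/CS1/meta/{token_id}"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.json()  # assumed to be JSON describing, e.g., the stamp's color
```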

In return, the smart contract of the onchain shop contains one more interesting feature: once only 100 stamps remain for sale in the onchain shop, the price of each of these last stamps is raised by a factor of 1.08. Concretely, that means the very last Crypto Stamp in the onchain shop should go over the virtual counter for roughly 13,000 euros. In theory, at least.
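As a rough sanity check on that figure (the article does not spell out exactly when the escalation kicks in, so the number of 1.08 steps below is an assumption):

```python
# If the price is multiplied by 1.08 for each stamp sold once only 100 remain,
# the very last one ends up near the quoted ~13,000 euros:
base_price_eur = 6.90
final_price_eur = base_price_eur * 1.08 ** 98   # with 1.08 ** 100 it would be closer to 15,000
print(round(final_price_eur))                   # -> roughly 13,000
```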

So: data junk or valuable collectible?

The answer lies in the eye of the beholder. For a price of 6.90 euros, buyers certainly get quite a lot. And one thing has to be acknowledged: the Österreichische Post spared no effort. Thanks to the access credentials, every paper stamp is an individually printed one-off with a virtual counterpart, and for many buyers it will also open up access to the cryptocurrency ether for the first time. Anyone now tempted to take their first steps on the blockchain needs, besides the Crypto Stamp, only a browser extension to reach the Ethereum network. A very common choice is the Metamask plugin, available for Chrome and Firefox, among others.

 

from: https://t3n.de/news/krypto-oesterreichische-post-1172058/

 

 

Upward trend for cryptocurrencies in the first half of 2019

According to coinmarketcap.com, there are currently 2,238 different cryptocurrencies with a combined value of over 280 billion US dollars, more than half of it accounted for by Bitcoin. Now Facebook has presented its own digital currency, the “Libra”. Because the world’s largest social network brings along a huge pool of potential users with its 2.3 billion active accounts, the project is being assigned great significance. Only a few years ago, crypto coins were at best something for internet nerds: in mid-2013 there were just 26 different digital currencies with a total value of 1.1 billion US dollars.

https://de.statista.com/infografik/12186/unterschiedliche-krypto-coins-und-marktkapitalisierung/

 

 

275 AI startups in Germany at a glance

Analysts at Appanion Labs have counted 275 startups in Germany that have artificial intelligence (AI) as the core element of their business model. By far the most important AI hub is Berlin with 102 startups, followed by Munich (50) and Hamburg (17). A little over a third of these young companies have no industry focus at all and instead offer solutions across all industries. Given that the media treat AI as one of the decisive technologies of the future, investment has so far been modest: around 1.2 billion euros in venture capital and M&A money has flowed into the startups covered by the study. Measured against the revenue potential of AI applications, that is not much (see below).

from: https://de.statista.com/infografik/18389/ki-startups-in-deutschland-nach-staedten-und-branchen/

 

 

Is artificial intelligence (AI) the future? There are two answers: yes, perhaps, if that means genuinely intelligent behavior; and yes, quite certainly, if it means advanced analytics and machine learning. In the latter sense, AI applications are already shaping the development of industries and sectors today. That is the conclusion of an analysis by the startup Appanion, which examined over 1,000 AI use cases. Already in the current year, artificial intelligence could play a part in around 221 billion euros of revenue in Germany, including 45.4 billion euros in automotive manufacturing alone.

from: https://de.statista.com/infografik/16992/umsatz-der-in-deutschland-durch-ki-anwendungen-beeinflusst-wird/

 

 
