AMII Conference

AMII

Randy:

  • Where does cutting edge AI come from?
    • Carnegie Mellon University
  • Federal AI Strategy Development
  • Ranking in AI and ML: CMU, Tsinghua, University of Alberta, Cornell University, Technion, MIT, UT Austin, HKUST, Berkeley, UMass, UMich, UCLA, Stanford, etc.
    • China’s AI strategy has moved Tsinghua from 10th to 2nd
  • Amii focuses its efforts on four areas
  • Pan-Canadian AI Strategy
    • $125 million across Canada to attract and retain academic talent
    • Recognizes Canada’s centers of AI excellence: Mila (Montreal), The Vector Institute (Toronto) and Amii (Edmonton)
    • First cohort of Canada CIFAR AI Chairs includes 29 researchers at institutions across Canada
  • Amii has diverse expertise in machine intelligence
    • Algorithmic game theory
    • Algorithms and theory
    • Bio/medical informatics
    • Data mining & analysis, etc.
  • Machine Intelligence Achievements
    • Pioneer of reinforcement learning
    • First group to beat poker pros at heads-up no-limit texas hold’em
    • Solved game of checkers
    • Academic origins of AlphaGo and the Atari Game Project
    • Developed UCT algorithm at the heart of many advancements in games
    • Thailand National Innovation Award for Tuberculosis Diagnosis
    • System capable of passing Japanese Bar exam
    • Open-source community centered around adaptive prosthetic limbs
  • Amii explores through world-class research and training
    • Amii Explores: pushes the bounds of scientific knowledge by enabling world-class research and training and facilitating knowledge, technology and talent transfer from academia to industry
    • Amii Educates to boost intelligence literacy: courses, workshops & seminars
    • Amii provides a range of meetups, events and educational opportunities to help drive machine intelligence literacy for Alberta workers and businesses—build, reskill, upskill, and retain machine intelligence expertise in Alberta
      • Management level courses
      • Technical level certification
      • Bespoke curriculum development
    • Amii Innovates: guide businesses on path to machine intelligence adoption by helping them develop strategies, shift processes, and systems and build in-house knowledge and teams
  • Connect with amii:
  • Q&A:
    • Will we catch up to China? No
    • China, US, Israel, & Saudi Arabia lead the world on collecting data/information
  • Model-free RL Atari game learning
    • Google DeepMind’s deep Q-learning (DQN) learns how to play the game after watching the screen (see the Q-learning sketch after this list)
  • Big Data Challenges
    • Square Kilometre Array
    • DNA transistor
    • Environmental monitoring (FLUXNET)
    • How are you able to determine what to keep / throw away
  • NCAA Football Clusters
    • Challenge: figure out the divisions in the NCAA
    • Run a clustering algorithm on the schedule (a clustering sketch appears after this list)
    • Model visualization changes based on what information you are trying to extract
  • Diagnosing ADHD
    • Machine learning classifier uses a participant’s resting-state fMRI scan to classify the individual into one of three categories: healthy control, ADHD combined type, ADHD inattentive type
    • Used the participant’s personal characteristic data: age, gender, handedness, performance IQ, verbal IQ, and full-scale IQ, together with the fMRI data (see the classifier sketch after this list)
  • Image-based glioma biopsy
    • Compute the S-transform frequency spectrum for each pixel, then average over all directions to get a 1D spectrum for each tumor
    • Build multi-frequency classifier based on these labeled cases
    • In the reported results, blue marks 30 cases with co-deleted 1p/19q and green marks 24 cases with intact 1p/19q
    • Results: 95-96% (classification accuracy)
  • Optimizing water treatment control
  • Use machine learning to find separators in the data
  • Natural language & approximate semantics: use the number of Google searches as a signal; also used in Google Translate
  • COLIEE Statute Law Challenge
    • Statute: if a person in order to allow a principal to escape imminent danger to the principal’s person, reputation or property, the Manager shall not be liable to compensate for damages…
    • Question: in cases where an individual rescues another person from getting hit by a car by…
  • Query case → case-law judgements → how to pick the ones that should be noticed, i.e., the ones a lawyer would notice in court
  • Why we need explanatory AI systems
  • Funny car cartoon on automation: “Does your car have any idea why my car pulled it over?”
  • MYCIN expert knowledge
  • Rich Sutton, weak versus strong methods
    • Methods that scale with computation are the future of AI
  • Learning a model to classify images
    • classifier to distinguish dogs and cats
  • Experiments
    • Word-deletion experiments: delete words and measure the change in the classifier’s prediction (see the word-deletion sketch after this list)
    • Green words = positive contribution to the prediction
    • Building a model of language to explain predictions
  • Summary
    • World leading research group
    • Highlights of game playing, reinforcement learning, NLP, XAI
    • Expanding capacity, building industrial collaboration function
    • Industrial projects with DeepMind, DeepMind Health, Google Brain, Mitsubishi, Borealis AI, Alberta Treasury Branch, VW Group
    • Eager to build sustainable industrial relationships to continually refresh “line of sight” to potential high impact applications
  • Q & A:
    • Generating automatic models instead of analytically building them
    • If you get two different explanations that contradict each other, what model do you create to reconcile the two?
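
Q-learning sketch (for the model-free RL Atari bullet above). DeepMind’s DQN uses a convolutional network over raw screen frames plus experience replay and a target network; the minimal tabular version below only illustrates the underlying Q-learning update, and the action set and hyperparameters are illustrative assumptions rather than details from the talk.

```python
import random
from collections import defaultdict

# Tabular Q-learning sketch. DQN replaces the table Q[(s, a)] with a neural
# network over screen frames; the update rule below is the same core idea.
ALPHA, GAMMA, EPSILON = 0.1, 0.99, 0.1     # assumed hyperparameters
ACTIONS = [0, 1, 2, 3]                     # e.g. joystick directions (assumed)
Q = defaultdict(float)                     # Q[(state, action)] -> estimated return

def choose_action(state):
    """Epsilon-greedy exploration: mostly act greedily, sometimes randomly."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

def q_update(state, action, reward, next_state, done):
    """One Q-learning step: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    target = reward
    if not done:
        target += GAMMA * max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (target - Q[(state, action)])
```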
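
Clustering sketch (for the NCAA football bullets above). The idea is to treat the schedule as a graph, with teams as nodes and games as edges, and let a clustering algorithm recover the division structure. The toy schedule and the choice of scikit-learn’s SpectralClustering are assumptions; the talk did not name a specific algorithm.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

# Toy "schedule": pairs of teams that played each other (made up for illustration;
# the real input would be a full NCAA schedule).
teams = ["A", "B", "C", "D", "E", "F"]
games = [("A", "B"), ("A", "C"), ("B", "C"),   # one tightly connected group
         ("D", "E"), ("D", "F"), ("E", "F"),   # another group
         ("C", "D")]                           # a rare cross-group game

index = {t: i for i, t in enumerate(teams)}
adj = np.zeros((len(teams), len(teams)))
for a, b in games:
    adj[index[a], index[b]] += 1
    adj[index[b], index[a]] += 1

# Teams that play each other often end up in the same cluster, which tends
# to recover the conference/division structure encoded in the schedule.
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(adj)
print(dict(zip(teams, labels)))
```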
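
Classifier sketch (for the ADHD diagnosis bullets above). The described setup combines resting-state fMRI features with personal characteristics and predicts one of three classes. Everything concrete below (feature dimensions, the random placeholder data, the logistic-regression model) is an assumption standing in for the real study pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Assumed inputs (shapes are illustrative):
#   fmri_features: (n_subjects, n_fmri_features) resting-state fMRI features
#   demographics:  (n_subjects, 6) age, gender, handedness, performance IQ,
#                  verbal IQ, full-scale IQ
#   labels:        0 = healthy control, 1 = ADHD combined, 2 = ADHD inattentive
rng = np.random.default_rng(0)
n = 120
fmri_features = rng.normal(size=(n, 200))
demographics = rng.normal(size=(n, 6))
labels = rng.integers(0, 3, size=n)

X = np.hstack([demographics, fmri_features])   # combine both feature groups
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print(cross_val_score(clf, X, labels, cv=5).mean())
```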
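
Word-deletion sketch (for the explainability experiments above). Each word is deleted in turn, the text is re-scored, and the drop in the positive score is attributed to that word; words with positive attribution are the "green" words. The toy keyword-count scorer is an assumption so the sketch is self-contained; the actual experiments would use a trained text classifier.

```python
# Word-deletion attribution sketch. The "scorer" below is a trivial keyword
# counter standing in for a real trained model; the delete-and-rescore loop
# is the point of the example.
POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"boring", "bad", "awful"}

def score(words):
    """Toy positive-sentiment score in [0, 1]."""
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.5 + 0.5 * (pos - neg) / max(len(words), 1)

def word_attributions(text):
    """Attribution of each word = drop in score when that word is removed."""
    words = text.lower().split()
    base = score(words)
    return {w: round(base - score(words[:i] + words[i + 1:]), 3)
            for i, w in enumerate(words)}

# Words with positive attribution (the "green" words) push the prediction up.
print(word_attributions("a great movie with an excellent but slightly boring plot"))
```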

 

Edmonton’s Artificial Intelligence Ecosystem: Research & Business

  • The path by which AI-focused businesses are created in Edmonton
  • AI is a transformative technology
  • AMII:
    • U of A’s Chinook challenges the world checkers champion
    • IBM’s Deep Blue defeats Kasparov
    • Reinforcement learning textbook author at U of A (Sutton)
    • DeepMind’s AlphaGo beats Lee Sedol
    • DeepStack beats Texas hold’em pros (Bowling)
    • DeepMind Opens Lab in Edmonton
    • AI Can Pass Japanese Bar Exam
    • csrankings.org → select AI and ML
  • Government commits $100 million to AI companies
  • Alberta’s AI/ML Business Strategy Plan: Research & Development, Talent, Infrastructure, Industry Solutions, Market Presence
  • Companies that focus on AI in Edmonton: Scope AR, AltaML, Testfire Labs, Serious Labs, Zept, ATB Financial, Promethean Labs, OneBridge, Telus, Jobber, Servus Credit Union, PCL, Vadu, NTWIST, DrugBank, Frettable, SAM, Health Gauge, Darkhorse Analytics
  • Four Key components to build a successful AI/ML company:
    • Technical talent & IP, product development & business skills, industry domain expertise, data = successful AI/ML companies
    • Alberta → architecture, engineering & construction, health, financial services & insurance, energy & utilities, transportation & logistics
  • CEO of a Japanese company needs to shut down a nuclear plant → sending in robots to decommission it
  • Trying to create an AI industry in the Edmonton area
  • Many of those who succeed at AI have a huge amount of data
  • Why Edmonton?
    • Great place to live
    • Vibrant quality of life → largest snowball fight in the world
    • City of Entrepreneurs
    • Economic leadership
    • Growing innovation community
    • Named the best city in Canada for youth to work
    • Tops in housing affordability
    • Top ranked research and educational institutions
  • Bruce Alton: [email protected]
  • Daylin Breen: [email protected]

TIL #5

  1. Life: Your closest friends can be your greatest disappointments
  2. Life: Never trust anyone who makes promises too easily
  3. Life: You get more when you put in more of your time and effort (clubs, classes, friendships)
  4. Life: Have empathy for others
  5. Life: Find ways to motivate yourself and improve yourself other than from those around you
  6. Data Structures: Maps and topological sorts can be used to determine course prerequisites (see the sketch below)
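
A small sketch of TIL #6: represent prerequisites as a map from each course to its prerequisite courses, then topologically sort the resulting graph (Kahn’s algorithm) to get a valid order in which to take the courses. The course names are hypothetical.

```python
from collections import deque

def topo_order(prereqs):
    """Kahn's algorithm; prereqs maps course -> list of its prerequisites."""
    courses = set(prereqs) | {p for ps in prereqs.values() for p in ps}
    dependents = {c: [] for c in courses}        # course -> courses that need it
    indegree = {c: 0 for c in courses}           # number of unmet prerequisites
    for course, ps in prereqs.items():
        for p in ps:
            dependents[p].append(course)
            indegree[course] += 1

    ready = deque(c for c in courses if indegree[c] == 0)
    order = []
    while ready:
        c = ready.popleft()
        order.append(c)
        for nxt in dependents[c]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    if len(order) != len(courses):
        raise ValueError("prerequisite cycle detected")
    return order

# Hypothetical prerequisite chart.
print(topo_order({"Data Structures": ["Intro CS"],
                  "Algorithms": ["Data Structures", "Discrete Math"],
                  "ML": ["Algorithms", "Linear Algebra"]}))
```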

Privacy Debate: Government on Encryption for Social Media and Personal Technology to Prevent Terrorism, Gang Violence, and Crime

We currently live in an age where technology and social media have grown exponentially over the past decades and continue to become more and more prevalent in our daily lives. Although this has allowed people and communities to collaborate and stay well connected across a multitude of platforms, it has also provided a convenient avenue for militant groups and terrorist organizations to recruit members, communicate orders, and execute malicious attacks. With so much personal data stored on the web, on social media platforms, and on everyday devices such as cell phones, tablets, and laptops, companies have put substantial effort into making sure all user information is safe: well encrypted and private. This is beneficial for the average citizen, but not when encryption also protects the malicious plans and information of terrorist and militant groups. Tensions rise between the government and private corporations, and this touches on an important debate between user privacy and encryption on one side and government warrants on the other. There are certain ethical questions we must ask ourselves regarding encryption and privacy. Should the government be able to obtain warrants compelling tech companies to hand over information on personal accounts and passwords in order to investigate crime and militant or terrorist groups? Regarding social media and email platforms such as Instagram, Facebook, Gmail, and Twitter, should the government be able to see content posted on private accounts or in private messages? There is an ongoing debate on this issue, and it touches on the precedent set for law enforcement: how far will the government go to protect its citizens, even if it means accessing all personal accounts, technology, and social media, and how will this impact the average citizen? Does this essentially mean complete transparency and no privacy, in tension with Fourth Amendment rights? And how does this fit into the engineering code of ethics for private corporations?

The first argument calls for careful consideration of the negative effects of government intervention and warrants into private technology and social media accounts, and of whether such access is truly beneficial for the common good. According to Bob Lord, Yahoo’s security chief, speaking on the encryption debate, “there are human rights activists throughout the world who struggle to communicate freely, to organize and to share their thoughts because their governments are looking to control the telecom companies, and the phone companies” (1). Beyond human rights activists, Lord states that there is a growing danger, ranging from “Eastern European criminal syndicates to foreign nation-states,” that can hurt and steal information from thousands of people (1). Encryption plays an important role in protecting those people when they communicate with their banks, their doctors, or the government over tax issues, without their information being stolen or used for malicious purposes.

Although the privacy and information of the general population should be protected, there are still terrorist and militant groups that use these same mediums to recruit and pass information that can lead to the deaths of thousands of people, as violent attacks throughout history have shown. In fact, Al-Qaeda has been noted as one of the terror groups that uses social media most extensively to spread its global communications (2). Mohammed Yazdani, a poor engineer from India, was able to join ISIS simply by logging into Twitter, searching the hashtags #ISIS and #Khilafa, and quickly making contact with an Islamic State recruiter (3). ISIS helped Yazdani recruit conspirators, locate weapon caches prepositioned around India, and attempt to manufacture explosives (3). In this case, would it be more reasonable for the government to have surveillance access into private accounts? Wouldn’t it be acceptable to compromise the average citizen’s privacy if it meant saving millions of lives? And beyond social media, should the government have access to any technological device we own? In the infamous FBI-Apple encryption dispute, the FBI ordered Apple to unlock and disable the auto-erase function of an iPhone 5C owned by Syed Rizwan Farook, one of the shooters involved in the December 2015 San Bernardino attack that killed 14 people and critically injured 22 (4). The two attackers died four hours after the attack in a shootout with the police, having previously destroyed their personal phones. Despite having recovered Farook’s work phone, the government was at a loss, as it was locked with a four-digit password and programmed to delete all its data after ten failed attempts. In order to investigate the tragedy, identify anyone involved with the shooting, and prevent future occurrences of such events, the police needed access to this phone, yet Apple refused, stating that unlocking the phone would open a backdoor and put all of its customers at risk. Apple presents an important factor to consider, but the question remains: what other options does law enforcement have? The goal has been to build encryption that is as secure as possible, which means no built-in vulnerabilities, while also maintaining the general welfare and preventing terrorist attacks and the spread of harm. These two aims must be balanced.

According to the IEEE Code of Ethics and the NSPE Code of Ethics for Engineers, one of the first rules of practice for an engineer is to “hold paramount the safety, health, and welfare of the public” (5). In my opinion, a protocol for the government to bypass encryption should be carefully set up with specific guidelines for when it is permitted, so that it does not pry into the private lives of average citizens. The level of potential harm must exceed a certain threshold before the government may intervene in private accounts and technology, and this threshold should be quantified by weighing multiple factors in order to reduce subjectivity and bias. This will prevent law enforcement from getting entangled in citizens’ communications with their banks, doctors, family, etc., and ensure that citizen privacy is preserved for the most part. Until the threshold of potential harm is met, the government should not be able to access private accounts or technology, and any information obtained should only ever be used to mitigate harm to the general public, never to be commercialized or publicized in any way, following the fourth rule of practice in the engineering code of ethics: “Engineers shall not disclose, without consent, confidential information concerning the business affairs or technical processes of any present or former client or employer, or public body on which they serve” (5). The implications of these practices should also be considered, including defining what constitutes the greater good. For example, in a hypothetical scenario, if a girl were going to commit suicide, or might be murdered by a stranger she was talking to online, and her mother needed access to her phone, would the government intervene? Or is this a breach of privacy into private conversations? It is important to define strict lines and to follow the engineering code of ethics to benefit the safety, health, and welfare of the public. So, in the end, the question is: what is the right thing for the largest number of people under the largest number of circumstances? That is when the government should be able to access private accounts and personal technology: to provide the largest amount of common good.

Citations:

  1. https://www.npr.org/sections/alltechconsidered/2016/04/28/475883338/yahoos-security-chief-on-encryption-debate-what-is-the-greater-good
  2. https://www.npr.org/sections/alltechconsidered/2016/04/28/475883338/yahoos-security-chief-on-encryption-debate-what-is-the-greater-good
  3. https://safelab.socialwork.columbia.edu/sites/default/files/content/Multimodal_Social_Media_Analysis_for_Gang_Violence.pdf
  4. https://en.wikipedia.org/wiki/FBI%E2%80%93Apple_encryption_dispute
  5. https://www.ieee.org/about/corporate/governance/p7-8.html