Integrative Seminar 300-308-DW

Research Topic Selection: Artificial Intelligence

The list below includes good potential topics for your Comprehensive Assessment paper. You do not need to choose from this list, but if you do not, it is a good idea to speak to me to confirm that your topic will work. Remember that your topic must be related to Artificial Intelligence and that you will have to examine it from the perspectives of three social science disciplines. The list is divided into themes and fields. One way to proceed is to combine one theme with one field, for instance: The Ethics of AI in Law Enforcement.

Themes

Ethics: Computers are now taking on social responsibilities, helping determine who gets credit, who can be released early from prison, who committed a traffic violation, etc. Machine learning is often designed in such a way that the more accurate a model is, the less its engineers can explain how its decisions are made. How can we strike the right balance between efficiency and transparency? Can this be done safely? What happens when machines reach superintelligence, or general AI? Could this threaten us? Who is responsible for machine biases and mistakes? What if a killer robot makes a mistake? If machines reach human levels of intelligence, should they gain rights and moral status?

Bias: Algorithms are designed by humans and depend on data. As a result, they can carry the biases of their creators, or those inherent to the datasets that feed them. In the USA, some credit score ratings have been found to be biased against individuals living in historically Black or Latino neighbourhoods. Predictive policing based on biased historical law enforcement data can lead to the over-policing of already vulnerable neighbourhoods. Even more problematic, these biases can be swept under the rug because AI is supposedly neutral, meaning AI can accentuate group inequalities with the imprimatur of neutrality.

Privacy: From tracking algorithms embedded in phone apps to smart homes, the amount of information private corporations such as Facebook or Amazon have on you is unprecedented. The same is true of the government. In law enforcement, for instance, police forces have widely adopted dragnet surveillance practices, drastically expanding the data they collect on citizens. Facial recognition is particularly controversial. The use of personal data in AI poses the threat of unlawful or undesired decisions by algorithms, with drastic consequences for individuals. Sensible AI privacy legislation includes transparency, risk assessment, explainability, and predictability requirements.

Regulation: Early AI policy discussions focused on how governments could give an edge to their national industries. It is in this context that Canada adopted the first national AI strategy in 2017. By 2022, privacy, bias, and ethical dimensions had moved to the forefront of the policy debate. Legislators have a role to play in determining what types of data private and public actors have access to, and in limiting the potential negative consequences of AI. Algorithmic biases in credit scores can be offset by thorough auditing measures that require the involvement of AI experts and social scientists. Government legislation can set standards for data stewardship, transparency, governance, and collection (the European Union is a leading actor in these fields).
Another big theme linked to AI regulation is the GAFAM companies (Google, Apple, Facebook, Amazon, Microsoft). Leveraging AI and personal data has been a key element in these mega-corporations' growth over the last decade. Should private actors know this much about us? Should governments do something about it?

Fields

Healthcare: Do you have a Fitbit or an Apple Watch? Then you already benefit from the advances of AI in healthcare. Other promises include more efficient triage that avoids biases, the early detection of diseases, robots that could perform complex surgeries, robots that could accompany the elderly in care homes, etc. Most biomedical companies are already heavily reliant on AI, which is itself dependent on patient health information. This poses privacy concerns, especially when governments are considering sharing Canadian health information with AI companies (regulation). In terms of bias, models trained predominantly on one set of patients (usually Caucasian) could lead to costly medical mistakes with members of other groups. Who would be responsible for such a mistake (ethics): the doctor, the AI provider, the government?

Law Enforcement: Because the police enact the State's monopoly on legitimate violence over a given territory, the way they engage with technological innovations to enhance this power (or not), and how society responds, are crucial dynamics that illustrate the challenges AI poses for policy makers (regulation). Why are algorithmic biases more concerning in policing than in other policy areas? What are the privacy implications of police dragnet surveillance practices? Is police accountability possible in the era of big data policing? Should we let the police be equipped with automatic licence plate readers, facial recognition software, or AI-enhanced body-worn cameras? Is predictive policing neutral? Who would be responsible for a death caused by a police robot (ethics)?

Economy: Banks led the AI turn by introducing chatbots to the world, along with credit-rating algorithms, AI-driven stock market trading, and fraud detection mechanisms (notably leveraging facial recognition technologies / privacy). AI gives a comparative advantage to its adopters, and as such it has deepened inequalities, both within countries and along North-South lines. What is the impact of AI on the labour market? What types of jobs are being replaced (and who occupies them / bias)? How can we ensure society benefits from these changes (regulation / ethics)? E-commerce has bankrupted hundreds of small businesses, but it has also created other work opportunities. What should we do for people who are being replaced by AI? Should governments help businesses avoid being swallowed by Amazon?