
Why is it important for data scientists to seek transparency?


Published: 2023-07-16 17:42:21 · Source: http://gkfp.cn/


Transparency is especially important in data science projects and machine learning programs, largely because of the complexity and sophistication that drives them: these programs "learn" by generating probabilistic results rather than following predetermined, linear programming instructions, so it can be hard to understand how the technology reaches its conclusions. The "black box" problem, machine learning algorithms whose decisions cannot be fully explained to human decision-makers, is one of the field's central challenges.
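One common, model-agnostic way to peer into a black box is permutation importance: shuffle one input feature at a time and measure how much the model's error grows, revealing which features the model actually relies on. The sketch below, in plain Python with an illustrative toy model, is a minimal example of the idea rather than a production implementation; all names in it are assumptions.

```python
import random

def mse(y_true, y_pred):
    """Mean squared error between two equal-length sequences."""
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)

def permutation_importance(predict, X, y, n_repeats=10, seed=0):
    """Score each feature by how much shuffling it inflates the model's error.

    `predict` is treated as an opaque black box: we only call it, never
    inspect it, which is what makes the technique model-agnostic.
    """
    rng = random.Random(seed)
    baseline = mse(y, [predict(row) for row in X])
    importances = []
    for j in range(len(X[0])):
        increases = []
        for _ in range(n_repeats):
            col = [row[j] for row in X]
            rng.shuffle(col)  # break the link between feature j and the target
            X_perm = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
            increases.append(mse(y, [predict(row) for row in X_perm]) - baseline)
        importances.append(sum(increases) / n_repeats)
    return importances

# Toy "black box": secretly depends only on feature 0 and ignores feature 1.
model = lambda row: 3.0 * row[0]
X = [[float(i), float(i % 5)] for i in range(30)]
y = [model(row) for row in X]

imp = permutation_importance(model, X, y)
# Shuffling feature 0 hurts the model; shuffling feature 1 changes nothing,
# so imp[0] is large while imp[1] is essentially zero.
```

Because the explanation is built purely from inputs and outputs, the same probe works on a linear model, a neural network, or any other predictor, which is why techniques like this are a common starting point for explainable AI.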

With that in mind, mastery of explainable machine learning, or "explainable AI," is likely to become a main focus when companies recruit data scientists. DARPA, the institution that brought us the internet, is already funding a multimillion-dollar program in explainable AI, aiming to promote the skills and resources needed to create machine learning and artificial intelligence technologies that are transparent to humans.

One way to think about it is that talent development often has a "literacy stage" and a "hyperliteracy stage." For a data scientist, the traditional literacy stage is knowing how to put together machine learning programs, build algorithms in languages like Python, and construct and work with neural networks. The hyperliteracy stage is mastering explainable AI: providing transparency in the use of machine learning algorithms and preserving that transparency as these programs work toward their goals and the goals of their handlers.

Another way to explain the importance of transparency in data science is that the data sets in use keep growing more sophisticated, and therefore more potentially intrusive into people's lives. A further major driver of explainable machine learning is the European General Data Protection Regulation (GDPR), implemented in 2018 to curb unethical use of personal data. Using the GDPR as a test case, experts can see how the need to explain data science projects fits into privacy and security concerns, as well as business ethics.


Compiled and published by the NEC Lithium Battery China Marketing Center on 2023-07-16 17:42:21.