
Evolving Academia/Industry Relations in Computing Research

Published by: Ben Zorn
Publication date: 2019
Research field: Informatics Engineering
Paper language: English





In 2015, the CCC co-sponsored an industry round table that produced the document The Future of Computing Research: Industry-Academic Collaborations. Since then, several important trends in computing research have emerged, and this document considers how those trends impact the interaction between academia and industry in computing fields. We reach the following conclusions:

- In certain computing disciplines, currently most notably artificial intelligence, we observe significant increases in the level of interaction between professors and companies, which takes the form of extended joint appointments.
- Increasingly, companies are highly motivated to engage both professors and graduate students working in specific technical areas, because companies view computing research and technical talent as a core aspect of their business success.
- There is also further potential for principles and values from the academy (e.g., ethics and human-centered approaches) to inform products and R&D roadmaps in new ways through these unique joint arrangements.
- This increasing connection between faculty, students, and companies has the potential to change, either positively or negatively, many aspects of computing research universities, including their academic culture, the research topics that faculty and students pursue, and their ability to train undergraduate and graduate students.

This report is a first step in engaging the broader computing research community, raising awareness of the opportunities, complexities, and challenges of this trend; further work is required. We recommend follow-up efforts to measure the degree and impact of this trend and to establish best practices that are shared widely among computing research institutions.




Read also

Computing devices are vital to all areas of modern life and permeate every aspect of our society. The ubiquity of computing and our reliance on it have been accelerated and amplified by the COVID-19 pandemic. From education to work environments to healthcare to defense to entertainment, it is hard to imagine a segment of modern life that is not touched by computing. The security of computers, systems, and applications has been an active area of research in computer science for decades. However, with the confluence of the scale of interconnected systems and the increased adoption of artificial intelligence, there are many research challenges the community must face so that our society can continue to benefit and risks are minimized, not multiplied. Those challenges range from security and trust of the information ecosystem to adversarial artificial intelligence and machine learning. Beyond basic research challenges, securing a system more often than not happens after its design or even deployment, meaning the security community is routinely playing catch-up, attempting to patch vulnerabilities that could be exploited at any minute. While security measures such as encryption and authentication have been widely adopted, questions of security tend to be secondary to application capability. There needs to be a sea change in the way we approach this critically important aspect of the problem: new incentives and education are at the core of this change. Now is the time to refocus research community efforts on developing interconnected technologies with security baked in by design and on creating an ecosystem that ensures the adoption of promising research developments. To realize this vision, two additional elements of the ecosystem are necessary: proper incentive structures for adoption and an educated citizenry that is well versed in vulnerabilities and risks.
By all measures, wireless networking has seen explosive growth over the past decade. Fourth Generation Long Term Evolution (4G LTE) cellular technology has increased the bandwidth available to smartphones, in essence delivering broadband speeds to mobile devices. The most recent 5G technology further enhances transmission speeds and cell capacity, as well as reducing latency through the use of different radio technologies, and is expected to provide Internet connections that are an order of magnitude faster than 4G LTE. Technology continues to advance rapidly, however, and the next generation, 6G, is already being envisioned. 6G will make possible a wide range of powerful new applications, including holographic telepresence, telehealth, remote education, ubiquitous robotics and autonomous vehicles, smart cities and communities (IoT), and advanced manufacturing (Industry 4.0, sometimes referred to as the Fourth Industrial Revolution), to name but a few. The advances we will see begin at the hardware level and extend all the way to the top of the software stack. Artificial Intelligence (AI) will also start playing a greater role in the development and management of wireless networking infrastructure by becoming embedded in applications throughout all levels of the network. The resulting benefits to society will be enormous. At the same time that these exciting new wireless capabilities are appearing rapidly on the horizon, a broad range of research challenges loom ahead. These stem from the ever-increasing complexity of the hardware and software systems, along with the need to provide infrastructure that is robust and secure while simultaneously protecting the privacy of users. Here we outline some of those challenges and provide recommendations for the research that needs to be done to address them.
Industry 4.0, or Digital Manufacturing, is a vision of inter-connected services to facilitate innovation in the manufacturing sector. A fundamental requirement of innovation is the ability to visualise manufacturing data in order to discover new insight for increased competitive advantage. This article describes the enabling technologies that facilitate In-Transit Analytics, which is a necessary precursor for Industrial Internet of Things (IIoT) visualisation.
It is undeniable that the center of the worldwide computer industry is the US, specifically Silicon Valley. Much of the reason for the success of Silicon Valley had to do with Moore's Law: the observation by Intel co-founder Gordon Moore that the number of transistors on a microchip doubled approximately every two years. According to the International Technology Roadmap for Semiconductors, Moore's Law will end in 2021. How can we rethink computing technology to restart the historic explosive performance growth? Since 2012, the IEEE Rebooting Computing Initiative (IEEE RCI) has been working with industry and the US government to find new computing approaches to answer this question. In parallel, the CCC has held a number of workshops addressing similar questions. This whitepaper summarizes some of the IEEE RCI and CCC findings. The challenge for the US is to lead this new era of computing. Our international competitors are not sitting still: China, for example, has invested significantly in a variety of approaches, such as neuromorphic computing, chip fabrication facilities, computer architecture, and high-performance simulation and data analytics computing. We must act now; otherwise, the center of the computer industry will move from Silicon Valley, and likely move offshore entirely.
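To make the scale of the doubling described above concrete, here is a minimal Python sketch of Moore's Law as a formula. It assumes the well-known 1971 baseline of roughly 2,300 transistors (the Intel 4004), which is a standard historical reference point and not a figure taken from this whitepaper.

def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count, assuming a doubling every two years."""
    return base_count * 2 ** ((year - base_year) / 2)

# By 2021 the pure-doubling projection reaches the tens of billions
# (about 7.7e10), which illustrates why sustaining this exponential
# growth has become so difficult physically and economically.
print(f"{transistors(2021):,.0f}")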
The catch-up effect and the Matthew effect offer opposing characterizations of globalization: the former predicts an eventual convergence, as the poor can grow faster than the rich due to free exchanges of complementary resources, while the latter predicts a deepening inequality between the rich and the poor. To understand these effects on the globalization of research, we conduct an in-depth study based on scholarly and patent publications covering STEM research from 218 countries/regions over the past four decades, comprising more than 55 million scholarly articles and 1.7 billion citations. Unique to this investigation is the simultaneous examination of both research output and its impact in the same data set, using a novel machine-learning-based measure, called saliency, to mitigate the intrinsic biases in quantifying research impact. The results show that the two effects are in fact co-occurring: there are clear indications of convergence among the high-income and upper-middle-income countries across the STEM fields, but a widening gap is developing that segregates the lower-middle- and low-income regions from the higher-income regions. Furthermore, the rate of convergence varies notably among the STEM sub-fields, with the highly strategic area of Artificial Intelligence (AI) sandwiched between fields such as Medicine and Materials Science, which occupy the opposite ends of the spectrum. The data support the argument that a leading explanation of the Matthew effect, namely the preferential attachment theory, can actually foster the catch-up effect when organizations from lower-income countries forge substantial research collaborations with those already dominant. The data resoundingly show that such collaborations benefit all parties involved, and a case of role reversal can be seen in the Materials Science field, where the most advanced signs of convergence are observed.
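The preferential attachment theory invoked above is commonly formalized as the Barabasi-Albert growth model: each new node links to existing nodes with probability proportional to their current degree, so already well-connected nodes accumulate links fastest (the Matthew effect in miniature). The following Python sketch is a generic illustration of that mechanism only; it is not the study's saliency measure or data, and the parameters n and m are arbitrary.

import random

def preferential_attachment(n=1000, m=2, seed=0):
    random.seed(seed)
    # Seed graph: m + 1 fully connected nodes, each with degree m.
    degrees = [m] * (m + 1)
    # Pool in which node i appears degrees[i] times, so a uniform draw
    # from the pool is a degree-proportional draw over nodes.
    pool = [i for i in range(m + 1) for _ in range(m)]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:           # m distinct degree-weighted picks
            chosen.add(random.choice(pool))
        degrees.append(0)
        for old in chosen:
            degrees[old] += 1
            degrees[new] += 1
            pool += [old, new]           # update the weighted pool
    return degrees

degrees = preferential_attachment()
print(max(degrees), sorted(degrees)[len(degrees) // 2])  # hub vs. median node

Running this shows a handful of hub nodes with degrees far above the median, the "rich get richer" dynamic that the abstract argues can nonetheless aid catch-up when lower-income institutions attach to dominant collaborators.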