Search results for: big data platforms

24981 Powering Connections: Synergizing Sales and Marketing for Electronics Engineering with Web Development.

Authors: Muhammad Awais Kiani, Abdul Basit Kiani, Maryam Kiani

Abstract:

Synergizing Sales and Marketing for Electronics Engineering with Web Development explores the dynamic relationship between sales, marketing, and web development within the electronics engineering industry. The study is important because it examines the power of digital platforms to connect with customers, which increases brand visibility and drives sales. It highlights the need for collaboration between sales and marketing teams, as well as the integration of web development strategies to create seamless user experiences and effective lead generation. Furthermore, it emphasizes the role of data analytics and customer insights in optimizing sales and marketing efforts in the ever-evolving landscape of electronics engineering. Sales and marketing play a crucial role in driving business growth, and in today's digital landscape, web development has become an integral part of these strategies. Web development enables businesses to create visually appealing and user-friendly websites that effectively showcase their products or services. It allows for the integration of e-commerce functionalities, enabling seamless online transactions. Furthermore, web development helps businesses optimize their online presence through search engine optimization (SEO) techniques, social media integration, and content management systems. This abstract highlights the symbiotic relationship between sales and marketing in the electronics industry and web development, emphasizing the importance of a strong online presence in achieving business success.

Keywords: electronics industry, web development, sales, marketing

Procedia PDF Downloads 111
24980 Classification of Generative Adversarial Network Generated Multivariate Time Series Data Featuring Transformer-Based Deep Learning Architecture

Authors: Thrivikraman Aswathi, S. Advaith

Abstract:

In cases where the use of real data is limited, for example when it is hard to get access to a large volume of real data, synthetic data generation is needed. It produces high-quality synthetic data while maintaining the statistical properties of a specific dataset. In the present work, a generative adversarial network (GAN) is trained to produce multivariate time series (MTS) data, since MTS data are now being gathered more often in various real-world systems. The GAN-generated MTS data are then fed into a transformer-based deep learning architecture that carries out the categorization of the data into predefined classes. Further, the model is evaluated across various distinct domains by generating corresponding MTS data.
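
As a rough illustration of the classification stage described above (not the authors' exact architecture), the following sketch assumes PyTorch and mean-pools a transformer encoder over time before a linear class head; all dimensions, hyperparameters, and the random stand-in for GAN output are illustrative.

```python
# Minimal sketch: a transformer-encoder classifier for multivariate time series
# of shape (batch, time_steps, n_channels). Positional encoding is omitted for brevity.
import torch
import torch.nn as nn

class MTSTransformerClassifier(nn.Module):
    def __init__(self, n_channels=8, d_model=64, n_heads=4, n_layers=2, n_classes=5):
        super().__init__()
        self.input_proj = nn.Linear(n_channels, d_model)   # embed each time step
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)           # class logits

    def forward(self, x):                  # x: (batch, time_steps, n_channels)
        h = self.encoder(self.input_proj(x))
        return self.head(h.mean(dim=1))    # mean-pool over time, then classify

model = MTSTransformerClassifier()
synthetic_batch = torch.randn(16, 100, 8)   # stands in for GAN-generated MTS samples
logits = model(synthetic_batch)             # (16, n_classes)
print(logits.argmax(dim=1))
```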

Keywords: GAN, transformer, classification, multivariate time series

Procedia PDF Downloads 126
24979 The Digital Divide: Examining the Use and Access to E-Health Based Technologies by Millennials and Older Adults

Authors: Delana Theiventhiran, Wally J. Bartfay

Abstract:

Background and Significance: As the Internet is becoming the epitome of modern communications, there are many pragmatic reasons why the digital divide matters in terms of accessing and using E-health based technologies. With the rise of technology usage globally, those in the older adult generation may not be as familiar and comfortable with technology usage and are thus put at a disadvantage compared to other generations, such as millennials, when examining and using E-health based platforms and technology. Currently, little is known about how older adults and millennials access and use e-health based technologies. Methods: A systematic review of the literature was undertaken employing the following three databases: (i) PubMed, (ii) ERIC, and (iii) CINAHL, using the search term 'digital divide and generations' to identify potential articles. To extract required data from the studies, a data abstraction tool was created to obtain the following information: (a) author, (b) year of publication, (c) sample size, (d) country of origin, (e) design/methods, (f) major findings/outcomes obtained. Inclusion criteria included publication dates between Jan 2009 and Aug 2018, written in the English language, target populations of older adults aged 65 and above and millennials, and peer-reviewed quantitative studies only. Major Findings: PubMed provided 505 potential articles, of which 23 met the inclusion criteria. ERIC provided 53 potential articles, of which none met the criteria following data extraction. CINAHL provided 14 potential articles, of which eight met the criteria following data extraction. Conclusion: Practically speaking, identifying how newer E-health based technologies can be integrated into society and identifying why there is a gap with digital technology will help reduce the impact on generations and individuals who are not as familiar with technology and Internet usage. The largest concern of all is how to prepare older adults for new and emerging E-health technologies. Currently, there is a dearth of literature in this area because it is a newer area of research and little is known about it. The benefits and consequences of technology being integrated into daily living are being investigated as a newer area of research. Several of the articles (N=11) indicated that age is one of the larger factors contributing to the digital divide. Similarly, many of the examined articles (N=5) identified that privacy concerns were one of the main deterrents of technology usage for elderly individuals aged 65 and above. The older adult generation feels that privacy is one of the major concerns, especially with regard to how data is collected, used and possibly sold to third-party groups by various websites. Additionally, access to technology, the Internet, and infrastructure also plays a large part in the way that individuals are able to receive and use information. Lastly, a change in the way that healthcare is currently used, received and distributed would also help ensure that no generation is left behind in a technologically advanced society.

Keywords: digital divide, e-health, millennials, older adults

Procedia PDF Downloads 168
24978 MOOCs (E-Learning) Project Personnel Competency Analysis

Authors: Shang-Hua Wu, Rong-Chi Chang, Horng-Twu Liaw

Abstract:

Nowadays, the competencies of e-learning project personnel are very important in assisting them in offering courses, serving students effectively, leveraging advantages, strengthening their relationships with potential students, etc. Among e-learning platforms, MOOCs have recently attracted increasing attention in distance education since they can be conducted for large numbers of virtual learners. Nonetheless, since MOOCs are a relatively new e-learning platform, a top concern is which competencies are important for e-learning personnel to consider. Addressing this need, this research aimed to carry out an in-depth exploration of the competency requirements of MOOCs (e-learning) project personnel in Taiwan vocational schools. Data were collected through thorough literature reviews and discussions, and the competency analysis was carried out using Delphi technique questionnaires. The results show that MOOCs (e-learning) project personnel's professional competency lies in three main dimensions, among which ‘demand analysis competency’ (containing 10 major competences and 48 subordinate capabilities) is the most important, followed by ‘project management competency’ (comprising 6 major competences and 31 secondary capabilities), and finally ‘digital content production competency’ (including 12 major competences and 79 secondary capabilities). As such, in the Taiwanese context, with its different organizational scales and market sizes, the e-learning competency items and the unique experience and achievements obtained throughout the promotion process in this research will provide useful references for academic institutions in promoting e-learning.

Keywords: competency analysis, Delphi technique questionnaire, e-learning, massive open online courses

Procedia PDF Downloads 280
24977 Intelligent Control of Agricultural Farms, Gardens, Greenhouses, Livestock

Authors: Vahid Bairami Rad

Abstract:

The intelligentization of agricultural fields makes it possible to control temperature, humidity, and other variables affecting the growth of agricultural products online, from a mobile phone or computer. Making agricultural fields and gardens smart is one of the best ways to optimize agricultural equipment and has a direct effect on the growth of plants, agricultural products, and farms. Smart farms, built on the Internet of Things and artificial intelligence, are the topic discussed here. Agriculture is becoming smarter every day. From large industrial operations to individuals growing organic produce locally, technology is at the forefront of reducing costs, improving results and ensuring optimal delivery to market. A key element of smart agriculture is the use of useful data. Modern farmers have more tools to collect intelligent data than in previous years. Data related to soil chemistry also allows people to make informed decisions about fertilizing farmland. Moisture meter sensors and accurate irrigation controllers have allowed irrigation processes to be optimized while reducing the cost of water consumption. Drones can apply pesticides precisely at the desired point. Automated harvesting machines navigate crop fields based on position and capacity sensors. The list goes on. Almost any process related to agriculture can use sensors that collect data to optimize existing processes and make informed decisions. The Internet of Things (IoT) is at the center of this great transformation. Internet of Things hardware has grown and developed rapidly to provide low-cost sensors for people's needs. These sensors are embedded in battery-powered IoT devices that can operate for years and have access to low-power, cost-effective mobile networks. IoT device management platforms have also evolved rapidly and can now be used to securely manage existing devices at scale. IoT cloud services also provide a set of application enablement services that can be easily used by developers and allow them to build application business logic while focusing on their own applications. These developments have created powerful new applications in the field of the Internet of Things, and these programs can be used in various industries, such as agriculture and building smart farms. But the question is, what makes today's farms truly smart farms? Let us put this question another way: when will the technologies associated with smart farms reach the point where the intelligence they provide can exceed the intelligence of experienced and professional farmers?

Keywords: food security, IoT automation, wireless communication, hybrid lifestyle, arduino Uno

Procedia PDF Downloads 51
24976 Instructional Design Strategy Based on Stories with Interactive Resources for Learning English in Preschool

Authors: Vicario Marina, Ruiz Elena, Peredo Ruben, Bustos Eduardo

Abstract:

The development group of Educational Computing of the National Polytechnic Institute (IPN) in Mexico has been developing interactive resources at the preschool level in an effort to improve learning in the Child Development Centers (CENDI). This work describes both a didactic architecture and a strategy for teaching English with digital stories, using interactive resources available through a Web repository designed to be used on mobile platforms. It will be accessible initially to 500 children and worldwide by the end of 2015.

Keywords: instructional design, interactive resources, digital educational resources, story based English teaching, preschool education

Procedia PDF Downloads 469
24975 A Privacy Protection Scheme Supporting Fuzzy Search for NDN Routing Cache Data Name

Authors: Feng Tao, Ma Jing, Guo Xian, Wang Jing

Abstract:

Named Data Networking (NDN) replaces the IP address of the traditional network with a data name and adopts a dynamic cache mechanism. In the existing mechanism, however, only one-to-one search can be achieved because every data item has a unique name corresponding to it. There is a certain mapping relationship between data content and data name, so if the data name is intercepted by an adversary, the privacy of the data content and the user's interest can hardly be guaranteed. In order to solve this problem, this paper proposes a one-to-many fuzzy search scheme based on order-preserving encryption, which reduces the query overhead by optimizing the caching strategy. In this scheme, hash values are used to keep the user's query safe from each node in the search process, as well as the privacy of the requested data content.
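
The paper's order-preserving encryption is not reproduced here, but a minimal sketch can illustrate the underlying idea of matching obscured hierarchical names so that one interest retrieves many cached items; the per-component hashing, the toy names, and the in-memory cache below are assumptions for illustration only, not the proposed scheme.

```python
# Illustrative only: hashing each hierarchical name component lets a cache match
# on obscured prefixes (one-to-many search) without exposing the plaintext name.
import hashlib

def hash_component(component: str) -> str:
    return hashlib.sha256(component.encode()).hexdigest()[:16]

def obscure_name(name: str) -> tuple:
    # "/video/movies/clip1" -> tuple of per-component digests
    return tuple(hash_component(c) for c in name.strip("/").split("/"))

cache = {
    obscure_name("/video/movies/clip1"): "<content A>",
    obscure_name("/video/movies/clip2"): "<content B>",
    obscure_name("/news/today"): "<content C>",
}

def fuzzy_lookup(interest_prefix: str):
    """Return every cached item whose obscured name starts with the obscured prefix."""
    prefix = obscure_name(interest_prefix)
    return [v for k, v in cache.items() if k[:len(prefix)] == prefix]

print(fuzzy_lookup("/video/movies"))   # matches both movie clips
```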

Keywords: NDN, order-preserving encryption, fuzzy search, privacy

Procedia PDF Downloads 480
24974 Corporate Social Media: Understanding the Impact of Service Quality and Social Value on Customer Behavior

Authors: Regina Connolly, Murray Scott, William DeLone

Abstract:

Social media are revolutionary technologies that are transforming the way we communicate, the way we collaborate and the way we influence. Companies are making major investments in platforms such as Facebook and Twitter because they realize that social media are an influential force on customer perceptions and behavior. However, to date there is little guidance on what constitutes an effective deployment of social media, and there is no empirical evidence that social media investments are yielding positive returns. This research develops and validates the components of an effective corporate social media platform in order to examine the impact of effective social media on customer intentions and behavior.

Keywords: service quality, social value, social media, IS success, Web 2.0, customer behaviour

Procedia PDF Downloads 554
24973 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor scarcity problem and increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to robot autonomous recognition, which include high-performance computing, physical system modeling, extensive sensor coordination, and dataset deep learning, have not yet been explored in intelligent construction. Relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous recognition visual guidance technologies improves the robot's grasp of the scene and capacity for autonomous operation. Intelligent vision guidance technology for industrial robots has a serious issue with camera calibration, and the use of intelligent visual guidance and identification technologies in industrial production has strict accuracy requirements. Visual recognition systems therefore face precision challenges, and any imprecision directly impacts the effectiveness and standard of industrial production, necessitating strengthened study of positioning precision in visual guidance and recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study will identify the position of target components by detecting the information at the boundary and corner of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, operational processing systems assign them to the same coordinate system based on their locations and postures. The RGB image's inclination detection and the depth image's verification will be used to determine the component's present posture. Finally, a virtual environment model for the robot's obstacle-avoidance route will be constructed using the point cloud information.
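
A minimal sketch of the geometric step mentioned above, assuming NumPy: the axis-aligned bounding-box extents of a dense point cloud give an aspect ratio that can help match a scanned part against modular component guidelines. The "beam-like" test cloud and any thresholds one would apply are hypothetical.

```python
# From an N x 3 point cloud, compute bounding-box extents, aspect ratio, and centroid.
import numpy as np

def component_signature(points: np.ndarray):
    mins, maxs = points.min(axis=0), points.max(axis=0)
    extents = np.sort(maxs - mins)[::-1]           # longest edge first
    aspect_ratio = extents[0] / max(extents[1], 1e-9)
    centroid = points.mean(axis=0)                 # rough grasp/placement reference
    return aspect_ratio, centroid

rng = np.random.default_rng(0)
beam_like = rng.uniform([0, 0, 0], [2.0, 0.2, 0.2], size=(5000, 3))
ratio, centre = component_signature(beam_like)
print(f"aspect ratio ~ {ratio:.1f}, centroid ~ {centre.round(2)}")  # elongated -> beam-like
```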

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 83
24972 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and the different varieties, such as the structured, semi-structured and unstructured nature of the data. Despite the complexity of big data, if the trends and patterns that exist within the big data are uncovered and analyzed, higher quality healthcare can be provided at lower cost. Hadoop is an open source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
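
As a hedged illustration of the MapReduce model mentioned above (not code from the paper), the sketch below shows a Hadoop Streaming-style mapper and reducer in Python that count occurrences of a diagnosis code across patient records; the CSV layout, column position, and file paths are assumptions.

```python
# Hadoop Streaming job (mapper and reducer in one file for brevity):
# count how often each diagnosis code appears across patient records.
import sys

def mapper(lines):
    for line in lines:
        fields = line.rstrip("\n").split(",")
        if len(fields) >= 3:
            print(f"{fields[2]}\t1")              # emit: diagnosis_code TAB 1

def reducer(lines):
    current, count = None, 0
    for line in lines:                            # Hadoop delivers lines sorted by key
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = key, 0
        count += int(value)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    # e.g.  hadoop jar hadoop-streaming.jar -files dx_count.py \
    #         -mapper "python3 dx_count.py map" -reducer "python3 dx_count.py reduce" \
    #         -input /ehr/records -output /ehr/dx_counts
    (mapper if sys.argv[1:] == ["map"] else reducer)(sys.stdin)
```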

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 409
24971 Powering Profits: A Dynamic Approach to Sales Marketing and Electronics

Authors: Muhammad Awais Kiani, Maryam Kiani

Abstract:

This abstract explores the confluence of these two domains and highlights the key factors driving success in sales marketing for electronics. The abstract begins by digging into the ever-evolving landscape of consumer electronics, emphasizing how technological advancements and the growth of smart devices have revolutionized the way people interact with electronics. This paradigm shift has created tremendous opportunities for sales and marketing professionals to engage with consumers on various platforms and channels. Next, the abstract discusses the pivotal role of effective sales marketing strategies in the electronics industry. It highlights the importance of understanding consumer behavior, market trends, and competitive landscapes and how this knowledge enables businesses to tailor their marketing efforts to specific target audiences. Furthermore, the abstract explores the significance of leveraging digital marketing techniques, such as social media advertising, search engine optimization, and influencer partnerships, to establish brand identity and drive sales in the electronics market. It emphasizes the power of storytelling and creating captivating content to engage with tech-savvy consumers. Additionally, the abstract emphasizes the role of customer relationship management (CRM) systems and data analytics in optimizing sales marketing efforts. It highlights the importance of leveraging customer insights and analyzing data to personalize marketing campaigns, enhance customer experience, and ultimately drive sales growth. Lastly, the abstract concludes by underlining the importance of adapting to the ever-changing landscape of the electronics industry. It encourages businesses to embrace innovation, stay informed about emerging technologies, and continuously evolve their sales marketing strategies to meet the evolving needs and expectations of consumers. Overall, this abstract sheds light on the captivating realm of sales marketing in the electronics industry, emphasizing the need for creativity, adaptability, and a deep understanding of consumers to succeed in this rapidly evolving market.

Keywords: marketing industry, electronics, sales impact, e-commerce

Procedia PDF Downloads 69
24970 Advancing Early Intervention Strategies for United States Adolescents and Young Adults with Schizophrenia in the Post-COVID-19 Era

Authors: Peggy M. Randon, Lisa Randon

Abstract:

Introduction: The post-COVID-19 era has presented unique challenges for addressing complex mental health issues, particularly due to exacerbated stress, increased social isolation, and disrupted continuity of care. This article outlines relevant health disparities and policy implications within the context of the United States while maintaining international relevance. Methods: A comprehensive literature review (including studies, reports, and policy documents) was conducted to examine concerns related to childhood-onset schizophrenia and the impact on patients and their families. Qualitative and quantitative data were synthesized to provide insights into the complex etiology of schizophrenia, the effects of the pandemic, and the challenges faced by socioeconomically disadvantaged populations. Case studies were employed to illustrate real-world examples and areas requiring policy reform. Results: Early intervention in childhood is crucial for preventing or mitigating the long-term impact of complex psychotic disorders, particularly schizophrenia. A comprehensive understanding of the genetic, environmental, and physiological factors contributing to the development of schizophrenia is essential. The COVID-19 pandemic worsened symptoms and disrupted treatment for many adolescent patients with schizophrenia, emphasizing the need for adaptive interventions and the utilization of virtual platforms. Health disparities, including stigma, financial constraints, and language or cultural barriers, further limit access to care, especially for socioeconomically disadvantaged populations. Policy implications: Current US health policies inadequately support patients with schizophrenia. The limited availability of longitudinal care, insufficient resources for families, and stigmatization represent ongoing policy challenges. Addressing these issues necessitates increased research funding, improved access to affordable treatment plans, and cultural competency training for healthcare providers. Public awareness campaigns are crucial to promote knowledge, awareness, and acceptance of mental health disorders. Conclusion: The unique challenges faced by children and families in the US affected by schizophrenia and other psychotic disorders have yet to be adequately addressed on institutional and systemic levels. The relevance of findings to an international audience is emphasized by examining the complex factors contributing to the onset of psychotic disorders and their global policy implications. The broad impact of the COVID-19 pandemic on mental health underscores the need for adaptive interventions and global responses. Addressing policy challenges, improving access to care, and reducing the stigma associated with mental health disorders are crucial steps toward enhancing the lives of adolescents and young adults with schizophrenia and their family members. The implementation of virtual platforms can help overcome barriers and ensure equitable access to support and resources for all patients, enabling them to lead healthy and fulfilling lives.

Keywords: childhood schizophrenia, policy, United States, health disparities

Procedia PDF Downloads 73
24969 Data Disorders in Healthcare Organizations: Symptoms, Diagnoses, and Treatments

Authors: Zakieh Piri, Shahla Damanabi, Peyman Rezaii Hachesoo

Abstract:

Introduction: Healthcare organizations, like other organizations, suffer from a number of disorders such as Business Sponsor Disorder, Business Acceptance Disorder, Cultural/Political Disorder, Data Disorder, etc. As quality in healthcare mostly depends on the quality of data, we aimed to identify data disorders and their symptoms in two teaching hospitals. Methods: Using a self-constructed questionnaire, we asked 20 questions related to the quality and usability of patient data stored in patient records. The research population consisted of 150 managers, physicians, nurses, and medical records staff who were working at the time of the study. We also asked their views about the symptoms and treatments for any data disorders they mentioned in the questionnaire. Using qualitative methods, we analyzed the answers. Results: After classifying the answers, we found six main data disorders: incomplete data, missed data, late data, blurred data, manipulated data, and illegible data. The majority of participants believed in their own important roles in the treatment of data disorders, while others pointed to health system problems. Discussion: As clinicians have important roles in producing data, they can easily identify symptoms and disorders of patient data. Health information managers can also play important roles in the early detection of data disorders by proactively monitoring and periodically checking up on data.

Keywords: data disorders, quality, healthcare, treatment

Procedia PDF Downloads 428
24968 Big Data and Analytics in Higher Education: An Assessment of Its Status, Relevance and Future in the Republic of the Philippines

Authors: Byron Joseph A. Hallar, Annjeannette Alain D. Galang, Maria Visitacion N. Gumabay

Abstract:

One of the unique challenges presented by the twenty-first century to Philippine higher education is the utilization of Big Data. The higher education system in the Philippines is generating burgeoning amounts of data that can be used to produce the information and knowledge needed for accurate data-driven decision making. This study examines the status, relevance and future of Big Data and Analytics in Philippine higher education. The insights gained from the study may be relevant to other developing nations similarly situated as the Philippines.

Keywords: big data, data analytics, higher education, republic of the philippines, assessment

Procedia PDF Downloads 342
24967 Economic Valuation of Emissions from Mobile Sources in the Urban Environment of Bogotá

Authors: Dayron Camilo Bermudez Mendoza

Abstract:

Road transportation is a significant source of externalities, notably in terms of environmental degradation and the emission of pollutants. These emissions adversely affect public health, attributable to criteria pollutants like particulate matter (PM2.5 and PM10) and carbon monoxide (CO), and also contribute to climate change through the release of greenhouse gases, such as carbon dioxide (CO2). It is, therefore, crucial to quantify the emissions from mobile sources and develop a methodological framework for their economic valuation, aiding in the assessment of associated costs and informing policy decisions. The forthcoming congress will shed light on the externalities of transportation in Bogotá, showcasing methodologies and findings from the construction of emission inventories and their spatial analysis within the city. This research focuses on the economic valuation of emissions from mobile sources in Bogotá, employing methods like hedonic pricing and contingent valuation. Conducted within the urban confines of Bogotá, the study leverages demographic, transportation, and emission data sourced from the Mobility Survey, official emission inventories, and tailored estimates and measurements. The use of hedonic pricing and contingent valuation methodologies facilitates the estimation of the influence of transportation emissions on real estate values and gauges the willingness of Bogotá's residents to invest in reducing these emissions. The findings are anticipated to be instrumental in the formulation and execution of public policies aimed at emission reduction and air quality enhancement. In compiling the emission inventory, innovative data sources were identified to determine activity factors, including information from automotive diagnostic centers and used vehicle sales websites. The COPERT model was utilized to ascertain emission factors, requiring diverse inputs such as data from the national transit registry (RUNT), OpenStreetMap road network details, climatological data from the IDEAM portal, and Google API for speed analysis. Spatial disaggregation employed GIS tools and publicly available official spatial data. The development of the valuation methodology involved an exhaustive systematic review, utilizing platforms like the EVRI (Environmental Valuation Reference Inventory) portal and other relevant sources. The contingent valuation method was implemented via surveys in various public settings across the city, using a referendum-style approach for a sample of 400 residents. For the hedonic price valuation, an extensive database was developed, integrating data from several official sources and basing analyses on the per-square meter property values in each city block. The upcoming conference anticipates the presentation and publication of these results, embodying a multidisciplinary knowledge integration and culminating in a master's thesis.
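
To make the hedonic-pricing logic concrete, the following sketch uses entirely synthetic, illustrative data (not the study's database or variables): log price per square meter is regressed on structural attributes and a local PM2.5 measure, and the coefficient on PM2.5 stands in for the marginal implicit price of air quality.

```python
# Hedonic regression sketch on synthetic data; all variables are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 1000
rooms = rng.integers(1, 6, n)
age = rng.uniform(0, 50, n)
pm25 = rng.uniform(5, 40, n)                        # neighbourhood PM2.5 (ug/m3)
log_price = 7.0 + 0.12 * rooms - 0.004 * age - 0.010 * pm25 + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), rooms, age, pm25])  # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)
print(f"estimated % change in price per extra ug/m3 of PM2.5: {100 * beta[3]:.2f}%")
```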

Keywords: economic valuation, transport economics, pollutant emissions, urban transportation, sustainable mobility

Procedia PDF Downloads 54
24966 AutoML: Comprehensive Review and Application to Engineering Datasets

Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili

Abstract:

The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
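
As a minimal sketch of just one AutoML stage (automated hyperparameter optimization) rather than a full AutoML platform, the example below uses scikit-learn's randomized search; the synthetic dataset, model family, and parameter grid are illustrative choices.

```python
# Automated hyperparameter search over a random forest on a synthetic regression task.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

X, y = make_regression(n_samples=500, n_features=10, noise=0.2, random_state=0)

search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions={"n_estimators": [50, 100, 200],
                         "max_depth": [None, 5, 10, 20]},
    n_iter=8, cv=5, random_state=0)        # tries 8 configurations with 5-fold CV
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```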

Keywords: automated machine learning, uncertainty, engineering dataset, regression

Procedia PDF Downloads 57
24965 A Framework of Virtualized Software Controller for Smart Manufacturing

Authors: Pin Xiu Chen, Shang Liang Chen

Abstract:

A virtualized software controller is developed in this research to replace traditional hardware control units. This virtualized software controller transfers motion interpolation calculations from the motion control units of end devices to edge computing platforms, thereby reducing the end devices' computational load and hardware requirements and making maintenance and updates easier. The study also applies the concept of microservices, dividing the control system into several small functional modules that are then deployed onto a cloud data server. This reduces the interdependency among modules and enhances the overall system's flexibility and scalability. Finally, with containerization technology, the system can be deployed and started in a matter of seconds, which is more efficient than traditional virtual machine deployment methods. Furthermore, this virtualized software controller communicates with end control devices via wireless networks, making the placement of production equipment or the redesign of processes more flexible and no longer limited by physical wiring. To handle the large data flow and maintain low-latency transmission, this study integrates 5G technology, fully utilizing its high speed, wide bandwidth, and low latency to achieve rapid and stable remote machine control. An experimental setup is designed to verify the feasibility and test the performance of this framework. This study designs a smart manufacturing site with a 5G communication architecture, serving as a field for experimental data collection and performance testing. The smart manufacturing site includes one robotic arm, three Computer Numerical Control machine tools, several Input/Output ports, and an edge computing architecture. All machinery information is uploaded to edge computing servers and cloud servers via 5G communication and the Internet of Things framework. After analysis and computation, this information is converted into motion control commands, which are transmitted back to the relevant machinery for motion control through 5G communication. The communication time intervals at each stage are calculated using the C++ chrono library to measure the time difference for each command transmission. The relevant test results will be organized and displayed in the full text.

Keywords: 5G, MEC, microservices, virtualized software controller, smart manufacturing

Procedia PDF Downloads 77
24964 Bringing Together Student Collaboration and Research Opportunities to Promote Scientific Understanding and Outreach Through a Seismological Community

Authors: Michael Ray Brunt

Abstract:

China has been the site of some of the most significant earthquakes in history; however, earthquake monitoring has long been the province of universities and research institutions. The China Digital Seismographic Network was initiated in 1983 and improved significantly during 1992-1993. Data from the CDSN is widely used by government and research institutions, but, generally, this data is not readily accessible to middle and high school students. An educational seismic network in China is needed to provide collaboration and research opportunities for students, engaging students around the country in the scientific understanding of earthquake hazards and risks while promoting community awareness. In 2022, the Tsinghua International School (THIS) Seismology Team, made up of enthusiastic students and facilitated by two experienced teachers, was established. As a group, the team's objective is to install seismographs in schools throughout China, thus creating the THIS Educational Seismic Network (THIS-ESN), which shares data and facilitates collaboration. The THIS-ESN initiative will enhance education and outreach in China about earthquake risks and hazards, introduce seismology to a wider audience, stimulate interest in research among students, and develop students' programming, data collection and analysis skills. It will also encourage and inspire young minds to pursue science, technology, engineering, the arts, and math (STEAM) career fields. The THIS-ESN utilizes small, low-cost RaspberryShake seismographs as a powerful tool linked into a global network, giving schools and the public access to real-time seismic data from across China, increasing earthquake monitoring capabilities in their respective areas and adding to the available data sets regionally and worldwide, helping create a denser seismic network. The RaspberryShake seismograph is compatible with free seismic data viewing platforms such as SWARM, while RaspberryShake web programs and mobile apps are designed specifically for teaching seismology and seismic data interpretation, providing opportunities to enhance understanding. The RaspberryShake is powered by an operating system embedded in the Raspberry Pi, which makes it an easy platform to teach students basic computer communication concepts by utilizing processing tools to investigate, plot, and manipulate data. The THIS Seismology Team believes strongly in creating opportunities for committed students to become part of the seismological community by engaging in analysis of real-time scientific data with tangible outcomes. Students will feel proud of the important work they are doing to understand the world around them and become advocates spreading their knowledge back into their homes and communities, helping to improve overall community resilience. We trust that, in studying the results seismograph stations yield, students will not only grasp how subjects like physics and computer science apply in real life but will also, by spreading information, help students across the country appreciate how and why earthquakes bear on their lives, develop practical skills in STEAM, and engage in the global seismic monitoring effort. By providing such an opportunity to schools across the country, we are confident that we will be an agent of change for society.

Keywords: collaboration, outreach, education, seismology, earthquakes, public awareness, research opportunities

Procedia PDF Downloads 69
24963 Data Management and Analytics for Intelligent Grid

Authors: G. Julius P. Roy, Prateek Saxena, Sanjeev Singh

Abstract:

Two decades ago, power distribution utilities would collect data from their customers no more often than once a month. The advent of SmartGrid and AMI has subsequently increased the sampling frequency, leading to a 1,000- to 10,000-fold increase in data quantity. This increase is notable and has led to the coining of the term Big Data in utilities. The power distribution industry is one of the largest handlers of huge and complex data, both for keeping history and for turning the data into insight. As the majority of utilities around the globe adopt SmartGrid technologies in mass implementations and focus primarily on the strategic interdependence and synergies of the big data coming from new information sources like AMI and intelligent SCADA, there is a rising need for new models of data management and a renewed focus on analytics to dissect data into descriptive, predictive and prescriptive subsets. The goal of this paper is to bring load disaggregation into the smart energy toolkit for commercial usage.

Keywords: data management, analytics, energy data analytics, smart grid, smart utilities

Procedia PDF Downloads 775
24962 Privacy Preserving Data Publishing Based on Sensitivity in Context of Big Data Using Hive

Authors: P. Srinivasa Rao, K. Venkatesh Sharma, G. Sadhya Devi, V. Nagesh

Abstract:

Privacy-preserving data publication is a major concern at present because the amount of data being published through the internet is increasing day by day. This huge amount of data is called Big Data because of its size. This project deals with privacy preservation in the context of Big Data using a data warehousing solution called Hive. We implemented Nearest Similarity Based Clustering (NSB) with bottom-up generalization to achieve (v,l)-anonymity. (v,l)-Anonymity deals with sensitivity vulnerabilities and ensures individual privacy. We also calculate sensitivity levels by a simple comparison method using index values, classifying the different levels of sensitivity. The experiments were carried out in the Hive environment to verify the efficiency of the algorithms with Big Data. This framework also supports the execution of existing algorithms without any changes. The model in this paper outperforms existing models.
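
A toy sketch of the generalization idea only, not the paper's NSB clustering or full (v,l)-anonymity procedure: a quasi-identifier is generalized bottom-up (exact age to a ten-year band) and each record is tagged with an assumed sensitivity index. The sensitivity table and records are hypothetical.

```python
# Bottom-up generalization of a quasi-identifier plus a toy sensitivity-level lookup.
SENSITIVITY = {"flu": 1, "diabetes": 2, "HIV": 3}      # assumed index values

def generalize_age(age: int, level: int) -> str:
    if level == 0:
        return str(age)                    # level 0: exact value
    width = 10 ** level                    # level 1: 10-year band, level 2: 100-year band
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

records = [(34, "flu"), (37, "HIV"), (52, "diabetes")]
published = [(generalize_age(a, level=1), d, SENSITIVITY[d]) for a, d in records]
for row in published:
    print(row)   # ('30-39', 'flu', 1), ('30-39', 'HIV', 3), ('50-59', 'diabetes', 2)
```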

Keywords: sensitivity, sensitive level, clustering, Privacy Preserving Data Publication (PPDP), bottom-up generalization, Big Data

Procedia PDF Downloads 291
24961 Investigating Online Literacy among Undergraduates in Malaysia

Authors: Vivien Chee Pei Wei

Abstract:

Today we live in a scenario in which letters share space with images on screens that vary in size, shape, and style. The popularization of television, then the computer, and now e-readers, tablets, and smartphones has made electronic media assume the role that was previously restricted to printed materials. Since the extensive use of new technologies to produce, disseminate, collect and access electronic publications began, the changes to reading have intensified. Reading online involves more than just utilizing specific skills, strategies, and practices; it also involves negotiating multiple information sources. In this study, different perspectives on digital reading are explored in order to define the key aspects of the term. The focus is on how new technologies affect undergraduates' reading behavior, which in turn gives readers different reading levels and engagement with the text and other support materials in the same media. Also important is the relationship between reading platforms, reading levels and formats of electronic publications. The study looks at the online reading practices of about 100 undergraduates from a local university. The data, collected using a survey and interviews with the respondents, are analyzed thematically. Findings from this study show that digital and traditional reading are interrelated and should not be viewed as separate but as complementary to each other. However, reading online complicates some of the skills required by traditional reading. Consequently, in order to successfully read and comprehend multiple sources of information online, undergraduates need regular opportunities to practice and develop their skills as part of their natural reading practices.

Keywords: concepts, digital reading, literacy, traditional reading

Procedia PDF Downloads 309
24960 A Fuzzy Kernel K-Medoids Algorithm for Clustering Uncertain Data Objects

Authors: Behnam Tavakkol

Abstract:

Uncertain data mining algorithms use different ways to consider uncertainty in data such as by representing a data object as a sample of points or a probability distribution. Fuzzy methods have long been used for clustering traditional (certain) data objects. They are used to produce non-crisp cluster labels. For uncertain data, however, besides some uncertain fuzzy k-medoids algorithms, not many other fuzzy clustering methods have been developed. In this work, we develop a fuzzy kernel k-medoids algorithm for clustering uncertain data objects. The developed fuzzy kernel k-medoids algorithm is superior to existing fuzzy k-medoids algorithms in clustering data sets with non-linearly separable clusters.
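
To illustrate the kernelized, fuzzy side of the method (for ordinary point-valued data only, without the uncertain-object handling that is the paper's contribution), a sketch might look like the following: an RBF kernel-induced distance, fuzzy-c-means-style memberships, and membership-weighted medoid updates. The kernel choice, fuzzifier, and data are illustrative assumptions.

```python
# Simplified fuzzy k-medoids with a kernel-induced distance (no uncertainty handling).
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fuzzy_kernel_kmedoids(X, k=2, fuzzifier=2.0, iters=20, seed=0):
    K = rbf_kernel(X)
    diag = np.diag(K)
    D = diag[:, None] + diag[None, :] - 2 * K          # kernel-induced squared distances
    medoids = np.random.default_rng(seed).choice(len(X), size=k, replace=False)
    for _ in range(iters):
        d = D[:, medoids] + 1e-12                      # (n, k) distances to current medoids
        ratio = (d[:, :, None] / d[:, None, :]) ** (1.0 / (fuzzifier - 1))
        U = 1.0 / ratio.sum(axis=2)                    # fuzzy memberships, rows sum to 1
        # new medoid of each cluster: the point minimizing the membership-weighted cost
        medoids = np.array([np.argmin((U[:, j] ** fuzzifier) @ D) for j in range(k)])
    return medoids, U

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
medoids, U = fuzzy_kernel_kmedoids(X, k=2)
print("medoid indices:", medoids)                      # one medoid from each blob
```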

Keywords: clustering algorithm, fuzzy methods, kernel k-medoids, uncertain data

Procedia PDF Downloads 211
24959 Democracy Bytes: Interrogating the Exploitation of Data Democracy by Radical Terrorist Organizations

Authors: Nirmala Gopal, Sheetal Bhoola, Audecious Mugwagwa

Abstract:

This paper discusses the continued infringement and exploitation of data by non-state actors for destructive purposes, with an emphasis on radical terrorist organizations. It discusses how terrorist organizations access and use data to foster their nefarious agendas. It further examines how cybersecurity, designed as a tool to curb data exploitation, is ineffective in addressing global citizens' concerns about how their data can be kept safe and used for its acquired purpose. The study interrogates several policies and data protection instruments, such as the Data Protection Act, Cyber Security Policies, the Protection of Personal Information (PPI) and the General Data Protection Regulation (GDPR), to understand data use and storage in democratic states. The study outcomes point to the fact that international cybersecurity and cybercrime legislation, policies, and conventions have not curbed violations of data access and use by radical terrorist groups. The study recommends ways to enhance cybersecurity and reduce cyber risks using democratic principles.

Keywords: cybersecurity, data exploitation, terrorist organizations, data democracy

Procedia PDF Downloads 198
24958 Healthcare Data Mining Innovations

Authors: Eugenia Jilinguirian

Abstract:

In the healthcare industry, data mining is essential since it transforms the field by extracting useful information from large datasets. Data mining is the process of applying advanced analytical methods to large patient records and medical histories in order to identify patterns, correlations, and trends. Healthcare professionals can improve diagnostic accuracy, uncover hidden linkages, and predict disease outcomes by carefully examining these statistics. Additionally, data mining supports personalized medicine by tailoring treatment to the unique attributes of each patient. This proactive strategy helps allocate resources more efficiently, enhances patient care, and streamlines operations. However, to apply data mining effectively and ensure the responsible use of private healthcare information, issues like data privacy and security must be carefully considered. As technology evolves, data mining continues to be vital in the search for more effective, efficient, and individualized healthcare solutions.

Keywords: data mining, healthcare, big data, individualised healthcare, healthcare solutions, database

Procedia PDF Downloads 62
24957 Summarizing Data Sets for Data Mining by Using Statistical Methods in Coastal Engineering

Authors: Yunus Doğan, Ahmet Durap

Abstract:

Coastal regions are among the areas most heavily used by a growing population, while also being essential to the natural balance. In coastal engineering, the most valuable data concern wave behavior. The amount of this data becomes very big because observations take place over periods of hours, days and months. In this study, some statistical methods, such as wave spectrum analysis methods and standard statistical methods, have been used. The goal of this study is to discover profiles of different coastal areas using these statistical methods and, thus, to obtain an instance-based data set from the big data for analysis with data mining algorithms. In the experimental studies, six sample data sets about wave behavior, obtained from 20-minute observations in Mersin Bay in Turkey, were converted to an instance-based form, and different clustering techniques from data mining were used to discover similar coastal places. Moreover, this study discusses how this summarization approach can be used in other fields that collect big data, such as medicine.
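
As a small sketch of the summarization step (with hypothetical column names and synthetic records, not the Mersin Bay data), each station's long wave record is collapsed into one statistical instance, and the instances are then clustered to group similar coastal places.

```python
# Summarize long wave records into per-station statistical instances, then cluster them.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
raw = pd.DataFrame({
    "station": np.repeat(["A", "B", "C"], 600),
    "wave_height_m": np.concatenate([rng.normal(mu, 0.2, 600) for mu in (0.8, 0.9, 1.6)]),
})

instances = raw.groupby("station")["wave_height_m"].agg(["mean", "std", "max"])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(instances)
print(instances.assign(cluster=labels))   # stations A and B group together
```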

Keywords: clustering algorithms, coastal engineering, data mining, data summarization, statistical methods

Procedia PDF Downloads 357
24956 An Observational Study Assessing the Baseline Communication Behaviors among Healthcare Professionals in an Inpatient Setting in Singapore

Authors: Pin Yu Chen, Puay Chuan Lee, Yu Jen Loo, Ju Xia Zhang, Deborah Teo, Jack Wei Chieh Tan, Biauw Chi Ong

Abstract:

Background: Synchronous communication, such as telephone calls, remains the standard communication method between nurses and other healthcare professionals in Singapore public hospitals despite advances in asynchronous technological platforms, such as instant messaging. Although miscommunication is one of the most common causes of lapses in patient care, there is a scarcity of research characterizing baseline inter-professional healthcare communications in a hospital setting due to logistic difficulties. Objective: This study aims to characterize the frequency and patterns of communication behaviours among healthcare professionals. Methods: The one-week observational study was conducted on Monday through Sunday at the nursing station of a cardiovascular medicine and cardiothoracic surgery inpatient ward at the National Heart Centre Singapore. Subjects were shadowed by two physicians for sixteen hours or consecutive morning and afternoon nursing shifts. Communications were logged and characterized by type, duration, caller, and recipient. Results: A total of 1,023 communication events involving the attempted use of the common telephones at the nursing station were logged over a period of one week, corresponding to a frequency of one event every 5.45 minutes (SD 6.98, range 0-56 minutes). Nurses initiated the highest proportion of outbound calls (38.7%) via the nursing station common phone. A total of 179 face-to-face communications (17.5%), 362 inbound calls (35.39%), 481 outbound calls (47.02%), and 1 emergency alert (0.10%) were captured. Average response time for task-oriented communications was 159 minutes (SD 387.6, range 86-231). Approximately 1 in 3 communications captured aimed to clarify patient-related information. The total duration of time spent on synchronous communication events over one week, calculated from total inbound and outbound calls, was estimated to be a total of 7 hours. Conclusion: The results of our study showed that there is a significant amount of time spent on inter-professional healthcare communications via synchronous channels. Integration of patient-related information and use of asynchronous communication channels may help to reduce the redundancy of communications and clarifications. Future studies should explore the use of asynchronous mobile platforms to address the inefficiencies observed in healthcare communications.

Keywords: healthcare communication, healthcare management, nursing, qualitative observational study

Procedia PDF Downloads 209
24955 Access to Health Data in Medical Records in Indonesia in Terms of Personal Data Protection Principles: The Limitation and Its Implication

Authors: Anny Retnowati, Elisabeth Sundari

Abstract:

This research aims to elaborate on the meaning of personal data protection principles for patient access to health data in medical records in Indonesia and its implications. The method is normative legal research, examining health law in Indonesia regarding the patient's right to access their health data in medical records. The data are analysed qualitatively using the interpretation method to elaborate on how personal data protection principles limit patients' access to their data in medical records. The results show that patients only have the right to obtain copies of their health data in medical records. There is no right to inspect the records directly at any time. Indonesian health law thus limits the principle of patients' right to broad access to their health data in medical records. This restriction has implications for the reduction of personal data protection as part of human rights. This research contributes by showing that a limitation of personal data protection may violate human rights.

Keywords: access, health data, medical records, personal data, protection

Procedia PDF Downloads 84
24954 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler into an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. Knowledge about data is vital for organizations to ensure that data quality requirements are met and that data can be effectively utilized and sovereignly governed. As this specific knowledge has so far received little attention from academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights from various industry case studies and literature research.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 351
24953 Analysis and Forecasting of Bitcoin Price Using Exogenous Data

Authors: J-C. Leneveu, A. Chereau, L. Mansart, T. Mesbah, M. Wyka

Abstract:

Extracting and interpreting information from Big Data will remain a major stake for years to come in several sectors such as finance. Currently, numerous methods are used (such as Technical Analysis) to try to understand and anticipate market behavior, with mixed results, because it still seems impossible to exactly predict a financial trend. The increase of available data on the Internet and its diversity represent a great opportunity for the financial world. Indeed, it is possible, along with these standard financial data, to focus on exogenous data to take into account more macroeconomic factors. Coupling the interpretation of these data with standard methods could allow more precise trend predictions to be obtained. In this paper, in order to observe the influence of exogenous data on price, independent of the other usual effects occurring in classical markets, the behavior of Bitcoin users is introduced into a model reconstituting Bitcoin value, which is elaborated and tested for prediction purposes.

Keywords: big data, bitcoin, data mining, social network, financial trends, exogenous data, global economy, behavioral finance

Procedia PDF Downloads 352
24952 Parallel Pipelined Conjugate Gradient Algorithm on Heterogeneous Platforms

Authors: Sergey Kopysov, Nikita Nedozhogin, Leonid Tonkov

Abstract:

The article presents a parallel iterative solver for large sparse linear systems that can be used on a heterogeneous platform. Traditionally, the problem of solving linear systems does not scale well on multi-CPU/multi-GPU clusters. For example, most attempts to implement the classical conjugate gradient method could, at best, keep the solution time constant as the problem was enlarged. The paper proposes the pipelined variant of the conjugate gradient method (PCG), a formulation that is potentially better suited for hybrid CPU/GPU computing since it requires only one synchronization point per iteration instead of two for standard CG. The standard and pipelined CG methods need the vector entries generated by the current GPU and other GPUs for matrix-vector products, so the communication between GPUs becomes a major performance bottleneck on a multi-GPU cluster. The article presents an approach to minimize the communication between parallel parts of the algorithms. Additionally, computation and communication can be overlapped to reduce the impact of data exchange. Using the pipelined version of the CG method with one synchronization point, the possibility of asynchronous calculations and communications, and load balancing between the CPU and GPU for solving large linear systems allows for scalability. The algorithm is implemented with the combined use of the technologies MPI, OpenMP, and CUDA. We show that an almost optimal speedup on 8 CPUs/2 GPUs may be reached (relative to a one-GPU execution). The parallelized solver achieves a speedup of up to 5.49 times on 16 NVIDIA Tesla GPUs, as compared to one GPU.
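
A serial NumPy sketch of the unpreconditioned pipelined CG recurrence (in the spirit of Ghysels and Vanroose) can show where the single synchronization point arises: the two inner products per iteration form one combined reduction, and the extra matrix-vector product can be overlapped with that reduction on a real multi-GPU implementation. The dense test matrix is illustrative; this sketch only shows the algebra, not the MPI/CUDA implementation of the paper.

```python
# Pipelined CG (unpreconditioned), serial NumPy version for illustration.
import numpy as np

def pipelined_cg(A, b, tol=1e-10, max_iter=500):
    x = np.zeros_like(b)
    r = b - A @ x
    w = A @ r
    z = p = s = np.zeros_like(b)
    gamma_old = alpha = 1.0
    for i in range(max_iter):
        gamma = r @ r                  # \ these two inner products form the
        delta = w @ r                  # / single combined global reduction
        q = A @ w                      # matrix-vector product, overlappable with the reduction
        if i == 0:
            beta, alpha = 0.0, gamma / delta
        else:
            beta = gamma / gamma_old
            alpha = gamma / (delta - beta * gamma / alpha)
        z = q + beta * z               # z = A @ s
        s = w + beta * s               # s = A @ p
        p = r + beta * p
        x = x + alpha * p
        r = r - alpha * s
        w = w - alpha * z              # keeps w = A @ r without a second A @ r
        gamma_old = gamma
        if np.sqrt(gamma) < tol:
            break
    return x

A = np.diag(np.arange(1.0, 101.0))     # simple SPD test matrix
b = np.ones(100)
x = pipelined_cg(A, b)
print(np.linalg.norm(A @ x - b))       # residual norm, close to zero
```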

Keywords: conjugate gradient, GPU, parallel programming, pipelined algorithm

Procedia PDF Downloads 159