Search results for: internet data science
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27592

25102 Collision Theory Based Sentiment Detection Using Discourse Analysis in Hadoop

Authors: Anuta Mukherjee, Saswati Mukherjee

Abstract:

Data is growing every day. Social networking sites such as Twitter are becoming an integral part of our daily lives and contribute a large share of this growth. Twitter is a particularly rich source for sentiment detection and mining, since people often express honest opinions through tweets. However, although sentiment analysis is a well-researched topic for text, analyzing Twitter data poses additional challenges because tweets are unstructured, full of abbreviations, and not strictly grammatical. We have employed collision theory to perform sentiment analysis on Twitter data and have incorporated discourse analysis into the collision theory based model to detect sentiment from tweets more accurately. We have also used the retweet field to assign weights to certain tweets and obtained the overall weighting of a topic provided in the form of a query. Hadoop has been exploited for speed. Our experiments show effective results.
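
As a rough illustration of the retweet-based weighting the abstract mentions, the following sketch weights each tweet's sentiment score by its retweet count and aggregates an overall score for a query topic. The tiny lexicon, field names, and scoring rule are hypothetical; the paper's collision theory and discourse models are not reproduced here.

```python
# Minimal sketch: weight each tweet's sentiment score by its retweet count
# and aggregate an overall score for a query topic. Lexicon and field names
# are hypothetical; the collision-theory model itself is not reproduced.
LEXICON = {"good": 1, "great": 2, "bad": -1, "terrible": -2}

def tweet_score(text):
    """Sum lexicon scores of the tokens in a tweet."""
    return sum(LEXICON.get(tok.lower().strip(".,!?"), 0) for tok in text.split())

def weighted_topic_sentiment(tweets):
    """tweets: iterable of dicts with 'text' and 'retweet_count' keys."""
    num, den = 0.0, 0.0
    for t in tweets:
        w = 1 + t.get("retweet_count", 0)   # retweets boost a tweet's weight
        num += w * tweet_score(t["text"])
        den += w
    return num / den if den else 0.0

sample = [
    {"text": "great phone, battery is good", "retweet_count": 12},
    {"text": "terrible support, bad experience", "retweet_count": 3},
]
print(weighted_topic_sentiment(sample))
```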

Keywords: sentiment analysis, twitter, collision theory, discourse analysis

Procedia PDF Downloads 535
25101 Improving Fingerprinting-Based Localization System Using Generative AI

Authors: Getaneh Berie Tarekegn, Li-Chia Tai

Abstract:

With the rapid advancement of artificial intelligence, low-power built-in sensors on Internet of Things devices, and communication technologies, location-aware services have become increasingly popular and have permeated every aspect of people’s lives. Global navigation satellite systems (GNSSs) are the default method of providing continuous positioning services for ground and aerial vehicles, as well as consumer devices (smartphones, watches, notepads, etc.). However, the environment affects satellite positioning systems, particularly indoors, in dense urban and suburban cities enclosed by skyscrapers, or when deep shadows obscure satellite signals. This is because (1) indoor environments are more complicated due to the many surrounding objects; (2) reflection within a building is highly dependent on the surrounding environment, including the positions of objects and human activity; and (3) satellite signals cannot reach indoor environments, as GNSS signals do not have enough power to penetrate building walls. GPS is also highly power-hungry, which poses a severe challenge for battery-powered IoT devices. Due to these challenges, IoT applications are limited. Consequently, precise, seamless, and ubiquitous Positioning, Navigation and Timing (PNT) systems are crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarms, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. We present a semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method based on t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, SRCLoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
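
To give a feel for the fingerprint reduction step, the sketch below embeds hybrid WLAN/LTE received-signal-strength fingerprints with scikit-learn's t-SNE for offline exploration of the radio map. The data are synthetic and the S-DCGAN stage of the paper is not shown; this is only an assumed illustration of the dimensionality-reduction idea.

```python
# Sketch: project hybrid WLAN/LTE RSS fingerprints into a low-dimensional
# space with t-SNE as an offline exploration step when building a radio map.
# The data here are synthetic; the paper's S-DCGAN stage is not shown.
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
n_points, n_wlan_aps, n_lte_cells = 500, 20, 6
wlan_rss = rng.normal(-70, 10, (n_points, n_wlan_aps))   # dBm, synthetic
lte_rsrp = rng.normal(-95, 8, (n_points, n_lte_cells))   # dBm, synthetic
fingerprints = np.hstack([wlan_rss, lte_rsrp])

# 2-D embedding that preserves local neighbourhood structure of fingerprints
embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(fingerprints)
print(embedding.shape)  # (500, 2)
```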

Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine

Procedia PDF Downloads 42
25100 A Formal Approach for Instructional Design Integrated with Data Visualization for Learning Analytics

Authors: Douglas A. Menezes, Isabel D. Nunes, Ulrich Schiel

Abstract:

Most Virtual Learning Environments do not provide support mechanisms for the integrated planning, construction and follow-up of an Instructional Design supported by Learning Analytics results. The present work presents an authoring tool responsible for constructing the structure of an Instructional Design (ID), without the data being altered during the execution of the course. The visual interface presents the critical situations present in this ID, serving as a support tool for course follow-up and possible improvements, which can be made during its execution or in the planning of a new edition of the course. The model for the ID is based on High-Level Petri Nets, and the visualization forms are determined by the specific kind of data generated by an e-course: a population of students generating sequentially dependent data.

Keywords: educational data visualization, high-level petri nets, instructional design, learning analytics

Procedia PDF Downloads 243
25099 Analysis of Users’ Behavior on Book Loan Log Based on Association Rule Mining

Authors: Kanyarat Bussaban, Kunyanuth Kularbphettong

Abstract:

This research aims to create a model for analyzing student behavior in using library resources, based on data mining techniques, in the case of Suan Sunandha Rajabhat University. The model was created using association rules and the Apriori algorithm. The analysis yielded 14 rules; when the rules were tested against a test data set, classification accuracy was 79.24 percent and the MSE was 22.91. The results showed that a user behavior model built with association rule techniques can be used to manage library resources.
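
A minimal sketch of the kind of Apriori-based rule mining described here is shown below, using the mlxtend library (assumed installed). The loan transactions are illustrative only, not the university's actual loan log.

```python
# Sketch: mine association rules from book-loan transactions with the
# Apriori algorithm (mlxtend assumed installed). The transactions below
# are illustrative, not the library's actual loan data.
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

loans = [
    ["statistics", "data mining", "databases"],
    ["statistics", "data mining"],
    ["databases", "networks"],
    ["statistics", "data mining", "networks"],
]
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(loans).transform(loans), columns=te.columns_)

frequent = apriori(onehot, min_support=0.5, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.7)
print(rules[["antecedents", "consequents", "support", "confidence"]])
```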

Keywords: behavior, data mining technique, Apriori algorithm, knowledge discovery

Procedia PDF Downloads 404
25098 Prevalence of Depression among Post Stroke Survivors in South Asian Region: A Systematic Review and Meta-Analysis

Authors: Roseminu Varghese, Laveena Anitha Barboza, Jyothi Chakrabarty, Ravishankar

Abstract:

Depression among post-stroke survivors is prevalent but often goes unidentified. The purpose of this review was to determine the pooled prevalence of depression among post-stroke survivors in the South Asian region from all published health sciences research articles. The review also aimed to analyze disparities in the prevalence of depression among post-stroke survivors from different study locations. A data search to identify the relevant research articles published from 2005 to 2016 was done using MeSH terms and keywords in the Web of Science, PubMed/Medline, CINAHL, Scopus, J-Gate, and IndMED databases. The final analysis comprised 9 studies, including a population of 1,520 men and women. Meta-analysis was performed in STATA version 13.0. The overall pooled post-stroke depression prevalence was 0.46 (95% CI: 0.30-0.62). This pooled estimate is evidence of depression among post-stroke survivors in the South Asian region. Identifying post-stroke depression at an early stage is important for early intervention and for improving the outcomes of the rehabilitative process for stroke survivors.
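
For readers unfamiliar with pooling prevalence estimates, the sketch below shows a DerSimonian-Laird random-effects pooling of study-level proportions. The study counts are made up for illustration; the review's actual STATA analysis and included studies are not reproduced.

```python
# Sketch: pool prevalence estimates across studies with a DerSimonian-Laird
# random-effects model. Study counts below are made up for illustration;
# the review's actual STATA analysis is not reproduced.
import numpy as np

events = np.array([40, 55, 30, 70])      # depressed survivors per study (hypothetical)
totals = np.array([100, 120, 90, 140])   # survivors assessed per study (hypothetical)

p = events / totals
var = p * (1 - p) / totals               # within-study variance of a proportion
w = 1 / var                              # fixed-effect (inverse-variance) weights

# Between-study heterogeneity (tau^2) via DerSimonian-Laird
p_fixed = np.sum(w * p) / np.sum(w)
q = np.sum(w * (p - p_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (q - (len(p) - 1)) / c)

w_re = 1 / (var + tau2)                  # random-effects weights
p_re = np.sum(w_re * p) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled prevalence = {p_re:.2f}, 95% CI = ({p_re - 1.96*se:.2f}, {p_re + 1.96*se:.2f})")
```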

Keywords: depression, post stroke survivors, prevalence, systematic review

Procedia PDF Downloads 158
25097 Exploration of RFID in Healthcare: A Data Mining Approach

Authors: Shilpa Balan

Abstract:

Radio Frequency Identification, popularly known as RFID, is used to automatically identify and track tags attached to items. This study focuses on the application of RFID in healthcare, where its adoption is crucial for patient safety and inventory management. Data from RFID tags are used to identify the locations of patients and inventory in real time. Medical errors are thought to be a prominent cause of loss of life and injury, and a major advantage of applying RFID in the healthcare industry is the reduction of such errors. The healthcare industry has also generated huge amounts of data; by discovering patterns and trends within the data, big data analytics can help improve patient care and lower healthcare costs. The increasing number of research publications leading to innovations in RFID applications shows the importance of this technology. This study explores the current state of RFID research in healthcare using a text mining approach; to our knowledge, no study has yet examined the current state of RFID research in healthcare using a data mining approach. Related articles on RFID were collected from healthcare journals and news articles published between 2000 and 2015. Significant keywords on the topic of focus were identified and analyzed using open-source data analytics software such as RapidMiner. These analytical tools help extract pertinent information from massive volumes of data. The main benefits of adopting RFID technology in healthcare were found to include tracking medicines and equipment, upholding patient safety, and improving security. The real-time tracking features of RFID also allow for enhanced supply chain management. By using big data productively, healthcare organizations can gain significant benefits; big data analytics in healthcare enables improved decisions by extracting insights from large volumes of data.
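
The sketch below illustrates the kind of keyword-frequency step such a text-mining analysis involves, counting term occurrences across article texts. The snippets and keyword list are hypothetical and stand in for the RapidMiner workflow mentioned in the abstract.

```python
# Sketch: count keyword occurrences across a collection of article texts,
# the kind of term-frequency step a tool such as RapidMiner automates.
# The snippets and keyword list are hypothetical.
import re
from collections import Counter

articles = [
    "RFID tags help hospitals track medicines and surgical equipment in real time.",
    "Patient safety improves when RFID wristbands reduce medication errors.",
    "RFID-based inventory management lowers healthcare supply chain costs.",
]
keywords = {"rfid", "patient", "safety", "inventory", "tracking", "errors"}

counts = Counter()
for text in articles:
    tokens = re.findall(r"[a-z]+", text.lower())
    counts.update(tok for tok in tokens if tok in keywords)

for term, freq in counts.most_common():
    print(term, freq)
```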

Keywords: RFID, data mining, data analysis, healthcare

Procedia PDF Downloads 233
25096 The Importance of Knowledge Innovation for External Audit on Anti-Corruption

Authors: Adel M. Qatawneh

Abstract:

This paper aimed to determine the importance of knowledge innovation for external audit on anti-corruption in the Jordanian banks listed on the Amman Stock Exchange (ASE). The importance of the study arises from the need to recognize knowledge innovation for external audit and anti-corruption amid developments in the world of business. The variables expected to be affected by external audit innovation are: reliability of financial data, relevance of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors. To achieve the objectives of the study, a questionnaire was designed and distributed to the Jordanian banks listed on the Amman Stock Exchange. The data analysis found that the banks in Jordan attach a positive importance to knowledge innovation for external audit on anti-corruption and agree on its benefits. The statistical analysis showed that knowledge innovation for external audit had a positive impact on anti-corruption and that external audit has a statistically significant relationship with anti-corruption, reliability of financial data, consistency of financial data, full disclosure of financial data, and protection of the rights of investors.

Keywords: knowledge innovation, external audit, anti-corruption, Amman Stock Exchange

Procedia PDF Downloads 465
25095 Automated End-to-End Pipeline Processing Solution for Autonomous Driving

Authors: Ashish Kumar, Munesh Raghuraj Varma, Nisarg Joshi, Gujjula Vishwa Teja, Srikanth Sambi, Arpit Awasthi

Abstract:

Autonomous driving vehicles are revolutionizing the transportation system of the 21st century. This has been possible due to intensive research into making robust, reliable, and intelligent programs that can perceive and understand their environment and make decisions based on that understanding. It is a very data-intensive task, with data coming from multiple sensors, and the amount of data directly affects the performance of the system. Researchers have to design the preprocessing pipeline for different datasets with different sensor orientations and alignments before a dataset can be fed to the model. This paper proposes a solution that unifies all the data from different sources into a uniform format using the intrinsic and extrinsic parameters of the sensors used to capture the data, allowing the same pipeline to use data from multiple sources at a time. This also means easy adoption of new datasets or in-house generated datasets. The solution also automates the complete deep learning pipeline, from preprocessing to post-processing, for various tasks, allowing researchers to design multiple custom end-to-end pipelines. Thus, the solution takes care of input and output data handling, saving the time and effort spent on it and allowing more time for model improvement.
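
As a concrete example of how intrinsic and extrinsic parameters unify differently mounted sensors, the sketch below projects lidar points into a camera's image plane. The calibration matrices and points are illustrative placeholders, not values from any particular dataset or from the proposed pipeline.

```python
# Sketch: use a sensor's extrinsic (lidar -> camera) and intrinsic (camera)
# parameters to project lidar points into image coordinates, one way of
# unifying data from differently mounted sensors. Matrices are illustrative.
import numpy as np

K = np.array([[1000.0, 0.0, 640.0],      # intrinsic matrix (fx, fy, cx, cy)
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
# Rotation mapping lidar axes (x forward, y left, z up) to camera axes
# (x right, y down, z forward), plus a small translation in metres.
R = np.array([[0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0],
              [1.0, 0.0, 0.0]])
t = np.array([0.0, -0.08, -0.27])

points_lidar = np.array([[10.0, 1.5, 0.2],
                         [12.0, -2.0, 0.5]])

def project(points, K, R, t):
    cam = (R @ points.T).T + t           # transform into the camera frame
    cam = cam[cam[:, 2] > 0]             # keep points in front of the camera
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]      # perspective divide -> pixel coords

print(project(points_lidar, K, R, t))
```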

Keywords: augmentation, autonomous driving, camera, custom end-to-end pipeline, data unification, lidar, post-processing, preprocessing

Procedia PDF Downloads 123
25094 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues

Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid

Abstract:

New approaches to analyzing and visualizing data streams in real time are important for prompt decision-making. Financial market trading and surveillance, large-scale emergency response, and crowd control are some example scenarios that require real-time analytics and data visualization. This situation has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of Big Data and social media, new techniques and tools are required to process streaming data. Today, a range of tools implementing some of these functionalities is available. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered general information, main techniques, challenges, and open issues. The techniques for streaming text visualization are identified, in chronological order, based on the Text Visualization Browser. This paper aims to review the evolution of streaming text visualization techniques and tools, as well as to discuss the problems and challenges of each identified tool.

Keywords: information visualization, visual analytics, text mining, visual text analytics tools, big data visualization

Procedia PDF Downloads 399
25093 Churn Prediction for Telecommunication Industry Using Artificial Neural Networks

Authors: Ulas Vural, M. Ergun Okay, E. Mesut Yildiz

Abstract:

Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is suitable for analysis by machine learning methods. In this study, expenditure data of customers are analyzed using an artificial neural network (ANN). The ANN model is applied to the data of customers with different billing durations. The proposed model successfully predicts churn probabilities at 83% accuracy using only three months of expenditure data, and the prediction accuracy increases up to 89% when nine months of data are used. The experiments also show that the accuracy of the ANN model increases on an extended feature set that includes information on changes in the bill amounts.
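
The following sketch shows the general shape of such an ANN churn classifier on per-customer monthly expenditure features, using scikit-learn on synthetic data. The architecture and data are illustrative assumptions, not the paper's exact model.

```python
# Sketch: train a small feedforward ANN on per-customer monthly expenditure
# features to predict churn. Data are synthetic and the architecture is
# illustrative, not the paper's exact model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 2000
spend = rng.gamma(2.0, 30.0, (n, 9))                      # nine months of bills
trend = spend[:, -3:].mean(axis=1) - spend[:, :3].mean(axis=1)
churn = (trend + rng.normal(0, 10, n) < -5).astype(int)   # falling spend -> churn

X_train, X_test, y_train, y_test = train_test_split(spend, churn, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```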

Keywords: customer relationship management, churn prediction, telecom industry, deep learning, artificial neural networks

Procedia PDF Downloads 147
25092 The Face Sync-Smart Attendance

Authors: Bekkem Chakradhar Reddy, Y. Soni Priya, Mathivanan G., L. K. Joshila Grace, N. Srinivasan, Asha P.

Abstract:

Currently, there are many problems related to marking attendance in schools, offices, and other places, and organizations tasked with collecting daily attendance data have numerous concerns. There are different ways to mark attendance. The most commonly used method is collecting data manually by calling each student, which is a slow and error-prone process. Many new technologies now help to mark attendance automatically, reducing manual work and recording the data. We propose to implement attendance marking using these technologies and have implemented a system based on face identification and face analysis. The project was developed by gathering face images and analyzing the data, using deep learning algorithms to recognize faces effectively. The attendance record is stored and forwarded to the host by e-mail. The project was implemented in Python; the Python libraries used are OpenCV (cv2), face_recognition, and smtplib.
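
A minimal sketch of the flow the abstract describes is shown below, using the face_recognition and smtplib libraries it names. File paths, e-mail addresses, and SMTP settings are placeholders, and the real system would enroll many students rather than one.

```python
# Sketch of the attendance flow: encode enrolled faces, match faces in a
# classroom snapshot, and mail the attendance record to the host.
# Paths, addresses, and SMTP settings are placeholders.
import face_recognition
import smtplib
from email.message import EmailMessage

# 1. Encode the known (enrolled) face
known_image = face_recognition.load_image_file("students/alice.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# 2. Encode faces found in today's classroom snapshot
frame = face_recognition.load_image_file("snapshots/today.jpg")
present = []
for encoding in face_recognition.face_encodings(frame):
    if face_recognition.compare_faces([known_encoding], encoding)[0]:
        present.append("alice")

# 3. Mail the attendance record to the host
msg = EmailMessage()
msg["Subject"], msg["From"], msg["To"] = "Attendance", "bot@example.edu", "host@example.edu"
msg.set_content("Present today: " + ", ".join(present or ["nobody recognised"]))
with smtplib.SMTP("smtp.example.edu", 587) as server:
    server.starttls()
    server.login("bot@example.edu", "app-password")   # placeholder credentials
    server.send_message(msg)
```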

Keywords: python, deep learning, face recognition, cv2, smtplib, dlib

Procedia PDF Downloads 58
25091 Application of IoTs Based Multi-Level Air Quality Sensing for Advancing Environmental Monitoring in Pingtung County

Authors: Men An Pan, Hong Ren Chen, Chih Heng Shih, Hsing Yuan Yen

Abstract:

Pingtung County is located in the southernmost region of Taiwan. During the winter season, insufficient dispersion caused by the downwash of the northeast monsoon leads to the accumulation of pollutants and poor air quality in the County. The County has implemented various control methods, including air pollution permits, air pollution fees, control of oil fumes from the catering sector, smoke detection for diesel vehicles, regular inspection of locomotives, and subsidies for low-polluting vehicles. Moreover, to further mitigate air pollution, additional control strategies are also carried out, such as construction site control, prohibition of open-air burning of agricultural waste, improvement of river dust, and strengthening of road cleaning operations. The combined efforts have significantly reduced air pollutants in the County. In order to monitor the ambient air quality effectively and promptly, the County subsequently deployed a total of 400 Internet of Things (IoT) micro-sensors for PM2.5 and VOC detection, alongside 3 air quality monitoring stations of the Environmental Protection Agency (EPA), covering the 33 townships of the County. The covered area has more than 1,300 listed factories and 5 major industrial parks, thus forming an IoT-based multi-level air quality monitoring system. The IoT multi-level air quality sensors were combined with other strategies such as “sand and gravel dredging area technology monitoring”, “banning open burning”, “intelligent management of construction sites”, “real-time notification of activation response”, “nighthawk early bird plan with micro-sensors”, “unmanned aerial vehicles (UAVs) combining land and air to monitor abnormal emissions”, and “animal husbandry odour detection service”. The satisfaction rate with air quality control, measured in a 2021 public survey, reached a high of 81%, an increase of 46% compared to 2018. Air pollution complaints for the whole of 2021 totaled 4,213, in contrast to 7,088 in 2020, a reduction of almost 41%. Because of the spatial-temporal features of the micro-sensor-based IoT air quality monitoring system, the system assists and strengthens the effectiveness of the existing EPA air quality monitoring network and provides real-time control of air quality. Hot spots and potential pollution locations can therefore be determined in time for law enforcement. Remarkable results were thus obtained over the two years: both a reduction in public complaints and better air quality were achieved through the implementation of the present IoT system for real-time air quality monitoring throughout Pingtung County.

Keywords: IoT, PM, air quality sensor, air pollution, environmental monitoring

Procedia PDF Downloads 73
25090 Blockchain for the Monitoring and Reporting of Carbon Emission Trading: A Case Study on Its Possible Implementation in the Danish Energy Industry

Authors: Nkechi V. Osuji

Abstract:

The use of blockchain to address the issue of climate change is increasingly a discourse among countries, industries, and stakeholders. For a long time, the European Union (EU) has been combating climate change in industry through sustainability programs. One such program is the EU monitoring, reporting and verification (MRV) program of the EU ETS. However, the system has some key challenges and areas for improvement, which make it inefficient. The main objective of this research is to examine how blockchain can be used to address the inefficiencies of the EU ETS program for the Danish energy industry, with a focus on its monitoring and reporting framework. Applying empirical data from 13 semi-structured expert interviews, three case studies, and literature reviews, three outcomes are presented in the study. The first concerns the current conditions and challenges of monitoring and reporting CO₂ emission trading. The second considers whether blockchain is the right fit to solve these challenges, and how. The third looks at the factors that might affect the implementation of such a system and provides recommendations to mitigate these challenges. The first stage of the findings reveals that the monitoring and reporting of CO₂ emissions is a mandatory requirement by law for all energy operators under the EU ETS program. However, in reality most energy operators are non-compliant with the program, which creates a gap and causes challenges in the monitoring and reporting of CO₂ emission trading. Other challenges the study found are the lack of transparency, the lack of standardization in CO₂ accounting, and the issue of double counting in the current system. The second stage of the research was guided by three case studies and requirements engineering (RE) to explore these identified challenges and whether blockchain is the right fit to address them. This stage addressed the main research question: how can blockchain be used for monitoring and reporting CO₂ emission trading in the energy industry? Through analysis of the study data, the researcher developed a conceptual private permissioned Hyperledger blockchain and elucidated how it can address the identified challenges. In particular, the smart contract feature of blockchain was highlighted because of its ability to automate, remain immutable, and digitally enforce negotiations without a middleman. These characteristics are well suited to solving the identified issues of compliance, transparency, standardization, and double counting. The third stage of the research presents technological constraints and the high level of stakeholder collaboration required as major factors that might affect the implementation of the proposed system. The proposed conceptual model requires high-level integration with other technologies such as the Internet of Things (IoT) and machine learning; therefore, the study encourages future research in these areas. Because blockchain is continually evolving its technological capabilities, it remains a topic of interest in research and development for addressing climate change, and this study contributes to creating sustainable practices that address the global climate issue.

Keywords: blockchain, carbon emission trading, European Union emission trading system, monitoring and reporting

Procedia PDF Downloads 129
25089 Geographical Data Visualization Using Video Games Technologies

Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava

Abstract:

In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the National Institute of Geography and Statistics of Mexico (INEGI). We select a place of interest to visualize from the Landsat platform and apply some processing to the image (rotation, atmospheric correction and enhancement). The resulting image is our grayscale color map to fuse with the LIDAR data, which were selected using the same coordinates as the Landsat image. The LIDAR data are translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They will download the software and the images corresponding to a geological place of interest to a smartphone and will be able to virtually visit and explore the site with a virtual reality visor such as Google Cardboard.

Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material

Procedia PDF Downloads 246
25088 Social Media Consumption Habits within the Millennial Generation: A Comparison between the U.S. and Bangladesh

Authors: Didarul Islam Manik

Abstract:

The study was conducted to determine social media usage by the Millennial/young-adult generation in the U.S. and Bangladesh. It investigated what types of social media Millennials/young adults use in their everyday lives; for what purposes they use social media; what the significant differences between the two cultures are in terms of social media use; and how the age of the respondents correlates with differences in social media use. Among the 409 respondents, 200 were selected from the University of South Dakota and 209 from the University of Dhaka, Bangladesh. The convenience sampling method was used to select the samples. A four-page questionnaire instrument was constructed with 19 closed-ended questions that collected 87 data points. The study considered the uses and gratifications and domestication of technology models as theoretical frameworks. The study found that the Millennials spend an average of 4.5 hours on the Internet daily, including an average of 134 minutes on social media every day. However, the U.S. Millennials spend more time (141 minutes) on social media than the Bangladeshis (127 minutes). The U.S. Millennials use various types of social media, including Facebook, Twitter, YouTube, Instagram, Pinterest, Snapchat, Reddit, Imgur, etc. In contrast, Bangladeshis use Facebook, YouTube, and Google Plus. The Bangladeshis tended to spend more time on Facebook (107 minutes) than the Americans (57 minutes). The study found that the Millennials of the two countries use Facebook to fill their free time, acquire information, seek entertainment, and maintain existing relationships. However, Bangladeshis are more likely to use Facebook for the acquisition of information, entertainment, educational purposes, and connecting with the people closest to them. Millennials also use Twitter to fill their free time, acquire information, and seek entertainment. The study found a statistically significant difference between female and male social media use. It also found significant correlations between age and using Facebook for educational purposes, between age and discussing and posting on religious issues, and between age and meeting new people. There is also a correlation between age and the use of Twitter for spending time and seeking entertainment.

Keywords: American study, social media, millennial generation, South Asian studies

Procedia PDF Downloads 234
25087 Nonparametric Sieve Estimation with Dependent Data: Application to Deep Neural Networks

Authors: Chad Brown

Abstract:

This paper establishes general conditions for the convergence rates of nonparametric sieve estimators with dependent data. We present two key results: one for nonstationary data and another for stationary mixing data. Previous theoretical results often lack practical applicability to deep neural networks (DNNs). Using these conditions, we derive convergence rates for DNN sieve estimators in nonparametric regression settings with both nonstationary and stationary mixing data. The DNN architectures considered adhere to current industry standards, featuring fully connected feedforward networks with rectified linear unit activation functions, unbounded weights, and a width and depth that grow with sample size.
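
To make the architecture class concrete, the PyTorch sketch below builds a fully connected ReLU network whose width and depth grow with the sample size, and fits it to a simple dependent (AR(1)-driven) regression. The particular growth rule and data-generating process are illustrative assumptions, not the paper's conditions.

```python
# Sketch: a fully connected ReLU network whose width and depth grow with the
# sample size n, the kind of DNN sieve the paper analyses. The growth rule
# (width ~ n^(1/3), depth ~ log n) and the AR(1) regressors are illustrative.
import math
import torch
import torch.nn as nn

def make_sieve_net(n, in_dim=5):
    width = max(8, int(n ** (1 / 3)))          # grows with sample size
    depth = max(2, int(math.log(n)))           # grows with sample size
    layers, d = [], in_dim
    for _ in range(depth):
        layers += [nn.Linear(d, width), nn.ReLU()]
        d = width
    layers.append(nn.Linear(d, 1))
    return nn.Sequential(*layers)

# Dependent regressors: a simple AR(1) stand-in for mixing data
n = 2000
x = torch.zeros(n, 5)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + torch.randn(5)
y = torch.sin(x[:, :1]) + 0.1 * torch.randn(n, 1)

net = make_sieve_net(n)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()
print("final MSE:", float(loss))
```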

Keywords: sieve extremum estimates, nonparametric estimation, deep learning, neural networks, rectified linear unit, nonstationary processes

Procedia PDF Downloads 41
25086 Impact of Gender Difference on Crop Productivity: The Case of Decha Woreda, Ethiopia

Authors: Getinet Gezahegn Gebre

Abstract:

The study examined the impact of gender differences on crop productivity in Decha woreda of the southwest Kafa zone, located 140 km from Jimma Town and 460 km southwest of Addis Ababa, between Bonga town and the Omo River. The specific objectives were to assess the extent to which the agricultural production system is gender oriented, to examine access to and control over productive resources, and to estimate men's and women's productivity in agriculture. Cross-sectional data collected from a total of 140 respondents were used in this study, of which 65 were female-headed and 75 were male-headed households. The data were analyzed using the Statistical Package for the Social Sciences (SPSS). Descriptive statistics such as frequencies, means, percentages, t-tests and chi-square tests were used to summarize and compare the information between the two groups. Moreover, a Cobb-Douglas (CD) production function was used to estimate the difference in agricultural productivity between male- and female-headed households. Results of the study showed that male-headed households (MHH) own more productive resources such as land, livestock, labor and other agricultural inputs compared to female-headed households (FHH). Moreover, the estimate of the CD production function shows that livestock, herbicide use, land size and male labor were statistically significant for MHH, while livestock, land size, herbicide use and female labor were significant variables for FHH. The crop productivity difference between MHH and FHH was about 68.83% in the study area. However, if FHH had the same access to inputs as MHH, the gross value of output would be higher by 23.58% for FHH. This suggests that FHH would be more productive than MHH if they had equal access to inputs. Based on the results obtained, the following policy implications can be drawn: giving FHH access to inputs that increase agricultural productivity, such as herbicides, livestock and male labor; increasing the productivity of land; and introducing technologies that reduce the time and energy demands on women, especially for enset processing.
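
For readers unfamiliar with the Cobb-Douglas approach, the sketch below estimates a CD production function by ordinary least squares on logs, separately for male- and female-headed households. The input variables and data are synthetic stand-ins, not the Decha woreda survey data.

```python
# Sketch: estimate a Cobb-Douglas production function
#   ln(output) = a + b1*ln(land) + b2*ln(labor) + b3*ln(livestock)
# by OLS on logs, separately for MHH and FHH. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)

def simulate(n, tfp):
    land = rng.uniform(0.5, 3.0, n)
    labor = rng.uniform(1.0, 6.0, n)
    livestock = rng.uniform(0.5, 8.0, n)
    out = tfp * land**0.4 * labor**0.3 * livestock**0.2 * np.exp(rng.normal(0, 0.1, n))
    return land, labor, livestock, out

def fit_cd(land, labor, livestock, out):
    X = np.column_stack([np.ones_like(out), np.log(land), np.log(labor), np.log(livestock)])
    beta, *_ = np.linalg.lstsq(X, np.log(out), rcond=None)
    return beta   # [ln(TFP), elasticity_land, elasticity_labor, elasticity_livestock]

for label, tfp, n in [("MHH", 2.0, 75), ("FHH", 1.2, 65)]:
    print(label, np.round(fit_cd(*simulate(n, tfp)), 3))
```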

Keywords: gender difference, crop productivity, GDP, efficiency

Procedia PDF Downloads 74
25085 The Post-Hegemony of Post-Capitalism: Towards a Political Theory of Open Cooperativism

Authors: Vangelis Papadimitropoulos

Abstract:

The paper is part of the research project "Techno-Social Innovation in the Collaborative Economy", funded by the Hellenic Foundation for Research and Innovation for the years 2022-2024. The research project examines the normative and empirical conditions of grassroots, technologically driven innovation that could enable the transition towards a commons-oriented post-capitalist economy. The project carries out a conceptually led and empirically grounded multi-case study of the digital commons, open-source technologies, platform cooperatives, open cooperatives and Distributed Autonomous Organizations (DAOs) on the blockchain. The methodological scope of the research is interdisciplinary inasmuch as it comprises political theory, economics, sustainability science and computer science, among others. The research draws specifically on Michel Bauwens and Vasilis Kostakis' model of open cooperativism between the commons, ethical market entities and a partner state. Bauwens and Kostakis advocate a commons-based counter-hegemonic post-capitalist transition beyond and against neoliberalism. The research further employs Laclau and Mouffe's discourse theory of hegemony to introduce a post-hegemonic conceptualization of the model of open cooperativism. Thus, the paper aims to outline the theoretical contribution of the research project to contemporary political theory debates on post-capitalism and the collaborative economy.

Keywords: open cooperativism, techno-social innovation, post-hegemony, post-capitalism

Procedia PDF Downloads 66
25084 Development of Risk Management System for Urban Railroad Underground Structures and Surrounding Ground

Authors: Y. K. Park, B. K. Kim, J. W. Lee, S. J. Lee

Abstract:

To assess the risk of underground structures and the surrounding ground, we collect basic data through engineering measurements, exploration and surveys, and derive the risk through appropriate analysis and individual assessments of urban railroad underground structures and the surrounding ground, including station inflow. Basic data are obtained from fiber-optic sensors, MEMS sensors, water quantity/quality sensors, a tunnel scanner, ground penetrating radar and a light weight deflectometer, and are evaluated against their proper threshold values. Based on these data, we analyze the risk level of urban railroad underground structures and the surrounding ground, and we develop a risk management system to manage these data efficiently and to provide a convenient interface for data input and output.

Keywords: urban railroad, underground structures, ground subsidence, station inflow, risk

Procedia PDF Downloads 336
25083 Smartphone Video Source Identification Based on Sensor Pattern Noise

Authors: Raquel Ramos López, Anissa El-Khattabi, Ana Lucila Sandoval Orozco, Luis Javier García Villalba

Abstract:

An increasing number of mobile devices with integrated cameras has meant that most digital video comes from these devices. These digital videos can be made anytime, anywhere and for different purposes. They can also be shared on the Internet in a short period of time and may sometimes contain recordings of illegal acts. The need to reliably trace their origin becomes evident when these videos are used for forensic purposes. This work proposes an algorithm to identify the brand and model of the mobile device that generated a video. The procedure is as follows: after obtaining the relevant video information, a classification algorithm based on sensor noise and the wavelet transform performs the identification. We also present experimental results that support the validity of the techniques used and show promising results.
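
To illustrate the sensor-noise idea, the sketch below extracts a noise residual from a grayscale frame by subtracting a wavelet-denoised version of it; averaging such residuals over many key frames approximates a PRNU fingerprint. PyWavelets is assumed, the parameters are arbitrary, and the paper's classification stage is not reproduced.

```python
# Sketch: extract a sensor-noise residual from a (grayscale) video frame by
# subtracting a wavelet-denoised version of it; averaging residuals over many
# key frames approximates a PRNU fingerprint. PyWavelets assumed installed.
import numpy as np
import pywt

def noise_residual(frame, wavelet="db8", level=4, sigma=3.0):
    coeffs = pywt.wavedec2(frame.astype(float), wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    denoised_details = [
        tuple(pywt.threshold(d, sigma, mode="soft") for d in level_detail)
        for level_detail in details
    ]
    denoised = pywt.waverec2([approx] + denoised_details, wavelet)
    denoised = denoised[: frame.shape[0], : frame.shape[1]]
    return frame - denoised                      # what remains is mostly sensor noise

frames = [np.random.rand(128, 128) * 255 for _ in range(10)]   # stand-in key frames
fingerprint = np.mean([noise_residual(f) for f in frames], axis=0)
print(fingerprint.shape)
```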

Keywords: digital video, forensics analysis, key frame, mobile device, PRNU, sensor noise, source identification

Procedia PDF Downloads 428
25082 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential to building smarter cities. Machine learning-based transportation prediction is a highly promising approach because it makes invisible aspects of the transportation system visible. In this context, this research aims to build a prototype model that predicts conditions on a transportation network by using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions, so bus delays often occur. To overcome this problem, a prediction model is presented to find patterns of bus delay using machine learning on the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the results. The prototype model is built from real-time bus data. The data are gathered through public data portals and real-time Application Programming Interfaces (APIs) provided by the government. These data are fundamental resources for organizing interval pattern models of bus operations from traffic environment factors (road speeds, station conditions, weather, and real-time bus operating information). The prototype model was designed with a machine learning tool (RapidMiner Studio), and tests were conducted for bus delay prediction. This research presents experiments to increase prediction accuracy for bus headway by analyzing urban big data. Big data analysis is important for predicting the future and finding correlations by processing huge amounts of data. Therefore, based on this analysis method, this research represents an effective use of machine learning and urban big data to understand urban dynamics.
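
The sketch below illustrates the general pattern of assembling traffic, weather, and bus-status features and fitting a model that predicts delay. The data are synthetic and scikit-learn stands in for the RapidMiner Studio workflow described in the abstract.

```python
# Sketch: assemble traffic, weather, and bus-status features and fit a model
# that predicts delay at the next stop. Data are synthetic; scikit-learn is
# used here in place of the RapidMiner Studio workflow from the abstract.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 3000
df = pd.DataFrame({
    "road_speed_kmh": rng.normal(30, 8, n),
    "rain_mm": rng.exponential(1.0, n),
    "passengers_boarding": rng.poisson(6, n),
    "scheduled_headway_min": rng.choice([8, 10, 12], n),
})
df["delay_min"] = (0.3 * df.rain_mm + 0.2 * df.passengers_boarding
                   - 0.05 * df.road_speed_kmh + rng.normal(0, 1, n)).clip(lower=0)

X = df.drop(columns="delay_min")
X_train, X_test, y_train, y_test = train_test_split(X, df.delay_min, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out trips:", round(model.score(X_test, y_test), 3))
```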

Keywords: big data, machine learning, smart city, social cost, transportation network

Procedia PDF Downloads 260
25081 Dental Students' Acquired Knowledge of the Pre-Contemplation Stage of Change

Authors: S. Curtin, A. Trace

Abstract:

Introduction: As patients can often be ambivalent about or resistant to any change in their smoking behavior, the traditional '5 As' model may be limited, as it assumes that patients are ready and motivated to change. However, there is a stage model that is helpful in giving guidance to dental students: the Transtheoretical Model (TTM). This model allows students to understand the tasks and goals of the pre-contemplation stage. The TTM was introduced at an early stage as a core component of a smoking cessation programme integrated into a Behavioral Science programme as applied to dentistry. The aim of the present study is to evaluate and illustrate the students' current level of knowledge from the questions the students generated in order to engage patients in the tasks and goals of the pre-contemplation stage. Method: N=47 responses of fifth-year undergraduate dental students. These responses were the data set for this study and related to their knowledge of appropriate questions for a dentist to ask at the pre-contemplation stage of change. A deductive-descriptive analysis was conducted on the data. The goals and tasks of the pre-contemplation stage of the TTM provided a template for this deductive analysis. Results: 51% of students generated relevant, open, exploratory questions for the pre-contemplation stage, whilst 100% of students generated closed questions. With regard to the questions appropriate for the pre-contemplation stage, 19% were open and exploratory, while 66% were closed questions. A deductive analysis of the open exploratory questions revealed that 53% of the questions addressed increased concern about the current pattern of behavior, 38% concerned increased awareness of a need for change, and only 8% dealt with envisioning the possibility of change. Conclusion: All students formulated relevant questions for the pre-contemplation stage, and half of the students generated the open, exploratory questions that increase patients' awareness of the need to change. More training is required to facilitate a shift in formulation from closed to open questioning, especially given that, traditionally, smoking cessation was modeled on the '5 As', and that the general training for dentists supports an advisory and directive approach.

Keywords: behaviour change, pre-contemplation stage, trans-theoretical model, undergraduate dentistry students

Procedia PDF Downloads 413
25080 Integrated Model for Enhancing Data Security Performance in Cloud Computing

Authors: Amani A. Saad, Ahmed A. El-Farag, El-Sayed A. Helali

Abstract:

Cloud computing is an important and promising field of the recent decade. Cloud computing allows sharing resources, services and information among people around the whole world. Although the advantages of using clouds are great, there are many risks in a cloud, and data security is the most important and critical problem of cloud computing. In this research, a new security model for cloud computing is proposed to ensure a secure communication system, hide information from other users and save users' time. In this proposed model, the Blowfish encryption algorithm is used for exchanging information or data, and the SHA-2 cryptographic hash algorithm is used for data integrity. For user authentication, a username and password are used; the password is protected with SHA-2 as a one-way hash. The proposed system shows an improvement in the processing time of uploading and downloading files to and from the cloud in secure form.
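
A minimal sketch of the described combination is shown below: Blowfish for file confidentiality, SHA-256 (a SHA-2 variant) for integrity checking and one-way password storage. PyCryptodome is assumed, key management is simplified, and this is not presented as the paper's exact implementation.

```python
# Sketch of the described combination: Blowfish for file confidentiality,
# SHA-256 (a SHA-2 variant) for integrity and one-way password storage.
# PyCryptodome assumed; key handling is deliberately simplified.
import hashlib
from Crypto.Cipher import Blowfish
from Crypto.Random import get_random_bytes
from Crypto.Util.Padding import pad, unpad

def hash_password(password: str) -> str:
    return hashlib.sha256(password.encode()).hexdigest()   # stored instead of the password

def encrypt_file(data: bytes, key: bytes):
    cipher = Blowfish.new(key, Blowfish.MODE_CBC)
    ciphertext = cipher.encrypt(pad(data, Blowfish.block_size))
    digest = hashlib.sha256(data).hexdigest()               # integrity check value
    return cipher.iv, ciphertext, digest

def decrypt_file(iv: bytes, ciphertext: bytes, key: bytes, digest: str) -> bytes:
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, iv)
    data = unpad(cipher.decrypt(ciphertext), Blowfish.block_size)
    assert hashlib.sha256(data).hexdigest() == digest, "integrity check failed"
    return data

key = get_random_bytes(16)
iv, ct, dg = encrypt_file(b"confidential report", key)
print(decrypt_file(iv, ct, key, dg))
```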

Keywords: cloud computing, data security, SaaS, PaaS, IaaS, Blowfish

Procedia PDF Downloads 477
25079 The Impact of Information and Communications Technology (ICT)-Enabled Service Adaptation on Quality of Life: Insights from Taiwan

Authors: Chiahsu Yang, Peiling Wu, Ted Ho

Abstract:

From emphasizing economic development to stressing public happiness, the international community mainly hopes to understand whether the public's quality of life is improving. The Better Life Index (BLI) constructed by the OECD uses living conditions and quality of life as starting points to cover 11 areas of life and to convey the state of the general public's well-being. In light of the BLI framework, the Directorate General of Budget, Accounting and Statistics (DGBAS) of the Executive Yuan instituted the Gross National Happiness Index to understand the needs of the general public and to measure the progress of the aforementioned conditions in residents across the island. Living conditions consist of income and wealth, jobs and earnings, and housing conditions, while quality of life covers health status, work-life balance, education and skills, social connections, civic engagement and governance, environmental quality, and personal security. The ICT area consists of health care, living environment, ICT-enabled communication, transportation, government, education, pleasure, purchasing, and jobs and employment. In the wake of further science and technology development, the rapid formation of information societies, and closer integration between lifestyles and information societies, the public's well-being within information societies has become a noteworthy topic. The Board of Science and Technology of the Executive Yuan uses the OECD's BLI as a reference in the establishment of the Taiwan-specific ICT-Enabled Better Life Index, whose content covers the ICT areas listed above. Using this index, the government plans to examine whether the public's quality of life is improving, as well as to measure the public's satisfaction with the current digital quality of life. This understanding will enable the government to gauge the degree of influence and impact that each dimension of digital services has on digital life happiness, while also serving as an important reference for promoting digital service development. Information and communications technology (ICT) has been affecting people's lifestyles and further impacts people's quality of life (QoL). Even though studies have shown that ICT access and usage have both positive and negative impacts on life satisfaction and well-being, many governments continue to invest in e-government programs to initiate their path to an information society. This research is one of the few attempts to link an e-government benchmark to subjective well-being perception and to address the gap between users' perceptions and existing hard-data assessments; it then proposes a model to trace measurement results back to the original public policy so that policy makers can justify their future proposals.

Keywords: information and communications technology, quality of life, satisfaction, well-being

Procedia PDF Downloads 355
25078 Cursive Handwriting in an Internet Age

Authors: Karen Armstrong

Abstract:

Recent concerns about the value of teaching cursive handwriting in the classroom are based on the belief that cursive handwriting, or penmanship, is an outdated and unnecessary skill in today's online world. The discussion of this issue begins with a description of current initiatives to eliminate handwriting instruction in schools. This is followed by a brief history of cursive writing through the ages. Next considered is a description of its benefits as a preliminary process for younger children as compared with immediate instruction in keyboarding, particularly in the areas of vision, cognition, motor skills and automatic fluency. Also considered is cursive's companion, paper itself, and the impact of a paperless, "screen and keyboard" environment. The discussion concludes with a consideration of the unique contributions of cursive and keyboarding as written forms of communication, along with their respective surfaces, paper and screen. Finally, an assessment of the practical utility of each skill is followed by an informal assessment of what is lost and what remains as we move from a predominantly paper-and-pen world of handwriting to texting and keyboarding in an environment of screens.

Keywords: asemic writing, cursive, handwriting, keyboarding, paper

Procedia PDF Downloads 271
25077 Attitude Towards E-Learning: A Case of University Teachers and Students

Authors: Muhamamd Shahid Farooq, Maazan Zafar, Rizawana Akhtar

Abstract:

E-learning technologies are the blessings of advancements in science and technology. They enable learners to obtain information at any place and any time, improving their self-confidence, self-efficacy and effectiveness in the teaching-learning process. E-learning provides an individualized learning experience for learners and removes barriers faced by students in accessing new and creative ways of gaining information. It provides a wide range of facilities that enable teachers and students to engage in effective and purposeful learning. This study was conducted to explore the attitudes towards e-learning of university students and teachers working in a metropolitan university of Pakistan. The personal, institutional and technological characteristics of the teachers and students of a higher education institution affect the adoption of e-learning. For this descriptive study, 449 students and 35 university teachers were surveyed using a Likert-type questionnaire consisting of 52 statements relating to six factors: perceived usefulness, intention to adopt e-learning, ease of e-learning use, availability of resources, e-learning stressors, and pressure to use e-learning. Data were analyzed by making comparisons on the basis of different demographic factors. The findings of the study show that both types of respondents have a positive attitude towards e-learning. However, the male and female respondents differ in their opinions on e-learning implementation.

Keywords: e-learning, ICT, e-sources of learning, questionnaire

Procedia PDF Downloads 527
25076 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, decision making in sports, such as selecting the members who play in a game and choosing the game strategy based on the analysis of accumulated sports data, has been widely attempted. In fact, in the NBA, the basketball league where the world's highest-level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze game data for each play, such as ball tracking or the motion of players, because the game situation changes rapidly and the data structure is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for determining the optimal lineup composition using real-time play data, a task considered difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for player replacement are "whether or not the lineup should be changed" and "whether or not a Small Ball lineup should be adopted". Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, we can accumulate scoring data for each play, which indicates a player's contribution to the game, and these scoring data can be treated as time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (recurrent neural network) model, which can analyze time series data, with an NN (neural network) model, which can analyze the situation on the court, to build a score prediction model. This model is capable of identifying the current optimal lineup for different situations. In this research, we collected the accumulated NBA data from the 2019-2020 season and then applied the method to actual basketball play data to verify the reliability of the proposed model.
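
The PyTorch sketch below illustrates the general shape of such a combined model: an LSTM summarizes a lineup's recent per-play scoring sequence, a feedforward branch encodes situation features (for example, whether a Small Ball lineup is on the floor), and a head predicts the score of the next interval. Shapes and data are synthetic assumptions, not the paper's exact model.

```python
# Sketch: combine an LSTM over a lineup's recent per-play scoring sequence
# with a feedforward branch for situation/lineup features to predict the
# score of the next interval. Shapes and data are synthetic.
import torch
import torch.nn as nn

class LineupScoreModel(nn.Module):
    def __init__(self, seq_features=4, situation_features=6, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(seq_features, hidden, batch_first=True)
        self.situation = nn.Sequential(nn.Linear(situation_features, hidden), nn.ReLU())
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, plays, situation):
        _, (h, _) = self.rnn(plays)                  # summary of the scoring time series
        s = self.situation(situation)                # encoding of the current lineup/situation
        return self.head(torch.cat([h[-1], s], dim=1))

model = LineupScoreModel()
plays = torch.randn(16, 50, 4)        # batch of 16 lineups, 50 plays, 4 features each
situation = torch.randn(16, 6)        # current lineup / Small Ball indicator features
predicted_score = model(plays, situation)
print(predicted_score.shape)          # torch.Size([16, 1])
```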

Keywords: recurrent neural network, players lineup, basketball data, decision making model

Procedia PDF Downloads 133
25075 Interactive and Innovative Environments for Modeling Digital Educational Games and Animations

Authors: Ida Srdić, Luka Mandić, Lidija Mandić

Abstract:

Digitization and the intensive use of tablets, smartphones, the internet, and mobile and web applications have massively disrupted our habits and the way audiences (especially youth) consume content. Introducing educational content into games and animations while keeping it interesting and compelling for kids is a challenge. In our work, we compare the different possibilities and potentials that digital games could provide to mitigate the direct connection with education. We analyze the main directions and educational methods in game-based learning and the possibilities of interactive modeling through questionnaires on user experience and requirements. A pre- and post-quantitative survey will be conducted in order to measure levels of objective knowledge as well as the perception of the games. This approach enables a quantitative and objective evaluation of the impact the game has on participants. We also discuss the main barriers to the use of games in education and how games can best be used for learning.

Keywords: Bloom’s taxonomy, epistemic games, learning objectives, virtual learning environments

Procedia PDF Downloads 98
25074 Semi-automatic Design and Fabrication of Ring-Bell Control by IoT

Authors: Samart Rungjarean, Benchalak Muangmeesri, Dechrit Maneetham

Abstract:

Bell ringing by monks and novices may face some restrictions, for example during rain, because of the structure or location of the bell, or at certain times. Alternatively, some temple bells are found atop tall, difficult-to-reach bell towers. As a result, the concept of a brass bell that can be rung from a mobile phone over a long distance was proposed. An Internet of Things (IoT) system is used to control the bell, and striking was tested with each of three beater heads: a wooden head, a stone head, and a steel beater. The sound resonates well, with the striking distance and rhythm contributing to this. The control system uses an ESP8266 microcontroller to manage its operations and communicates with a pneumatic system to deliver the strike, while a mobile phone is used to operate the entire system and to precisely direct and regulate the rhythm. In this test, the resonance was roughly 50 dB, the operating distance can be adjusted, and timing and accuracy were both good.
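
A MicroPython sketch of the control idea on an ESP8266 is given below: the board joins the local Wi-Fi network, listens for a simple HTTP request from a phone, and pulses the GPIO pin that drives the pneumatic striker. The pin number, Wi-Fi credentials, and strike timing are placeholders, not values from the actual device.

```python
# MicroPython sketch (ESP8266): listen for an HTTP request from a phone and
# pulse the GPIO pin that drives the pneumatic striker. Pin number, Wi-Fi
# credentials, and timing are placeholders.
import socket
import time
import network
from machine import Pin

valve = Pin(5, Pin.OUT)                      # GPIO driving the pneumatic valve (placeholder)

wlan = network.WLAN(network.STA_IF)
wlan.active(True)
wlan.connect("temple-wifi", "password")      # placeholder credentials
while not wlan.isconnected():
    time.sleep(0.5)

addr = socket.getaddrinfo("0.0.0.0", 80)[0][-1]
server = socket.socket()
server.bind(addr)
server.listen(1)

while True:
    client, _ = server.accept()
    request = client.recv(512)
    if b"GET /ring" in request:              # phone opens http://<esp-ip>/ring
        valve.on()                           # strike the bell
        time.sleep(0.3)
        valve.off()
        client.send(b"HTTP/1.1 200 OK\r\n\r\nrung")
    else:
        client.send(b"HTTP/1.1 200 OK\r\n\r\nready")
    client.close()
```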

Keywords: automatic ring-bell, microcontroller, ring-bell, iot

Procedia PDF Downloads 111
25073 Challenges in Multi-Cloud Storage Systems for Mobile Devices

Authors: Rajeev Kumar Bedi, Jaswinder Singh, Sunil Kumar Gupta

Abstract:

The demand for cloud storage is increasing because users want continuous access to their data. Cloud storage has revolutionized the way users access their data. Many cloud storage service providers are available, such as Dropbox and G Drive, offering limited free storage; for extra storage, users have to pay, which becomes a burden on users. To avoid the issue of limited free storage, the concept of multi-cloud storage was introduced. In this paper, we discuss the limitations of existing multi-cloud storage systems for mobile devices.

Keywords: cloud storage, data privacy, data security, multi cloud storage, mobile devices

Procedia PDF Downloads 699