Search results for: R data science
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26139

24279 Global City Typologies: 300 Cities and Over 100 Datasets

Authors: M. Novak, E. Munoz, A. Jana, M. Nelemans

Abstract:

Cities and local governments the world over are interested in employing circular strategies as a means to bring about food security, create employment, and increase resilience. The selection and implementation of circular strategies are facilitated by modeling the effects of strategies locally and by understanding the impacts such strategies have had in other (comparable) cities and how those would translate locally. Urban areas are heterogeneous in their geographic, economic, and social characteristics, governance, and culture. In order to better understand the effect of circular strategies on urban systems, we create a dataset for over 300 cities around the world designed to facilitate circular strategy scenario modeling. This new dataset integrates data from over 20 prominent global, national, and urban data sources, such as the Global Human Settlements layer and the International Labour Organisation, and also incorporates employment data from over 150 cities collected bottom-up from local departments and data providers. The dataset is built to be reproducible. Various clustering techniques are explored in the paper. The result is sets of clusters of cities, which can be used for further research and analysis and can support comparative, regional, and national policy making on circular cities.
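Because the integrated city indicators come from heterogeneous sources with very different units, they would normally be standardized before any clustering. A minimal sketch of that preprocessing step, with invented indicator columns and values standing in for the paper's dataset:

```python
# Hypothetical sketch: z-score standardization of heterogeneous city
# indicators so clustering is not dominated by large-magnitude units.
# Column meanings and values are invented for illustration.
def zscore_columns(rows):
    """Return rows with each numeric column standardized to mean 0, std 1."""
    cols = list(zip(*rows))
    out_cols = []
    for col in cols:
        mean = sum(col) / len(col)
        var = sum((v - mean) ** 2 for v in col) / len(col)
        std = var ** 0.5 or 1.0  # guard against constant columns
        out_cols.append([(v - mean) / std for v in col])
    return [list(r) for r in zip(*out_cols)]

# Each row: [population (millions), employment rate (%), waste recycled (%)]
cities = [[10.5, 61.0, 22.0], [0.8, 55.0, 48.0], [3.2, 64.0, 35.0]]
scaled = zscore_columns(cities)
```

After scaling, every indicator contributes on a comparable scale to whichever clustering technique is applied downstream.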

Keywords: data integration, urban innovation, cluster analysis, circular economy, city profiles, scenario modelling

Procedia PDF Downloads 170
24278 Clustering Categorical Data Using the K-Means Algorithm and the Attribute’s Relative Frequency

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

Clustering is a well-known data mining technique used in pattern recognition and information retrieval. The initial dataset to be clustered can contain either categorical or numeric data, and each type of data has its own specific clustering algorithm. In this context, two algorithms are commonly used: k-means for clustering numeric datasets and k-modes for categorical datasets. A main problem encountered in data mining applications is clustering the categorical datasets that are so prevalent in practice. One way to achieve the clustering of categorical values is to transform the categorical attributes into numeric measures and directly apply the k-means algorithm instead of k-modes. In this paper, we propose and evaluate an approach based on this idea, transforming the categorical values into numeric ones using the relative frequency of each modality in the attributes. The proposed approach is compared with a previously published method based on transforming the categorical datasets into binary values. The scalability and accuracy of the two methods are evaluated experimentally. The obtained results show that our proposed method outperforms the binary method in all cases.
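A minimal sketch of the encoding idea as described: each categorical value is replaced by its relative frequency within its attribute, turning the table into a numeric one that standard k-means can process. The records and attribute meanings below are invented for illustration.

```python
# Relative-frequency encoding of categorical attributes (sketch).
from collections import Counter

def frequency_encode(records):
    """Map every categorical value to its relative frequency per attribute."""
    n = len(records)
    encoded_cols = []
    for col in zip(*records):
        freq = Counter(col)  # occurrences of each modality in this attribute
        encoded_cols.append([freq[v] / n for v in col])
    return [list(row) for row in zip(*encoded_cols)]

data = [["red", "small"], ["red", "large"], ["blue", "small"], ["red", "small"]]
numeric = frequency_encode(data)
# "red" appears in 3 of 4 records, so it encodes to 0.75 in the first column.
```

The resulting numeric matrix can then be fed to any standard k-means implementation in place of k-modes.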

Keywords: clustering, unsupervised learning, pattern recognition, categorical datasets, knowledge discovery, k-means

Procedia PDF Downloads 246
24277 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two, namely the reflective model (reflecting) and the formative model (forming). Before carrying out further tests on SEM, there are assumptions that must be met, namely the linearity assumption, to determine the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric, and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data that serve as the basis for generating simulation data. Simulation data were generated with sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of relationship studied were linear and quadratic, with one and two knot points and various levels of error variance (EV = 0.5; 1; 5). Three levels of closeness of relationship were used in the measurement model: low (0.1-0.3), medium (0.4-0.6), and high (0.7-0.9). The best model was obtained for the linear form of the relationship between X1 and Y1. In the measurement model, a characteristic of the reflective model was obtained, namely that the higher the closeness of the relationship, the better the resulting model. The originality of this research is the development of semiparametric SEM, which has not been widely studied by researchers.
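The truncated spline in the semiparametric approach is commonly built from the truncated power basis: powers of x up to the spline degree, plus one truncated term per knot. A hedged sketch of that basis (the degree, knot, and evaluation point below are example values, not the paper's):

```python
# Truncated power basis for a spline of given degree with given knots:
# x, x**2, ..., x**degree, followed by max(0, x - k)**degree per knot k.
def truncated_spline_basis(x, degree, knots):
    """Return the truncated power basis evaluated at x."""
    terms = [x ** p for p in range(1, degree + 1)]
    terms += [max(0.0, x - k) ** degree for k in knots]
    return terms

# Quadratic spline (degree 2) with one knot at x = 1, evaluated at x = 2:
basis = truncated_spline_basis(2.0, 2, [1.0])
# → [2.0, 4.0, 1.0] : the terms x, x², and (x-1)₊²
```

Regressing a response on these basis columns yields the piecewise-polynomial fit whose knot count and placement the paper varies in its simulations.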

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 21
24276 “Laws Drifting Off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology

Authors: Amarendar Reddy Addula

Abstract:

Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Typical applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is a foundational medium for digital business, according to a new report by Gartner. The last 10 years represent an advance period in AI's development, spurred by a confluence of factors including the rise of big data, advancements in computing infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. Extending AI to a broader set of use cases and users is gaining popularity because it improves AI's versatility, efficiency, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is a marquee term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, accountability, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and objects, but it can also be used for code generation, creating synthetic structured data, and designing pharmaceuticals and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects.
Frequently, the two are conflated and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, conceptual, relates to the idea and content of ethics; the second, functional, concerns its relationship with the law. Both set up models of social behaviour, but they are different in scope and nature. The juridical analysis is grounded on a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of the AI as a primary step towards the definition of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or diverse nature of the AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the primary legal framework for the regulation of AI.

Keywords: artificial intelligence, ethics & human rights issues, laws, international laws

Procedia PDF Downloads 80
24275 Quality Assurance for the Climate Data Store

Authors: Judith Klostermann, Miguel Segura, Wilma Jans, Dragana Bojovic, Isadora Christel Jimenez, Francisco Doblas-Reyes, Judit Snethlage

Abstract:

The Climate Data Store (CDS), developed by the Copernicus Climate Change Service (C3S) implemented by the European Centre for Medium-Range Weather Forecasts (ECMWF) on behalf of the European Union, is intended to become a key instrument for exploring climate data. The CDS contains both raw and processed data to provide users with information about the past, present, and future climate of the Earth. It allows easy and free access to climate data and indicators, presenting an important asset for scientists and stakeholders on the path to a more sustainable future. The C3S Evaluation and Quality Control (EQC) function assesses the quality of the CDS by undertaking a comprehensive user requirement assessment to measure users' satisfaction. Recommendations will be developed for the improvement and expansion of the CDS datasets and products. User requirements will be identified regarding the fitness of the datasets, the toolbox, and the overall CDS service. The EQC function will help C3S make the service more robust: built on validated data that follow high-quality standards while remaining user-friendly. This function will be developed in close collaboration with the users of the service. Through their feedback, suggestions, and contributions, the CDS can become more accessible and meet the requirements of a diverse range of users. Stakeholders and their active engagement are thus an important aspect of CDS development. This will be achieved through direct interactions with users, such as meetings, interviews, or workshops, as well as through feedback mechanisms like surveys or helpdesk services at the CDS. The results provided by the users will be categorized by CDS product so that their specific interests can be monitored and linked to the right product.
Through this procedure, we will identify the requirements and criteria for data and products in order to build the corresponding recommendations for the improvement and expansion of the CDS datasets and products.

Keywords: climate data store, Copernicus, quality, user engagement

Procedia PDF Downloads 135
24274 Quantifying the Methods of Monitoring Timers in Electric Water Heater for Grid Balancing on Demand-Side Management: A Systematic Mapping Review

Authors: Yamamah Abdulrazaq, Lahieb A. Abrahim, Samuel E. Davies, Iain Shewring

Abstract:

An electric water heater (EWH) is a power-hungry appliance used in residential, commercial, and industrial settings, and the ability to control EWHs properly results in cost savings and the prevention of blackouts on the national grid. This article discusses the usage of timers in EWH control strategies for demand-side management (DSM). To the authors' knowledge, no systematic mapping review focusing on the utilisation of EWH control strategies in DSM has yet been conducted. Consequently, the purpose of this research is to identify and examine the main papers exploring EWH procedures in DSM by quantifying and categorising information with regard to publication year and source, kind of method, and source of data for monitoring control techniques. In order to answer the research questions, a total of 31 publications published between 1999 and 2023 were selected based on specific inclusion and exclusion criteria. The data indicate that direct load control (DLC) has been somewhat more prevalent than indirect load control (ILC). Additionally, mixed methods are much less common than the other techniques, and the proportion of real-time data (RTD) to non-real-time data (NRTD) is approximately equal.

Keywords: demand side management, direct load control, electric water heater, indirect load control, non real-time data, real-time data

Procedia PDF Downloads 68
24273 Implications of Circular Economy on Users' Data Privacy: A Case Study on the Android Smartphones Second-Hand Market

Authors: Mariia Khramova, Sergio Martinez, Duc Nguyen

Abstract:

Modern electronic devices, particularly smartphones, are characterised by an extremely high environmental footprint and a short product lifecycle. Every year manufacturers release new models with even more superior performance, which pushes customers towards new purchases. As a result, millions of devices are accumulating in the urban mine. To tackle these challenges, the concept of the circular economy has been introduced to promote the repair, reuse, and recycling of electronics. In this case, electronic devices that previously ended up in landfills or households get a second life, thereby reducing the demand for new raw materials. Smartphone reuse is gradually gaining wider adoption, partly due to the price increase of flagship models, consequently boosting circular economy implementation. However, along with the reuse of a communication device, the circular economy approach needs to ensure that the data of the previous user have not been 'reused' together with the device. This is especially important since modern smartphones are comparable with computers in terms of performance and the amount of data stored. These data vary from pictures, videos, and call logs to social security numbers, passport and credit card details, from personal information to corporate confidential data. To assess how well data privacy requirements are followed on the second-hand smartphone market, a sample of 100 Android smartphones was purchased from IT Asset Disposition (ITAD) facilities responsible for data erasure and resale. Although the devices should not have stored any user data by the time they left the ITAD, it was possible to retrieve data from 19% of the sample. The applied techniques varied from manual device inspection to sophisticated equipment and tools. These findings indicate a significant barrier to the implementation of the circular economy and a limitation of smartphone reuse.
Therefore, in order to motivate users to donate or sell their old devices and make electronics use more sustainable, data privacy on the second-hand smartphone market should be significantly improved. The presented research has been carried out in the framework of the sustainablySMART project, which is part of the Horizon 2020 EU Framework Programme for Research and Innovation.

Keywords: android, circular economy, data privacy, second-hand phones

Procedia PDF Downloads 118
24272 The Quantum Theory of Music and Languages

Authors: Mballa Abanda Serge, Henda Gnakate Biba, Romaric Guemno Kuate, Akono Rufine Nicole, Petfiang Sidonie, Bella Sidonie

Abstract:

The main hypotheses proposed around the definition of the syllable and of music, and of the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and there lies the interest of this work: the debate raises questions that are at the heart of theories on language. It is an inventive, original and innovative research thesis: a contribution to the theoretical, musicological, ethnomusicological and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, automatic translation and artificial intelligence. When you apply this theory to any text of a folk song in a world tone language, you do not only piece together the exact melody, rhythm, and harmonies of that song as if you knew it in advance, but also the exact speaking of this language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal and random music.
To confirm the theory experimentally, the author designed a semi-digital, semi-analog application which translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, the author uses music reading and writing software to collect the data extracted from his mother tongue, which is already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: you type a text on your computer, a structured song (chorus-verse), and you ask the machine for a melody in blues, jazz, world music, variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.

Keywords: music, entanglement, language, science

Procedia PDF Downloads 64
24271 Development of Muay Thai Competition Management for Promoting Sport Tourism in the Next Decade (2015-2024)

Authors: Supasak Ngaoprasertwong

Abstract:

The purpose of this research was to develop a model of Muay Thai competition management for promoting sport tourism in the next decade, designed to be appropriate for practical use. The study combined several methodologies, both quantitative and qualitative, to cover all aspects of the data, especially the tourists' satisfaction with Muay Thai competition. Data were collected from 400 tourists watching Muay Thai competitions in 4 stadiums in order to create a model for Muay Thai competition to support sport tourism in the next decade. In addition, Ethnographic Delphi Futures Research (EDFR) was applied to gather data from experts in the boxing industry or those with a significant role in Muay Thai competition in both the public and private sectors. The first step of data collection was an in-depth interview with 27 experts associated with Muay Thai competition, Muay Thai management, and tourism. The second and third steps of data collection were conducted to confirm the experts' opinions on various elements. When the 3 steps of data collection were completely accomplished, all data were assembled to draft the model. The model was then proposed to 8 experts in a brainstorming session to affirm it. According to the results of the quantitative research, the tourists were satisfied with the personnel of the competition at a high level (x=3.87), followed by facilities, services, and safety at a high level (x=3.67).
Furthermore, they were satisfied with the operation of the competition field at a high level (x=3.62). Regarding the qualitative methodology, including the literature review, theories, concepts, and analysis of the qualitative research on developing the model for Muay Thai competition to promote sport tourism in the next decade, the findings indicated two data sets: the first related to Muay Thai competition to encourage sport tourism, and the second associated with Muay Thai stadium management to support sport tourism. After the brainstorming, the 'EE Muay Thai Model' was finally developed for promoting sport tourism in the next decade (2015-2024).

Keywords: Muay Thai competition management, Muay Thai sport tourism, Muay Thai, Muay Thai for sport tourism management

Procedia PDF Downloads 306
24270 Multicultural Education in the National Context: A Study of Peoples' Friendship University of Russia

Authors: Maria V. Mishatkina

Abstract:

The modelling of a dialogical environment is an essential feature of modern education. The dialogue of cultures is a foundation and an important prerequisite for the formation of a human being's main moral qualities, above all the ability to understand another person, which is manifested in such values as tolerance, respect, mutual assistance, and mercy. The formation of a modern expert occurs in an educational environment that is significantly different from what we had several years ago. Nowadays, university education has qualitatively new characteristics. They may be observed at Peoples' Friendship University of Russia (RUDN University), a top Russian higher education institution which unites representatives of more than 150 countries. The content of its educational strategies is not an adapted cultural experience but material situated between science and innovation. Besides, RUDN University's profiles and specializations do not map directly onto professional structures: people study not a profession in the strict sense but the basic scientific foundation of an activity in different socio-cultural areas (science, business, and education). RUDN University also provides a considerable set of professional education components: foreign language skills; economic, political, ethnic, communication, and computer culture; information theory; and basic management skills. Moreover, there is a rich social life (festive multicultural events, theme parties, journeys) and there are prospects concerning an inclusive approach to education (for example, a special course 'Social Pedagogy: Issues of Tolerance'). In our research, we use such methods as the analysis of modern and contemporary scientific literature, opinion polls (involving students, teachers, and research workers), and comparative data analysis.
We came to the conclusion that knowledge transfer for RUDN students happens through setting goals, problems, issues, tasks, and situations which simulate a future innovative, ambiguous environment and thereby prepare them for a dialogical way of life. However, all these factors may not take effect if there is no 'personal inspiration' of students by communicative and dialogic values, and no participation in the system of meanings and tools of learning activity represented by cooperation within the framework of the dialogue of scientific and pedagogical schools. We also found that the dominating strategies for ensuring the quality of education are those that put students in the position of subjects of their own education. Today these strategies should involve such approaches and methods as the task-based, contextual, modelling, specialized, game-imitating, and dialogical approaches, the method of practical situations, etc. Therefore, a university in the modern sense is not only an educational institution but also a generator of innovation, cooperation among nations, and cultural progress. RUDN University has been performing exactly this mission for many decades.

Keywords: dialogical developing situation, dialogue of cultures, readiness for dialogue, university graduate

Procedia PDF Downloads 204
24269 Interpretation and Clustering Framework for Analyzing ECG Survey Data

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been the victim of heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and special attention needs to be paid to this issue. A framework is proposed for performing a detailed analysis of ECG survey data conducted for measuring the prevalence of heart disease in Pakistan. The ECG survey data are evaluated and filtered using automated Minnesota codes, and only those ECGs fulfilling the standardized conditions mentioned in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on the discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.
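The discernibility-matrix idea behind the feature selection step can be sketched briefly: entry (i, j) lists the attributes on which records i and j take different values, and attributes that discern many pairs are candidates to keep. The toy records below are invented, not the survey data.

```python
# Rough sketch of a discernibility matrix for feature selection.
def discernibility_matrix(objects):
    """For each object pair, record the set of attribute indices that differ."""
    n = len(objects)
    matrix = {}
    for i in range(n):
        for j in range(i + 1, n):
            differing = {a for a, (u, v) in enumerate(zip(objects[i], objects[j])) if u != v}
            matrix[(i, j)] = differing
    return matrix

objs = [[1, 0, 1], [1, 1, 1], [0, 1, 0]]
m = discernibility_matrix(objs)
# m[(0, 1)] == {1}: objects 0 and 1 differ only on attribute 1.
```

Attributes appearing in many (especially singleton) entries are indispensable for telling records apart, which is the intuition a rough-set reduct exploits.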

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 454
24268 A Study on the Use Intention of Smart Phone

Authors: Zhi-Zhong Chen, Jun-Hao Lu, Jr., Shih-Ying Chueh

Abstract:

Based on the Unified Theory of Acceptance and Use of Technology (UTAUT), this study investigates people's intention to use smartphones. The study additionally incorporates two new variables: 'self-efficacy' and 'attitude toward using'. Samples were collected by questionnaire survey, of which 240 were valid. After correlation analysis, reliability tests, ANOVA, t-tests, and multiple regression analysis, the study finds that social influence and self-efficacy have a positive effect on use intention, and use intention in turn has a positive effect on use behavior.

Keywords: UTAUT, self-efficacy, attitude toward using, use intention, use behavior

Procedia PDF Downloads 393
24267 LiDAR Based Real Time Multiple Vehicle Detection and Tracking

Authors: Zhongzhen Luo, Saeid Habibi, Martin v. Mohrenschildt

Abstract:

Self-driving vehicles require a high level of situational awareness in order to maneuver safely in real-world conditions. This paper presents a LiDAR-based real-time perception system that is able to process raw sensor data for multiple-target detection and tracking in a dynamic environment. The proposed algorithm is nonparametric and deterministic; that is, no assumptions or a priori knowledge are needed about the input data, and no initializations are required. Additionally, the proposed method works directly on the three-dimensional data generated by LiDAR, without sacrificing the rich information contained in the 3D domain. Moreover, a fast and efficient real-time clustering algorithm is applied, based on a radially bounded nearest neighbor (RBNN) search. The Hungarian algorithm and adaptive Kalman filtering are used for data association and tracking. The proposed algorithm is able to run in real time with an average run time of 70 ms per frame.
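The RBNN clustering step can be sketched as a flood fill over the "within radius r" graph: any two points closer than r end up in the same cluster. This naive O(n²) version is for illustration only; a real-time variant would use a spatial index, and the invented 2D points stand in for 3D LiDAR returns.

```python
# Sketch of radially bounded nearest neighbor (RBNN) clustering.
def rbnn_cluster(points, r):
    """Assign a cluster label to each point; neighbors within r are merged."""
    labels = [-1] * len(points)
    next_label = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = next_label
        stack = [seed]
        while stack:  # flood-fill over the radius graph
            i = stack.pop()
            for j, q in enumerate(points):
                if labels[j] == -1:
                    d2 = sum((a - b) ** 2 for a, b in zip(points[i], q))
                    if d2 <= r * r:
                        labels[j] = next_label
                        stack.append(j)
        next_label += 1
    return labels

pts = [(0.0, 0.0), (0.4, 0.0), (5.0, 5.0), (5.3, 5.1)]
labels = rbnn_cluster(pts, r=1.0)
# Two well-separated groups → two distinct cluster labels.
```

No cluster count or initialization is needed, which matches the nonparametric, deterministic character the abstract claims for the pipeline.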

Keywords: lidar, segmentation, clustering, tracking

Procedia PDF Downloads 399
24266 The Nature and the Structure of Scientific and Innovative Collaboration Networks

Authors: Afshin Moazami, Andrea Schiffauerova

Abstract:

The objective of this work is to investigate the development and the role of collaboration networks in the creation of knowledge and innovations in the US and Canada, with a special focus on Quebec. In order to create the scientific networks, data on journal articles were extracted from SCOPUS, and the networks were built based on the co-authorship of the journal papers. For the innovation networks, the USPTO database was used, and the networks were built on patent co-inventorship. Various indicators characterizing the evolution of the network structure and the positions of the researchers and inventors in the networks were calculated, and a comparison between the United States, Canada, and Quebec was then carried out. The preliminary results show that the nature of scientific collaboration networks differs from that of innovation networks. Scientists work in bigger teams and are mostly interconnected within one giant network component, whereas the innovation network is much more clustered and fragmented; the inventors work more repetitively with the same partners, often in smaller isolated groups. In both Canada and the US, an increasing tendency towards collaboration was observed, and it was found that the networks are getting bigger and more centralized with time. Moreover, a declining share of knowledge transfers per scientist was detected, suggesting an increasing specialization of science. The US collaboration networks tend to be more centralized than the Canadian ones. Quebec shares many features with the Canadian network, but some differences were observed; for example, Quebec inventors rely more on knowledge transmission through intermediaries.
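An illustrative sketch of how a co-authorship network can be built from paper author lists and its giant component measured, in the spirit of the analysis described (the author names are made up):

```python
# Build a co-authorship graph and measure its largest connected component.
from itertools import combinations

def coauthor_graph(papers):
    """Link every pair of authors appearing on the same paper."""
    adj = {}
    for authors in papers:
        for a, b in combinations(authors, 2):
            adj.setdefault(a, set()).add(b)
            adj.setdefault(b, set()).add(a)
        for a in authors:
            adj.setdefault(a, set())  # keep solo authors as isolated nodes
    return adj

def giant_component_size(adj):
    """Size of the largest connected component, via iterative DFS."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(adj[node] - comp)
        seen |= comp
        best = max(best, len(comp))
    return best

papers = [["A", "B", "C"], ["C", "D"], ["E", "F"]]
adj = coauthor_graph(papers)
# A-B-C-D form one component of size 4; E-F form another of size 2.
```

Comparing the giant component's share of all nodes between the co-authorship and co-inventorship graphs is one way to quantify the fragmentation contrast the abstract reports.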

Keywords: Canada, collaboration, innovation network, scientific network, Quebec, United States

Procedia PDF Downloads 185
24265 On-line Control of the Natural and Anthropogenic Safety in Krasnoyarsk Region

Authors: T. Penkova, A. Korobko, V. Nicheporchuk, L. Nozhenkova, A. Metus

Abstract:

This paper presents an approach to the on-line control of the state of technosphere and environmental objects based on the integration of data warehouse, OLAP, and expert system technologies. It describes the structure and content of the data warehouse that provides consolidation and storage of monitoring data, and the OLAP models that provide multidimensional analysis of monitoring data and dynamic analysis of the principal parameters of controlled objects. The authors suggest criteria for emergency risk assessment using expert knowledge about danger levels. It is demonstrated how some of the proposed solutions could be adopted in territorial decision-making support systems. Operational control allows authorities to detect threats, prevent natural and anthropogenic emergencies, and ensure the comprehensive safety of the territory.

Keywords: decision making support systems, emergency risk assessment, natural and anthropogenic safety, on-line control, territory

Procedia PDF Downloads 392
24264 Geomagnetic Jerks Observed in Geomagnetic Observatory Data Over Southern Africa Between 2017 and 2023

Authors: Sanele Lionel Khanyile, Emmanuel Nahayo

Abstract:

Geomagnetic jerks are abrupt changes observed in the second time derivative of the main magnetic field that occur on annual to decadal timescales. Understanding these jerks is crucial, as they provide valuable insights into the complex dynamics of the Earth's outer liquid core. In this study, we investigate the occurrence of geomagnetic jerks in data collected at the southern African magnetic observatories Hermanus (HER), Tsumeb (TSU), Hartebeesthoek (HBK), and Keetmanshoop (KMH) between 2017 and 2023. The observatory data were processed and analyzed by retaining quiet night-time data recorded during quiet geomagnetic activity, identified with the help of the Kp, Dst, and ring current (RC) indices. The results confirm the occurrence of the 2019-2020 geomagnetic jerk in the region and identify the recent 2021 jerk, detected as V-shaped secular variation changes in the X and Z components at all four observatories. The highest estimated 2021 jerk secular acceleration amplitudes in the X and Z components were found at HBK: 12.7 nT/year² and 19.1 nT/year², respectively. Notably, the global CHAOS-7 model also identifies this 2021 jerk in the Z component at all magnetic observatories in the region.
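The V-shaped signature can be illustrated numerically: a jerk appears as a jump in the secular acceleration, i.e., the time derivative of the secular variation. The series below is synthetic and purely illustrative, not observatory data.

```python
import numpy as np

# Synthetic secular variation (nT/yr) with a V shape: the slope changes
# sign at the jerk epoch t = 3 yr (illustrative values only).
t = np.arange(0, 6, 0.5)                           # time in years
sv = np.where(t < 3, -10 * t, -30 + 10 * (t - 3))  # secular variation

# Secular acceleration = time derivative of the secular variation.
sa = np.gradient(sv, t)                            # nT/yr^2

# The jerk amplitude is the jump in secular acceleration across the epoch.
amplitude = sa[-1] - sa[0]
print(round(float(amplitude), 1))  # -> 20.0
```

On real observatory series, the slopes on either side of the V are usually estimated by piecewise linear fits to the secular variation rather than raw differences.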

Keywords: geomagnetic jerks, secular variation, magnetic observatory data, South Atlantic Anomaly

Procedia PDF Downloads 48
24263 Iron Deficiency and Iron Deficiency Anaemia/Anaemia as a Diagnostic Indicator for Coeliac Disease: A Systematic Review With Meta-Analysis

Authors: Sahar Shams

Abstract:

Coeliac disease (CD) is a widely reported disease, particularly in countries with predominantly Caucasian populations. It presents with many signs and symptoms, including iron deficiency (ID) and iron deficiency anaemia/anaemia (IDA/A). The exact association between ID, IDA/A, and CD, and how accurate these signs are in diagnosing CD, is not fully known. This systematic review was conducted to investigate the accuracy of both ID and IDA/A as diagnostic indicators for CD and whether they warrant point-of-care testing. A systematic review was performed of studies published in MEDLINE, Embase, the Cochrane Library, and Web of Science. The QUADAS-2 tool was used to assess the risk of bias in each study. An ROC curve and forest plots were generated as part of the meta-analysis after data extraction. 16 studies were identified in total, of which 13 were IDA/A studies and 3 were ID studies. The prevalence of CD regardless of diagnostic indicator was assumed to be 1%. The QUADAS-2 tool indicated that most of the studies had a high risk of bias. The positive predictive value (PPV) for CD was higher in those with ID than in those with IDA/A. Meta-analysis showed that the overall odds of having CD are five times higher in individuals with ID and IDA/A. The ROC curve showed that there is an association between both diagnostic indicators and CD, although the association is not particularly strong due to the great heterogeneity between studies. While an association between IDA/A and ID and coeliac disease was evident, the results were not deemed significant enough to prompt coeliac disease testing in those with IDA/A and ID.

Keywords: anemia, iron deficiency anemia, coeliac disease, point of care testing

Procedia PDF Downloads 117
24262 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using an SEM to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or each segment. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy-dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with support for remote control of the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process, and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models.
The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 97
24261 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller communicating with a personal computer (PC) through USB (Universal Serial Bus). The research applied knowledge of data acquisition and embedded systems to develop a weather data acquisition device, using an LM35 sensor to measure weather parameters, and used artificial intelligence (an Artificial Neural Network, ANN) and a statistical approach (Autoregressive Integrated Moving Average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were operated for 180 days under the same atmospheric conditions to collect data (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the coefficient of determination (R²), and Mean Percentage Error (MPE) were deployed as standard metrics to evaluate the performance of the models in predicting precipitation. The results show that the developed device has an efficiency of 96% and is also compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN's precipitation (rainfall) prediction over two months (May and June 2017) had a disparity error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
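The four evaluation metrics named above can be sketched as follows; the rainfall values are hypothetical, not the study's data.

```python
import numpy as np

def metrics(obs, pred):
    """RMSE, MAE, R^2 and mean percentage error for model evaluation."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = obs - pred
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    mpe = np.mean(err / obs) * 100  # signed percentage error
    return rmse, mae, r2, mpe

# Hypothetical daily rainfall (mm): observed vs. model prediction.
obs = [2.0, 4.0, 6.0, 8.0]
pred = [2.5, 3.5, 6.5, 7.5]
rmse, mae, r2, mpe = metrics(obs, pred)
print(round(rmse, 3), round(mae, 3), round(r2, 3), round(mpe, 2))
# -> 0.5 0.5 0.95 -3.65
```

Note that MPE, unlike MAE, is signed, so over- and under-predictions partially cancel; it indicates bias rather than overall accuracy.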

Keywords: data acquisition system, device design, weather data, precipitation prediction, FUTA standard device

Procedia PDF Downloads 78
24260 AI-Based Technologies in International Arbitration: An Exploratory Study on the Practicability of Applying AI Tools in International Arbitration

Authors: Annabelle Onyefulu-Kingston

Abstract:

One of the major purposes of AI today is to evaluate and analyze millions of micro and macro data in order to determine what is relevant in a particular case and proffer it in an adequate manner. Microdata, as far as it relates to AI in international arbitration, is the millions of key issues specifically mentioned by either one or both parties or by their counsels, arbitrators, or arbitral tribunals in arbitral proceedings. This can be qualifications of expert witness and admissibility of evidence, amongst others. Macro data, on the other hand, refers to data derived from the resolution of the dispute and, consequently, the final and binding award. A notable example of this includes the rationale of the award and specific and general damages awarded, amongst others. This paper aims to critically evaluate and analyze the possibility of technological inclusion in international arbitration. This research will be imploring the qualitative method by evaluating existing literature on the consequence of applying AI to both micro and macro data in international arbitration, and how this can be of assistance to parties, counsels, and arbitrators.

Keywords: AI-based technologies, algorithms, arbitrators, international arbitration

Procedia PDF Downloads 67
24259 Detection of Hepatitis B by the Use of Artificial Intelligence

Authors: Shizra Waris, Bilal Shoaib, Munib Ahmad

Abstract:

Background: The use of clinical decision support systems (CDSSs) may improve chronic disease management, which requires regular visits to multiple health professionals, treatment monitoring, disease control, and patient behavior modification. The objective of this survey is to determine whether these CDSSs improve the processes of chronic care, including diagnosis, treatment, and monitoring of diseases. Though artificial intelligence is not a new idea, it has been widely documented as a new technology in computer science, and numerous areas such as education, business, and medicine have made use of it. Methods: The survey covers articles extracted from relevant databases, using search terms related to information technology and viral hepatitis, published between 2000 and 2016. Results: Overall, 80% of the studies asserted the benefit provided by information technology (IT); 75% of the studies asserted benefits in the medical domain; 25% of the studies did not clearly define the added benefits of IT. The current state of CDSSs requires many improvements to support the management of liver diseases such as HCV infection, liver fibrosis, and cirrhosis. Conclusion: We concluded that the proposed model gives an earlier and more accurate prediction of hepatitis B and works as a promising tool for predicting hepatitis B from clinical laboratory data.

Keywords: detection, hepatitis, observation, disease

Procedia PDF Downloads 139
24258 A Virtual Grid Based Energy Efficient Data Gathering Scheme for Heterogeneous Sensor Networks

Authors: Siddhartha Chauhan, Nitin Kumar Kotania

Abstract:

Traditional Wireless Sensor Networks (WSNs) generally use static sinks to collect data from the sensor nodes via multiple forwarding. The network therefore suffers from problems such as long message relay times and bottlenecks, which reduce its performance. Many approaches have been proposed to prevent these problems by using a mobile sink to collect the data from the sensor nodes, but these approaches still suffer from buffer overflow due to the limited memory of sensor nodes. This paper proposes an energy-efficient scheme for data gathering which overcomes the buffer overflow problem. The proposed scheme creates a virtual grid structure of heterogeneous nodes and is designed for sensor nodes with variable sensing rates. Every node finds its buffer overflow time, and on this basis cluster heads are elected. A controlled traversing approach is used by the proposed scheme in order to transmit data to the sink. The effectiveness of the proposed scheme is verified by simulation.
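A minimal sketch of the buffer-overflow-time idea, assuming overflow time is the remaining buffer capacity divided by the node's sensing rate, and that the node able to wait longest is elected cluster head. Both the node values and the election rule are illustrative assumptions; the paper's exact rule is not given here.

```python
def overflow_time(buffer_size, buffered, sensing_rate):
    """Time (s) until a node's buffer overflows at its current sensing rate."""
    return (buffer_size - buffered) / sensing_rate

# Hypothetical heterogeneous nodes in one virtual grid cell:
# (node id, buffer size in packets, packets already buffered, packets/s).
nodes = [("n1", 100, 40, 2.0), ("n2", 100, 10, 1.0), ("n3", 50, 5, 0.4)]

times = {nid: overflow_time(size, used, rate) for nid, size, used, rate in nodes}

# One plausible election rule: the node that can wait longest for the
# mobile sink becomes cluster head, minimizing the risk of data loss.
head = max(times, key=times.get)
print(head, times[head])  # -> n3 112.5
```

The mobile sink's traversal can then be scheduled so that each grid cell is visited before its head's overflow deadline.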

Keywords: buffer overflow problem, mobile sink, virtual grid, wireless sensor networks

Procedia PDF Downloads 369
24257 Analysis of ECGs Survey Data by Applying Clustering Algorithm

Authors: Irum Matloob, Shoab Ahmad Khan, Fahim Arif

Abstract:

The Indo-Pak region has been afflicted by heart disease for many decades. Many surveys have shown that the percentage of cardiac patients in Pakistan is increasing day by day, and this issue needs special attention. A framework is proposed for performing detailed analysis of ECG survey data collected to measure the prevalence of heart disease in Pakistan. The ECG survey data are evaluated and filtered using automated Minnesota codes, and only those ECGs fulfilling the standardized conditions specified in the Minnesota codes are used for further analysis. Feature selection is then performed by applying a proposed algorithm based on a discernibility matrix to select relevant features from the database. Clustering is performed to expose natural clusters in the ECG survey data by applying a spectral clustering algorithm together with fuzzy c-means. The hidden patterns and interesting relationships exposed by this analysis are useful for further detailed analysis and for many other purposes.

Keywords: arrhythmias, centroids, ECG, clustering, discernibility matrix

Procedia PDF Downloads 339
24256 The Impact of Motivation on Employee Performance in South Korea

Authors: Atabong Awung Lekeazem

Abstract:

The purpose of this paper is to identify the impact and role of incentives on employees' performance, with a particular emphasis on Korean workers. The process involves defining and explaining the different types of motivation and bringing out the difference between the two major types. The second phase of the paper involves gathering data from a sample population and then analyzing the data. The analysis shows the broadly similar value which Koreans attach to motivation, with a slightly different view coming only from top management personnel. The last phase presents the data and draws conclusions on how managers and potential managers can bring out the best in their employees.

Keywords: motivation, employee’s performance, Korean workers, business information systems

Procedia PDF Downloads 394
24255 Design of a Portable Shielding System for a Newly Installed NaI(Tl) Detector

Authors: Mayesha Tahsin, A.S. Mollah

Abstract:

Recently, a 1.5 x 1.5 inch NaI(Tl) detector-based gamma-ray spectroscopy system has been installed in the laboratory of the Nuclear Science and Engineering Department of the Military Institute of Science and Technology for radioactivity detection purposes. The newly installed NaI(Tl) detector has a circular lead shield 22 mm thick. An important consideration in any gamma-ray spectroscopy system is the minimization of natural background radiation not originating from the radioactive sample being measured. Natural background gamma-ray radiation comes from naturally occurring or man-made radionuclides in the environment or from cosmic sources. Moreover, the main problem with this system is that it is not suitable for radioactivity measurements with a large sample container such as a Petri dish or a Marinelli beaker. When a laboratory installs a new detector and/or a new shield, it must first carry out quality and performance tests for the detector and shield. This paper describes a new portable lead shielding system that can reduce the background radiation. The intensity of gamma radiation after passing through the shielding will be calculated using the shielding equation I = I₀e^(−µx), where I₀ is the initial intensity of the gamma source, I is the intensity after passing through the shield, µ is the linear attenuation coefficient of the shielding material, and x is the thickness of the shielding material. The height and width of the shielding will be selected to accommodate the large sample container. The detector will be surrounded by a 4π-geometry low-activity lead shield. An additional 1.5 mm thick shield of tin and a 1 mm thick shield of copper covering the inner part of the lead shielding will be added in order to remove the characteristic X-rays from the lead shield.
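The shielding equation can be evaluated directly. The attenuation coefficient below is an assumed value, roughly that of lead at 662 keV (Cs-137); it is not a figure from the paper.

```python
import math

def transmitted_intensity(i0, mu, x):
    """I = I0 * exp(-mu * x): beam intensity after a shield of thickness x."""
    return i0 * math.exp(-mu * x)

i0 = 1000.0   # counts/s incident on the shield
mu = 1.24     # cm^-1, assumed value (roughly lead at 662 keV, Cs-137)
x = 2.2       # cm, the 22 mm lead shield thickness from the abstract

i = transmitted_intensity(i0, mu, x)
attenuation = 100 * (1 - i / i0)
print(round(i, 1), round(attenuation, 1))  # over 93% attenuated for these values
```

Because the relation is exponential, required thickness for a target attenuation follows from x = ln(I₀/I)/µ, which is how the shield dimensions would be sized in practice.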

Keywords: shield, NaI (Tl) detector, gamma radiation, intensity, linear attenuation coefficient

Procedia PDF Downloads 139
24254 Improved Classification Procedure for Imbalanced and Overlapped Situations

Authors: Hankyu Lee, Seoung Bum Kim

Abstract:

The issue of imbalance and overlap in the class distribution is important in various applications of data mining. An imbalanced dataset is a special case in classification problems in which the number of observations of one class (the major class) heavily exceeds the number of observations of the other class (the minor class). An overlapped dataset is one where many observations are shared between the two classes. Imbalanced and overlapped data can frequently be found in many real examples, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, remote sensing, and so on. The class imbalance and overlap problem is a challenging issue because this situation degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts: non-overlapping, lightly overlapping, and severely overlapping, and applying the classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and compare it with other classification algorithms. The results showed that the proposed method outperformed the competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
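The Hausdorff distance used to gauge class overlap can be sketched with plain NumPy. The two point sets and the regime thresholds below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def hausdorff(a, b):
    """Symmetric Hausdorff distance between point sets a (n,d) and b (m,d)."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise distances
    # max over each set of the distance to its nearest point in the other set
    return max(d.min(axis=1).max(), d.min(axis=0).max())

# Two hypothetical 2-D classes in feature space.
major = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
minor = np.array([[5.0, 5.0], [6.0, 5.0]])

dist = hausdorff(major, minor)

# Illustrative thresholds (assumed, not the paper's) for the three regimes.
if dist < 1.0:
    regime = "severe overlap"
elif dist < 5.0:
    regime = "light overlap"
else:
    regime = "non-overlapping"
print(round(float(dist), 3), regime)  # -> 7.071 non-overlapping
```

A small Hausdorff distance means each class lies close to the other everywhere, i.e., the classes intermingle; a large one means the class regions are well separated.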

Keywords: classification, imbalanced data with class overlap, split data space, support vector machine

Procedia PDF Downloads 297
24253 Mapping of Geological Structures Using Aerial Photography

Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash

Abstract:

Rapid growth in data acquisition technologies through drones has led to advances and interest in collecting high-resolution images of geological fields. While drones are advantageous in capturing high volumes of data in short flights, a number of challenges have to be overcome for efficient analysis of these data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images are not used because of their inadequate resolution, limited timeliness (an image may be a year old), limited availability, the difficulty of capturing the exact target area, and the lack of night vision. The method combines advanced automated image interpretation technology with human data interaction to model structures. First, geological structures are detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified using a digital elevation model (DEM). From this information, dip and dip direction can be calculated. The structural map is generated by a specified methodology: choosing the appropriate camera, the camera's mounting system, and the UAV design (based on the area and application); addressing challenges in airborne systems such as errors in image orientation and payload constraints; mosaicking, georeferencing, and registering the different images; and finally applying the DEM. The paper shows the potential of our method for accurate and efficient modeling of geological structures, captured particularly from remote, inaccessible, and hazardous sites.
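Dip and dip direction can be recovered from DEM elevations by fitting a plane z = ax + by + c through sampled points; the dip angle is the inclination of that plane and the dip direction is the azimuth of steepest descent. The sample points below are hypothetical, and this is a minimal sketch rather than the paper's workflow.

```python
import numpy as np

# Hypothetical DEM samples on a planar bed: (easting, northing, elevation) in m.
pts = np.array([
    [0.0, 0.0, 100.0],
    [10.0, 0.0, 100.0],
    [0.0, 10.0, 95.0],
    [10.0, 10.0, 95.0],
])

# Least-squares fit of the plane z = a*x + b*y + c through the points.
A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
(a, b, c), *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
a, b = np.round([a, b], 12)  # round off numerical noise from the fit

dip = np.degrees(np.arctan(np.hypot(a, b)))     # dip angle of the plane
dip_dir = np.degrees(np.arctan2(-a, -b)) % 360  # azimuth of steepest descent,
                                                # clockwise from north
print(round(float(dip), 1), round(float(dip_dir), 1))  # -> 26.6 0.0
```

Here the bed drops 5 m over 10 m northward, so the plane dips about 26.6 degrees toward azimuth 0 (due north).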

Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures

Procedia PDF Downloads 673
24252 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System

Authors: Dong Seop Lee, Byung Sik Kim

Abstract:

In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields. These artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, and it draws on historical disaster information using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle from occurrence through progress, response, and planning. However, information about status control, response, recovery from natural and social disaster events, etc., is mainly managed in structured and unstructured report form, existing as handouts or hard copies. Such unstructured data is often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data for disaster information. In this paper, the Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, and reports, whether printed or generated by scanners, into electronic documents. The converted disaster data is then organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information based on OCR for unstructured data is an important element of smart disaster management. In this paper, a character recognition rate of over 90% was achieved for Korean characters using an upgraded OCR engine; the recognition rate depends on the fonts, size, and special symbols of the characters, and was improved through a machine learning algorithm.
The converted structured data are managed in a standardized disaster information form connected with the disaster code system, which ensures that the structured information can be stored and retrieved across the entire disaster cycle, including historical disaster progress, damages, response, and recovery. The expected effect of this research is that it can be applied to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.

Keywords: disaster information management, unstructured data, optical character recognition, machine learning

Procedia PDF Downloads 113
24251 A Study of the Effect of the Flipped Classroom on Mixed Abilities Classes in Compulsory Secondary Education in Italy

Authors: Giacoma Pace

Abstract:

The research seeks to evaluate whether students with impairments can achieve enhanced academic progress by actively engaging in collaborative problem-solving activities with teachers and peers, in order to overcome the obstacles rooted in socio-economic disparities. Furthermore, the research underscores the significance of fostering students' self-awareness regarding their learning process and encourages teachers to adopt a more interactive teaching approach. The research also posits that reducing conventional face-to-face lessons can motivate students to explore alternative learning methods, such as collaborative teamwork and peer education within the classroom. To address socio-cultural barriers, it is imperative to assess students' internet access and possession of technological devices, as these factors can contribute to a digital divide. The research features a case study of a flipped classroom learning unit administered to six third-year high school classes at a Scientific Lyceum, a Technical School, and a Vocational School in the city of Turin, Italy. Data cover the teachers and students involved in the case study: the impaired students in each class, entry level, students' performance and attitude before using the flipped classroom, level of motivation, family involvement, teachers' attitude towards the flipped classroom, goals attained, the pros and cons of such activities, and technology availability. The selected schools were contacted, and meetings were held with the English teachers to gather information about their attitude towards and knowledge of the flipped classroom approach. Questionnaires were administered to teachers and IT staff. The information gathered was used to outline the profile of the subjects involved in the study and was then compared in the second step of the study, conducted with the classes of the selected schools. The learning unit is the same for all classes; its structure and content were decided together with the English teachers of the classes involved.
The pacing and content are matched in every lesson, and all the classes participate in the same labs, use the same materials and homework, and receive the same summative and formative assessment. Each step follows a precise scheme in order to be as reliable as possible, and the outcome of the case study will be statistically organised. The case study is accompanied by a study of the literature concerning EFL approaches and the flipped classroom. The document analysis method was employed, i.e., a qualitative research method in which printed and/or electronic documents containing information about the research subject are reviewed and evaluated with a systematic procedure. Articles in the Web of Science Core Collection, Education Resources Information Center (ERIC), Scopus, and Science Direct databases were searched in order to determine the documents to be examined (years considered: 2000-2022).

Keywords: flipped classroom, impaired, inclusivity, peer instruction

Procedia PDF Downloads 38
24250 The Impact of Academic Support Practices on Two-Year College Students’ Achievement in Science, Technology, Engineering, and Math Education: An Exploration of Factors

Authors: Gisele Ragusa, Lilian Leung

Abstract:

There is an essential national need for the science, technology, engineering, and math (STEM) workforce. This need underscores the necessity of increasing the number of students attending both two-year community colleges and universities, thereby enabling and supporting a larger pool of students to enter the workforce. The greatest number of students in STEM programs attend public higher education institutions, with an even larger majority beginning their academic experiences enrolled in two-year public colleges. Accordingly, this research explores the impact of experiences and academic support practices on two-year (community) college students' academic achievement in STEM majors, with a focus on supporting students who are the first in their families to attend college. This research is the result of three years of iterative trials of differing supports to improve such students' academic success, using a cross-student comparative research structure involving peer-to-peer and faculty academic supports. The results of this research indicate that background experiences and a combination of peer-to-peer and faculty-led academic support practices, including supplementary instruction, peer mentoring, and study skills support, significantly improve students' academic success in STEM majors. These results confirm the needs that first-generation students have in navigating their college careers and what can be effective in supporting them.

Keywords: higher education policy, student support, two-year colleges, STEM achievement

Procedia PDF Downloads 74