Search results for: Data Mining
22935 Enhancing Information Technologies with AI: Unlocking Efficiency, Scalability, and Innovation
Authors: Abdal-Hafeez Alhussein
Abstract:
Artificial Intelligence (AI) has become a transformative force in the field of information technologies, reshaping how data is processed, analyzed, and utilized across various domains. This paper explores the multifaceted applications of AI within information technology, focusing on three key areas: automation, scalability, and data-driven decision-making. We delve into how AI-powered automation is optimizing operational efficiency in IT infrastructures, from automated network management to self-healing systems that reduce downtime and enhance performance. Scalability, another critical aspect, is addressed through AI’s role in cloud computing and distributed systems, enabling the seamless handling of increasing data loads and user demands. Additionally, the paper highlights the use of AI in cybersecurity, where real-time threat detection and adaptive response mechanisms significantly improve resilience against sophisticated cyberattacks. In the realm of data analytics, AI models—especially machine learning and natural language processing—are driving innovation by enabling more precise predictions, automated insights extraction, and enhanced user experiences. The paper concludes with a discussion on the ethical implications of AI in information technologies, underscoring the importance of transparency, fairness, and responsible AI use. It also offers insights into future trends, emphasizing the potential of AI to further revolutionize the IT landscape by integrating with emerging technologies like quantum computing and IoT.
Keywords: artificial intelligence, information technology, automation, scalability
Procedia PDF Downloads 17
22934 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network
Authors: Sandesh Achar
Abstract:
Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes. These complexities impede the swift adoption of personalized advertisement on a large scale. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers–Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert’s innovative and straightforward approach has the potential to transform personalized advertising and foster widespread adoption of personalized advertisement in marketing.
Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP
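As an illustration of the classifier described (not the authors' implementation), a BiLSTM that maps token sequences from social-media posts to one of the 16 MBTI classes might be sketched in PyTorch as follows; the vocabulary size, embedding and hidden dimensions are assumptions:

```python
import torch
import torch.nn as nn

class MBTIBiLSTM(nn.Module):
    """Toy BiLSTM text classifier: token ids -> one of 16 MBTI types."""
    def __init__(self, vocab_size=20000, embed_dim=128, hidden_dim=64, n_classes=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.fc = nn.Linear(2 * hidden_dim, n_classes)  # forward + backward states

    def forward(self, token_ids):
        x = self.embed(token_ids)               # (batch, seq, embed)
        _, (h_n, _) = self.bilstm(x)            # h_n: (2, batch, hidden)
        h = torch.cat([h_n[0], h_n[1]], dim=1)  # concatenate both directions
        return self.fc(h)                       # (batch, 16) class logits

model = MBTIBiLSTM()
logits = model(torch.randint(1, 20000, (4, 32)))  # batch of 4 posts, 32 tokens each
print(logits.shape)  # torch.Size([4, 16])
```

In practice the logits would be trained with cross-entropy against the MBTI labels; tokenization and the training loop are omitted here.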
Procedia PDF Downloads 44
22933 Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform
Authors: Dogaru-Ulieru Valentin, Sălișteanu Ioan Corneliu, Ardeleanu Mihăiță Nicolae, Broscăreanu Ștefan, Sălișteanu Bogdan, Mihai Mihail
Abstract:
The SiaMOTO system is a communications and data processing platform for vehicle traffic. The human factor is the most important factor in the generation of these data, as the driver is the one who dictates the trajectory of the vehicle. Like any trajectory, its specific parameters refer to position, speed and acceleration. Constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions for its occurrence are known. The human factor is predominant in deciding the trajectory parameters of the vehicle on the road, so monitoring the driver through the events reported by the DiaMOTO device over time will generate a guide to target any potentially high-risk driving behaviour and reward those who control the driving phenomenon well. In this paper, we have focused on detailing the communication infrastructure between the DiaMOTO device and the traffic data collection server, the infrastructure through which the database that will be used for complex AI/DLM analysis is built. The central element of this description is the data string in CODEC-8 format sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented at the experimental model stage, by installing DiaMOTO devices with unique codes on 50 vehicles, integrating ADAS and GPS functions, through which vehicle trajectories can be monitored 24 hours a day.
Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring
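The CODEC-8 string mentioned above follows a Teltonika-style AVL framing. The sketch below shows how a collection server might unpack the GPS element of such a packet; the record layout is simplified and assumed for illustration, not the exact DiaMOTO format:

```python
import struct

# Simplified, assumed layout for illustration (not the full Codec-8 spec):
# header: 4-byte zero preamble | 4-byte payload length | codec id (0x08) | record count
# record: timestamp ms (8) | priority (1) | lon (4, 1e-7 deg) | lat (4, 1e-7 deg)
#         | altitude m (2) | angle deg (2) | satellites (1) | speed km/h (2)
HEADER = struct.Struct(">4sIBB")
RECORD = struct.Struct(">QBiiHHBH")

def parse_avl(packet: bytes):
    preamble, length, codec_id, count = HEADER.unpack_from(packet, 0)
    assert preamble == b"\x00\x00\x00\x00" and codec_id == 0x08
    records, offset = [], HEADER.size
    for _ in range(count):
        ts, prio, lon, lat, alt, angle, sats, speed = RECORD.unpack_from(packet, offset)
        records.append({"timestamp_ms": ts, "lon": lon / 1e7, "lat": lat / 1e7,
                        "speed_kmh": speed, "satellites": sats})
        offset += RECORD.size
    return records

# One synthetic record, encoded the way a tracking device might send it
rec = RECORD.pack(1650000000000, 1, 254436000, 448500000, 120, 90, 9, 54)
pkt = HEADER.pack(b"\x00\x00\x00\x00", len(rec) + 2, 0x08, 1) + rec
print(parse_avl(pkt)[0]["speed_kmh"])  # 54
```

A real implementation would also verify the trailing CRC and the repeated record count before acknowledging the packet to the device.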
Procedia PDF Downloads 78
22932 Predictive Modeling of Bridge Conditions Using Random Forest
Authors: Miral Selim, May Haggag, Ibrahim Abotaleb
Abstract:
The aging of transportation infrastructure presents significant challenges, particularly concerning the monitoring and maintenance of bridges. This study investigates the application of Random Forest algorithms for predictive modeling of bridge conditions, utilizing data from the US National Bridge Inventory (NBI). The research is significant as it aims to improve bridge management through data-driven insights that can enhance maintenance strategies and contribute to overall safety. Random Forest is chosen for its robustness, ability to handle complex, non-linear relationships among variables, and its effectiveness in feature importance evaluation. The study begins with comprehensive data collection and cleaning, followed by the identification of key variables influencing bridge condition ratings, including age, construction materials, environmental factors, and maintenance history. Random Forest is utilized to examine the relationships between these variables and the predicted bridge conditions. The dataset is divided into training and testing subsets to evaluate the model's performance. The findings demonstrate that the Random Forest model effectively enhances the understanding of factors affecting bridge conditions. By identifying bridges at greater risk of deterioration, the model facilitates proactive maintenance strategies, which can help avoid costly repairs and minimize service disruptions. Additionally, this research underscores the value of data-driven decision-making, enabling better resource allocation to prioritize maintenance efforts where they are most necessary. In summary, this study highlights the efficiency and applicability of Random Forest in predictive modeling for bridge management. Ultimately, these findings pave the way for more resilient and proactive management of bridge systems, ensuring their longevity and reliability for future use.
Keywords: data analysis, random forest, predictive modeling, bridge management
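A minimal sketch of the modeling step described, using scikit-learn's RandomForestClassifier on synthetic stand-ins for NBI-style predictors (the feature set and labels are invented, not the NBI schema):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-ins for NBI-style predictors
X = np.column_stack([rng.uniform(0, 100, n),   # bridge age (years)
                     rng.integers(1, 6, n),    # construction material code
                     rng.uniform(0, 1, n),     # environmental exposure index
                     rng.integers(0, 2, n)])   # maintenance history flag
# Toy labeling rule: older, more exposed bridges tend toward "poor" condition
y = ((X[:, 0] / 100 + X[:, 2]) > 1.0).astype(int)  # 1 = poor, 0 = fair/good

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
print("feature importances:", model.feature_importances_.round(2))
```

The feature-importance vector is what supports the kind of risk ranking the abstract describes: here the importance mass concentrates on age and exposure, the two variables the toy rule actually uses.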
Procedia PDF Downloads 22
22931 Validity and Reliability of Competency Assessment Implementation (CAI) Instrument Using Rasch Model
Authors: Nurfirdawati Muhamad Hanafi, Azmanirah Ab Rahman, Marina Ibrahim Mukhtar, Jamil Ahmad, Sarebah Warman
Abstract:
This study was conducted to generate empirical evidence on the validity and reliability of the items of the Competency Assessment Implementation (CAI) Instrument using the Rasch model for polytomous data, aided by Winsteps software version 3.68. The construct validity was examined by analyzing the point-measure correlation index (PTMEA) and the infit and outfit MNSQ values; meanwhile, the reliability was examined by analyzing the item reliability index. A survey technique was used as the major method, administering the CAI instrument to 156 teachers from vocational schools. The results show that the reliability of the CAI Instrument items was between 0.80 and 0.98. The PTMEA correlations are positive, meaning that the items are able to distinguish between respondents of differing ability. The statistical data obtained show that out of 154 items, 12 items from the instrument were suggested for omission. It is hoped that this study could bring a new direction to the process of data analysis in educational research.
Keywords: competency assessment, reliability, validity, item analysis
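For readers unfamiliar with the fit statistics mentioned, infit and outfit mean squares are computed from standardized residuals. The sketch below uses the dichotomous Rasch case for simplicity (the study itself analyzed polytomous data), with made-up ability and difficulty values:

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_fit(responses, thetas, b):
    """Outfit MNSQ (unweighted) and infit MNSQ (information-weighted) for one item."""
    zsq, info, wzsq = [], [], []
    for x, theta in zip(responses, thetas):
        p = rasch_p(theta, b)
        w = p * (1 - p)            # response variance (information)
        z2 = (x - p) ** 2 / w      # squared standardized residual
        zsq.append(z2); info.append(w); wzsq.append(z2 * w)
    outfit = sum(zsq) / len(zsq)   # mean squared standardized residual
    infit = sum(wzsq) / sum(info)  # information-weighted mean square
    return outfit, infit

thetas = [-1.0, -0.5, 0.0, 0.5, 1.0, 1.5]   # person abilities (logits), made up
responses = [0, 0, 1, 1, 1, 1]              # observed answers to one item
outfit, infit = item_fit(responses, thetas, b=0.2)
print(round(outfit, 2), round(infit, 2))    # values near 1.0 indicate good fit
```

Items whose MNSQ values fall far outside the commonly used acceptance range are the candidates for omission, which is the kind of screening that flagged 12 of the 154 items here.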
Procedia PDF Downloads 445
22930 Electronic Equipment Failure due to Corrosion
Authors: Yousaf Tariq
Abstract:
There are many factors involved in electronic equipment failure, e.g., temperature, humidity, dust and smoke. Corrosive gases are another factor that may be involved in equipment failure. The sensitivity of electronic equipment increased when “lead-free” regulations were enforced on manufacturers. In data centers, equipment such as hard disks, servers and printed circuit boards has been exposed to gaseous contamination due to this increase in sensitivity. There is a worldwide standard to protect industrial electronic equipment from corrosive gases, well known as ANSI/ISA S71.04-1985, Environmental Conditions for Control Systems: Airborne Contaminants. ASHRAE Technical Committee (TC) 9.9 members also recommended the ISA standard in their white paper on gaseous and particulate contamination guidelines for data centers. TC 9.9 members represent some of the major IT equipment manufacturers, e.g., IBM, HP and Cisco. As per standard practice, the first step is to monitor air quality in the data center. If the contamination level is higher than severity level G1, gas-phase air filtration is required in addition to dust/smoke air filtration. It is important that outside fresh air entering the data center go through a pressurization/recirculation process in order to absorb corrosive gases and maintain the level within the specified limit. It is also important that air quality monitoring be conducted once a year. Temperature and humidity should also be monitored as per standard practice to maintain levels within the specified limits.
Keywords: corrosive gases, corrosion, electronic equipment failure, ASHRAE, hard disk
Procedia PDF Downloads 330
22929 Development of Hybrid Materials Combining Biomass as Fique Fibers with Metal-Organic Frameworks, and Their Potential as Mercury Adsorbents
Authors: Karen G. Bastidas Gomez, Hugo R. Zea Ramirez, Manuel F. Ribeiro Pereira, Cesar A. Sierra Avila, Juan A. Clavijo Morales
Abstract:
The contamination of water sources with heavy metals such as mercury has been an environmental problem; it has generated a high impact on the environment and human health. In countries such as Colombia, mercury contamination due to mining has reached levels much higher than the world average. This work proposes the use of fique fibers as an adsorbent for mercury removal. The evaluation of the material was carried out under five different conditions (raw, pretreated by organosolv, functionalized by TEMPO oxidation, functionalized fiber plus MOF-199, and functionalized fiber plus MOF-199-SH). All the materials were characterized using FTIR, SEM, EDX, XRD, and TGA. The mercury removal experiments were performed at room pressure and temperature, and at pH = 7 for all material presentations, followed by atomic absorption spectroscopy. The high cellulose content in fique is the main particularity of this lignocellulosic biomass, since the degree of oxidation depends on the number of hydroxyl groups on the surface capable of oxidizing into carboxylic acids, a functional group capable of increasing ion exchange with mercury in solution. It was also expected that the impregnation of the MOF would increase the mercury removal; however, it was found that the functionalized fique achieved a greater percentage of removal, resulting in 81.33% removal, against 44% for the fique with MOF-199 and 72% for the fique with MOF-199-SH. The pretreated and raw fibers also showed 74% and 56% removal, respectively, which indicates that fique does not require considerable modifications to its structure to achieve good performance. Even so, the functionalized fiber increases the removal percentage considerably compared to the pretreated fique, which suggests that the functionalization process is a feasible procedure to apply with the purpose of improving the removal percentage. In addition, this is a procedure that follows a green approach, since the reagents involved have a low environmental impact, and the contribution to the remediation of natural resources is high.
Keywords: biomass, nanotechnology, science materials, wastewater treatment
Procedia PDF Downloads 118
22928 Evaluation of Diagnosis Performance Based on Pairwise Model Construction and Filtered Data
Authors: Hyun-Woo Cho
Abstract:
It is quite important to utilize timely and intelligent production monitoring and diagnosis of industrial processes with respect to quality and safety issues. Compared with the monitoring task, fault diagnosis is the task of finding the process variables responsible for causing a specific fault in the process. It can be helpful to process operators, who should investigate and eliminate root causes more effectively and efficiently. This work focused on the active use of a nonlinear statistical technique combined with a preprocessing method in order to implement practical real-time fault identification schemes for data-rich cases. To compare its performance to existing identification schemes, a case study on a benchmark process was performed under several scenarios. The results showed that the proposed fault identification scheme produced more reliable diagnosis results than linear methods. In addition, the use of the filtering step improved the identification results for complicated processes with massive data sets.
Keywords: diagnosis, filtering, nonlinear statistical techniques, process monitoring
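The filtering step referred to can be as simple as a moving-average prefilter applied to each process variable before the identification model is fitted; a minimal sketch, with the window size as an assumption:

```python
def moving_average_filter(signal, window=5):
    """Smooth a noisy process measurement with a centered moving average."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))   # average over the window
    return out

# Noisy step change in a process variable: filtering suppresses measurement
# noise while preserving the fault-induced step the identification model needs.
raw = [0.1, -0.2, 0.15, 0.05, -0.1, 5.1, 4.9, 5.2, 4.8, 5.05]
smooth = moving_average_filter(raw, window=3)
print([round(v, 2) for v in smooth])
```

The filtered series, rather than the raw one, would then feed the nonlinear identification model, which is the role the preprocessing step plays in the scheme described above.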
Procedia PDF Downloads 244
22927 A Predictive Model of Supply and Demand in the State of Jalisco, Mexico
Authors: M. Gil, R. Montalvo
Abstract:
Business Intelligence (BI) has become a major source of competitive advantage for firms around the world. BI has been defined as the process of data visualization and reporting for understanding what happened and what is happening. Moreover, BI has been studied for its predictive capabilities in the context of trade and financial transactions. The current literature has identified that BI permits managers to identify market trends, understand customer relations, and predict demand for their products and services. This last capability of BI has been of special concern to academics, specifically due to its power to build predictive models adaptable to specific time horizons and geographical regions. However, the current literature on BI focuses on predicting specific markets and industries, because the impact of such predictive models was relevant to specific industries or organizations. The existing literature has not yet developed a predictive BI model that takes into consideration the whole economy of a geographical area. This paper seeks to create a predictive BI model that shows the bigger picture of a geographical area. It uses a data set from the Secretary of Economic Development of the state of Jalisco, Mexico, which includes all the commercial transactions that occurred in the state in recent years. By analyzing this data set, it is possible to generate a BI model that predicts supply and demand for specific industries around the state of Jalisco. This research makes at least three contributions. Firstly, a methodological contribution to the BI literature by generating the predictive supply and demand model. Secondly, a theoretical contribution to the current understanding of BI: the model presented in this paper incorporates the whole picture of the economic field instead of focusing on a specific industry. Lastly, a practical contribution that might be relevant to local governments that seek to improve their economic performance by implementing BI in their policy planning.
Keywords: business intelligence, predictive model, supply and demand, Mexico
Procedia PDF Downloads 123
22926 A New Block Cipher for Resource-Constrained Internet of Things Devices
Authors: Muhammad Rana, Quazi Mamun, Rafiqul Islam
Abstract:
In the Internet of Things (IoT), many devices are connected and accumulate a vast amount of data. These Internet-driven raw data need to be transferred securely to the end users via dependable networks. Consequently, the challenges of IoT security in various IoT domains are paramount. Cryptography is applied to secure networks for authentication, confidentiality, data integrity and access control. However, due to the resource-constrained properties of IoT devices, conventional ciphers may not be suitable in all IoT networks. This paper designs a robust and effective lightweight cipher to secure the IoT environment and meet the resource-constrained nature of IoT devices. We propose a symmetric, block-cipher-based lightweight cryptographic algorithm. The proposed algorithm increases the complexity of the block cipher while maintaining the lowest computational requirements possible. The proposed algorithm efficiently constructs the key register updating technique, reduces the number of encryption rounds, and adds a new layer between the encryption and decryption processes.
Keywords: internet of things, cryptography, block cipher, S-box, key management, security, network
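To make the design elements concrete (S-box substitution, round keys drawn from an updating key register, and a small round count), here is a toy and cryptographically insecure sketch; the block size, round count and register update rule are illustrative assumptions, not the proposed cipher:

```python
# Toy 8-byte substitution-permutation sketch (NOT secure, illustration only).
SBOX = [(x * 7 + 3) % 256 for x in range(256)]  # bijective affine S-box
INV_SBOX = [0] * 256
for i, v in enumerate(SBOX):
    INV_SBOX[v] = i

def round_keys(key, rounds=4):
    """Key register update: rotate one byte and XOR in a round constant."""
    keys, reg = [], list(key)
    for r in range(rounds):
        reg = reg[1:] + [reg[0] ^ r]  # rotate register + round constant
        keys.append(bytes(reg))
    return keys

def encrypt_block(block, key):
    state = list(block)
    for rk in round_keys(key):
        state = [SBOX[b ^ k] for b, k in zip(state, rk)]  # add key, substitute
        state = state[1:] + state[:1]                     # byte rotation layer
    return bytes(state)

def decrypt_block(block, key):
    state = list(block)
    for rk in reversed(round_keys(key)):
        state = state[-1:] + state[:-1]                   # undo the rotation
        state = [INV_SBOX[b] ^ k for b, k in zip(state, rk)]
    return bytes(state)

key = bytes(range(8))
ct = encrypt_block(b"IoTnode1", key)
print(decrypt_block(ct, key) == b"IoTnode1")  # True
```

The small round count and byte-level operations mirror the lightweight-cipher goal of minimizing per-block work on constrained hardware, but a real design would need a cryptographically strong S-box and key schedule.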
Procedia PDF Downloads 113
22925 BodeACD: Buffer Overflow Vulnerabilities Detecting Based on Abstract Syntax Tree, Control Flow Graph, and Data Dependency Graph
Authors: Xinghang Lv, Tao Peng, Jia Chen, Junping Liu, Xinrong Hu, Ruhan He, Minghua Jiang, Wenli Cao
Abstract:
As buffer overflow is one of the most dangerous vulnerabilities, its effective detection is extremely necessary. Traditional detection methods are not accurate enough and consume too many resources to cope with today's complex and enormous code environments. In order to resolve the above problems, we propose a method for buffer overflow detection based on the Abstract syntax tree, Control flow graph, and Data dependency graph (BodeACD) in C/C++ programs with source code. Firstly, BodeACD constructs function samples of buffer overflows that are available on GitHub, then represents them as code representation sequences, which fuse the control flow, data dependency, and syntax structure of source code to reduce information loss during code representation. Finally, BodeACD learns vulnerability patterns for vulnerability detection through deep learning. The results of the experiments show that BodeACD has increased precision and recall by 6.3% and 8.5% respectively compared with the latest methods, which can effectively improve vulnerability detection and reduce the false-positive and false-negative rates.
Keywords: vulnerability detection, abstract syntax tree, control flow graph, data dependency graph, code representation, deep learning
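The code-representation step, serializing a function's syntax structure into a token sequence a model can learn from, can be illustrated with Python's ast module (the paper itself targets C/C++ source):

```python
import ast

def ast_node_sequence(source: str):
    """Sequence of AST node type names (breadth-first via ast.walk),
    a simple syntactic representation a model could consume."""
    tree = ast.parse(source)
    return [type(node).__name__ for node in ast.walk(tree)]

snippet = """
def copy_into(buf, data):
    for i in range(len(data)):
        buf[i] = data[i]   # no bounds check on buf: overflow-prone pattern
"""
seq = ast_node_sequence(snippet)
print(seq[:6])
```

A BodeACD-style pipeline would additionally fuse control-flow and data-dependency edges into the sequence; the syntax-only view above is the first of the three components named in the title.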
Procedia PDF Downloads 170
22924 Fueling Efficient Reporting and Decision-Making in Public Health with Large Data Automation in Remote Areas, Neno Malawi
Authors: Wiseman Emmanuel Nkhomah, Chiyembekezo Kachimanga, Julia Huggins, Fabien Munyaneza
Abstract:
Background: Partners In Health – Malawi introduced an operational research study called the Primary Health Care (PHC) Surveys in 2020, which seeks to assess the progress of care delivery in the district. The study consists of 5 long surveys, namely: facility assessment, general patient, provider, sick child, and antenatal care (ANC), primarily conducted in 4 health facilities in Neno district. These facilities include Neno district hospital, Dambe health centre, Chifunga and Matope. Usually, these annual surveys are conducted from January, and the target is to present the final report by June. Once data are collected and analyzed, a series of reviews take place before reaching the final report. Initially, the manual process took over 9 months to produce the final report, and initial findings reported that only about 76.9% of the data added up when cross-checked with paper-based sources. Purpose: The aim of this approach is to move away from manually pulling the data, redoing the analysis, and reporting, a process often associated not only with delays and inconsistencies in reporting but also with poor data quality if not done carefully. This automation approach was meant to utilize the features of new technologies to create visualizations, reports, and dashboards in Power BI that draw directly from the data source, CommCare, and hence require only a single click of a ‘refresh’ button to populate the updated information in visualizations, reports, and dashboards at once. Methodology: We transformed paper-based questionnaires into electronic ones using the CommCare mobile application. We further connected the CommCare mobile app directly to Power BI using an Application Programming Interface (API) connection as the data pipeline. This provided the chance to create visualizations, reports, and dashboards in Power BI. Contrary to the process of manually collecting data in paper-based questionnaires, entering them into ordinary spreadsheets, and conducting the analysis anew every time a report is prepared, the team utilized CommCare and Microsoft Power BI technologies. We utilized validations and logic in CommCare to capture data with fewer errors. We utilized Power BI features to host the reports online by publishing them as a cloud-computing process. We switched from sharing ordinary report files to sharing a link with potential recipients, giving them the freedom to dig deeper into extra findings within the Power BI dashboards and to export to any format of their choice. Results: This data automation approach reduced the research timeline from the initial 9 months to 5. It also improved the quality of the data findings from the original 76.9% to 98.9%. This brought confidence to draw conclusions from the findings that help in decision-making and gave opportunities for further research. Conclusion: These results suggest that automating the research data process has the potential to reduce the overall amount of time spent and to improve data quality. On this basis, the concept of data automation should be given serious consideration when conducting operational research, for efficiency and decision-making.
Keywords: reporting, decision-making, Power BI, CommCare, data automation, visualizations, dashboards
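Before visualization, nested survey submissions must be flattened into tabular rows a BI tool can consume. A minimal sketch of that transformation follows; the field names are invented, and the study's actual CommCare-to-Power BI link used an API connection rather than local data:

```python
def flatten_record(record, parent_key="", sep="."):
    """Flatten a nested survey submission into one flat row for BI tooling."""
    row = {}
    for key, value in record.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten_record(value, new_key, sep))  # recurse into groups
        else:
            row[new_key] = value
    return row

# Invented example shaped like an ANC survey submission
submission = {
    "facility": "Dambe",
    "form": {"survey": "ANC", "visit": {"number": 2, "completed": True}},
}
print(flatten_record(submission))
# {'facility': 'Dambe', 'form.survey': 'ANC', 'form.visit.number': 2, 'form.visit.completed': True}
```

Rows like this, refreshed automatically through the API pipeline, are what let a single ‘refresh’ click repopulate every dashboard instead of re-entering spreadsheets by hand.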
Procedia PDF Downloads 116
22923 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models
Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah
Abstract:
In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emergent stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via maximum likelihood estimation. While using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, despite the observed noise varying from a normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which makes our estimation algorithm suitable for large data sets, as the algorithm has good convergence properties.
Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model
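For intuition, an Ornstein-Uhlenbeck path of the kind used to model log-volatility can be simulated with a simple Euler-Maruyama step; the parameter values below are illustrative only:

```python
import math
import random

def simulate_ou(x0=0.0, mu=-1.0, kappa=2.0, sigma=0.3, dt=1/252, n=2520, seed=42):
    """Euler-Maruyama simulation of dX = kappa*(mu - X) dt + sigma dW."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        x = path[-1]
        dx = kappa * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0, 1)
        path.append(x + dx)
    return path

path = simulate_ou()          # ten "years" of daily log-volatility
tail = path[len(path) // 2:]  # discard the transient from x0
print("long-run mean ~", round(sum(tail) / len(tail), 2))  # reverts toward mu
```

The mean-reverting pull toward mu is what makes the OU family a natural model for log-volatility, which wanders but does not drift away indefinitely.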
Procedia PDF Downloads 242
22922 Detecting Black Hole Attacks in Body Sensor Networks
Authors: Sara Alshehri, Bayan Alenzi, Atheer Alshehri, Samia Chelloug, Zainab Almry, Hussah Albugmai
Abstract:
This paper concerns body area sensor networks that collect signals around a human body. Black hole attacks are the main security challenge because the data traffic can be dropped at any node. The focus of our proposed solution is to efficiently route data packets while detecting black hole nodes.
Keywords: body sensor networks, security, black hole, routing, broadcasting, OMNeT++
Procedia PDF Downloads 645
22921 Data Analytics of Electronic Medical Records Shows Age-Related Differences in Diagnosis of Coronary Artery Disease
Authors: Maryam Panahiazar, Andrew M. Bishara, Yorick Chern, Roohallah Alizadehsani, Dexter Hadleye, Ramin E. Beygui
Abstract:
Early detection plays a crucial role in enhancing the outcome for a patient with coronary artery disease (CAD). We utilized a big data analytics platform on ~23,000 patients with CAD from a total of 960,129 UCSF patients over 8 years. We traced the patients from their first encounter with a physician to the diagnosis and treatment of CAD. Characteristics such as demographic information, comorbidities, vital signs, lab tests, medications, and procedures are included. There are statistically significant gender-based differences in patients younger than 60 years old in the time from the first physician encounter to coronary artery bypass grafting (CABG), with a p-value = 0.03. There are no significant differences in patients between 60 and 80 years old (p-value = 0.8) or older than 80 (p-value = 0.4), with a 95% confidence interval. Recognizing this would support significant changes to the guidelines so that patients are referred for diagnostic tests expeditiously, improving the outcome by avoiding delays in treatment.
Keywords: electronic medical records, coronary artery disease, data analytics, young women
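A group difference in time-to-CABG of the kind reported can be tested with a two-sample t-test; the numbers below are invented for illustration and are not the study's data:

```python
from scipy import stats

# Invented days from first encounter to CABG for two groups (<60 years old)
days_group_a = [410, 395, 460, 520, 480, 505, 445, 430]
days_group_b = [300, 320, 280, 350, 310, 295, 340, 330]

# Welch's t-test: does not assume equal variances between the groups
t_stat, p_value = stats.ttest_ind(days_group_a, days_group_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p suggests a real difference
```

A p-value below the chosen threshold (0.05 in the abstract) is what distinguishes the under-60 comparison from the older age bands, where the reported p-values of 0.8 and 0.4 indicate no detectable difference.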
Procedia PDF Downloads 148
22920 Studying the Effectiveness of Using Narrative Animation on Students’ Understanding of Complex Scientific Concepts
Authors: Atoum Abdullah
Abstract:
The purpose of this research is to determine the extent to which computer animation and narration affect students’ understanding of complex scientific concepts and improve their exam performance, compared to traditional lectures that include PowerPoint slides with text and static images. A mixed-methods design was used for data collection, including quantitative and qualitative data. Quantitative data were collected using a pre- and post-test method and a close-ended questionnaire. Qualitative data were collected through an open-ended questionnaire. A pre- and post-test strategy was used to measure the level of students’ understanding with and without the use of animation. The test included multiple-choice questions to test factual knowledge, open-ended questions to test conceptual knowledge, and label-the-diagram questions to test application knowledge. The results showed that students, on average, performed significantly higher on the post-test compared to the pre-test in all areas of acquired knowledge. However, the increase in the post-test score with respect to the acquisition of conceptual and application knowledge was higher than the increase with respect to the acquisition of factual knowledge. This result demonstrates that animation is more beneficial for acquiring deeper, conceptual, and cognitive knowledge than for acquiring factual knowledge only.
Keywords: animation, narration, science, teaching
Procedia PDF Downloads 170
22919 Asymmetrical Informative Estimation for Macroeconomic Model: Special Case in the Tourism Sector of Thailand
Authors: Chukiat Chaiboonsri, Satawat Wannapan
Abstract:
This paper applied an asymmetric information concept to the estimation of a macroeconomic model of the tourism sector in Thailand. The variables statistically analyzed are Thailand's international and domestic tourism revenues, the expenditures of foreign and domestic tourists, service investments by private sectors, service investments by the government of Thailand, Thailand's service imports and exports, and net service income transfers. All of the data are time-series indices observed between 2002 and 2015. Empirically, the tourism multiplier and accelerator were estimated by two statistical approaches. The first was the Generalized Method of Moments (GMM) model, based on the assumption that the tourism market in Thailand had perfect information (symmetrical data). The second was the Maximum Entropy Bootstrapping (MEboot) approach, based on a process that attempts to deal with imperfect information and reduce uncertainty in data observations (asymmetrical data). In addition, the tourism leakages were investigated by a simple model based on the injections and leakages concept. The empirical findings show that the parameters computed with the MEboot approach differ from those of the GMM method. However, both the MEboot estimation and the GMM model suggest that Thailand's tourism sector is in a period capable of stimulating the economy.
Keywords: Thailand tourism, Maximum Entropy Bootstrapping approach, macroeconomic model, asymmetric information
Procedia PDF Downloads 295
22918 Qualitative Study of Pre-Service Teachers' Imagined Professional World vs. Real Experiences of In-Service Teachers
Authors: Masood Monjezi
Abstract:
English teachers’ pedagogical identity construction is the way teachers go through the process of becoming teachers and how they maintain their teaching selves. The pedagogical identity of teachers is influenced by several factors within the individual and the society. The purpose of this study was to compare the imagined social world of pre-service teachers with the real experiences in-service teachers have had in the context of Iran, to see how prepared the pre-service teachers are with respect to their identity formation. This study used a qualitative approach to data collection and analysis. Structured and semi-structured interviews, focus groups and process logs were used to collect the data. Then, using open coding, the data were analyzed. The findings showed that the imagined world of the pre-service teachers only partly corresponded with the real-world experiences of the in-service teachers, leaving the pre-service teachers unprepared for their real-world teaching profession. The findings suggest that the current approaches to English teacher training need modification to better prepare the pre-service teachers for the future that awaits them.
Keywords: imagined professional world, in-service teachers, pre-service teachers, real experiences, community of practice, identity
Procedia PDF Downloads 336
22917 Proposing an Optimal Pattern for Evaluating the Performance of the Staff Management of the Water and Sewage Organization in Western Azerbaijan Province, Iran
Authors: Tohid Eskandarzadeh, Nader Bahlouli, Turaj Behnam, Azra Jafarzadeh
Abstract:
The purpose of the study reported in this paper was to propose an optimal pattern for evaluating the staff management performance of the water and sewage organization. The performance prism model was used to evaluate the following significant dimensions of performance: organizational strategies, organizational processes, organizational capabilities, and stakeholders’ partnership and satisfaction. In the present study, a standard, valid and reliable questionnaire was used to obtain data on the five dimensions of the performance prism model. The sample consisted of 169 respondents selected from the staff of the water and waste-water organization in western Azerbaijan, Iran. The alpha coefficient was used to check the reliability of the data-collection instrument, which was measured to be above 0.7. The obtained data were statistically analyzed by means of SPSS version 18. The results of the data analysis indicated that the performance of the staff management of the water and waste-water organization in western Azerbaijan was acceptable in terms of organizational strategies, organizational processes, and stakeholders’ partnership and satisfaction. Nevertheless, the performance of the staff management with respect to organizational capabilities was average. Indeed, the researchers drew the conclusion that the current performance of the staff management in this organization in western Azerbaijan was less than ideal.
Keywords: performance evaluation, performance prism model, water, waste-water organization
Procedia PDF Downloads 328
22916 The Nutrient Foramen of the Scaphoid Bone – A Morphological Study
Authors: B. V. Murlimanju, P. J. Jiji, Latha V. Prabhu, Mangala M. Pai
Abstract:
Background: The scaphoid is the most commonly fractured bone of the wrist. A fracture may disrupt the vessels and result in avascular necrosis of the bone. The objective of the present study was to investigate the morphology and number of the nutrient foramina in dried cadaveric scaphoid bones of the Indian population. Methods: The present study included 46 scaphoid bones (26 right sided and 20 left sided) obtained from the gross anatomy laboratory of our institution. The bones were macroscopically examined for nutrient foramina, and data on their number were collected, tabulated, and analyzed. Results: All of the specimens (100%) exhibited nutrient foramina over the non-articular surfaces. The foramina were observed only over the palmar and dorsal surfaces of the scaphoid bones, both proximal and distal to the mid waist. The number of foramina per bone ranged between 9 and 54: 2-24 over the palmar surface, 7-36 over the dorsal surface, 2-24 proximal to the waist, and 3-39 distal to the waist. Conclusion: We believe the present study provides additional data on the nutrient foramina of the scaphoid bones. These data are informative for orthopedic surgeons and would be helpful in hand surgery. Morphological knowledge of the vasculature, the foramina of entry, and their number is required to understand avascular necrosis of the proximal scaphoid and non-union of fractures at the waist of the scaphoid. Keywords: avascular necrosis, nutrient, scaphoid, vascular
Procedia PDF Downloads 344
22915 Overview of Development of a Digital Platform for Building Critical Infrastructure Protection Systems in Smart Industries
Authors: Bruno Vilić Belina, Ivan Župan
Abstract:
Smart industry concepts and digital transformation are popular in many industries. Companies develop their own digital platforms, which play an important role in innovation and transactions. The main idea of smart industry digital platforms is central data collection, industrial data integration, and data usage for smart applications and services. This paper presents the development of a digital platform for building critical infrastructure protection systems in smart industries. The research covers different service contracting modalities in service level agreements (SLAs), customer relationship management (CRM) relations, and trends and changes in business architectures (especially process business architecture), for the purpose of developing infrastructural production and distribution networks, information infrastructure meta-models, and the generic processes demanded of critical infrastructure owners by critical infrastructure law, while satisfying cybersecurity requirements and taking hybrid threats into account. Keywords: cybersecurity, critical infrastructure, smart industries, digital platform
Procedia PDF Downloads 106
22914 Innovation in Traditional Game: A Case Study of Trainee Teachers' Learning Experiences
Authors: Malathi Balakrishnan, Cheng Lee Ooi, Chander Vengadasalam
Abstract:
The purpose of this study is to explore trainee teachers’ learning experiences in innovating traditional games during a traditional game carnival. It examines issues arising from multiple case studies of trainee teachers’ experiences of innovating traditional games. A qualitative methodology was adopted, using observations, semi-structured interviews, and content analysis of reflective journals on the trainee teachers’ experiences of creating and implementing innovative traditional games. The participants were twelve groups totalling 36 trainee teachers registered for the Sports and Physical Education Management Course during the traditional game carnival. Semi-structured interviews were administered after the trainee teachers had created their innovative traditional games, and reflective journals were collected after the carnival day and their content analyzed. Inductive data analysis was used to evaluate the various data sources, with all collected data processed in NVivo. The inductive findings were interpreted through the lens of Self-Determination Theory (SDT). The findings showed that the trainee teachers had positive game participation experiences, gained knowledge about traditional games, and were positively motivated to innovate the games. The data also revealed the influence of themes such as cultural significance and creativity. It can be concluded that the organized game carnival, a coursework requirement of the Institute of Teacher Training Malaysia, was able to enhance the trainee teachers’ innovative thinking skills. SDT, as a multidimensional approach to motivation, proved useful for interpreting these learning experiences. Keywords: learning experiences, innovation, traditional games, trainee teachers
Procedia PDF Downloads 330
22913 Trend Analysis for Extreme Rainfall Events in New South Wales, Australia
Authors: Evan Hajani, Ataur Rahman, Khaled Haddad
Abstract:
Climate change will affect the hydrological cycle in many ways, such as increases in evaporation and rainfall. There has been growing interest among researchers in identifying the nature of trends in historical rainfall data in many parts of the world. This paper examines the trends in annual maximum rainfall data from 30 stations in New South Wales, Australia, using two non-parametric tests, the Mann-Kendall (MK) and Spearman’s Rho (SR) tests. Rainfall data were analyzed for fifteen durations ranging from 6 min to 3 days. It is found that the sub-hourly durations (6, 12, 18, 24, 30, and 48 minutes) show statistically significant positive (upward) trends, whereas longer duration (sub-daily and daily) events generally show a statistically significant negative (downward) trend. It is also found that the MK and SR tests provide notably different results for some of the rainfall event durations considered in this study. Since shorter duration sub-hourly rainfall events show positive trends at many stations, the design rainfall data based on stationary frequency analysis for these durations need to be adjusted to account for the impact of climate change. These shorter durations are most relevant to urban development projects on smaller catchments with much shorter response times. Keywords: climate change, Mann-Kendall test, Spearman’s Rho test, trends, design rainfall
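The Mann-Kendall test applied in this paper can be sketched as follows; the annual-maximum rainfall series below is hypothetical, and the variance formula assumes no tied values (ties would need a correction term).

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: returns (S, Z, two-sided p-value)."""
    n = len(series)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum(
        (series[j] > series[i]) - (series[j] < series[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    # Variance of S under the null hypothesis, assuming no ties.
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)   # continuity correction
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    # Two-sided p-value from the standard normal distribution.
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return s, z, p

# Hypothetical annual-maximum rainfall series (mm) with an upward drift.
rain = [42.0, 45.1, 43.7, 48.2, 47.5, 51.0, 50.3, 54.8, 53.9, 57.2]
s, z, p = mann_kendall(rain)
print(s, round(z, 2), round(p, 4))
```

For this series S is positive with p below 0.01, i.e. a statistically significant upward trend, which is the pattern the paper reports for sub-hourly durations.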
Procedia PDF Downloads 271
22912 The Role of Japan's Land-Use Planning in Farmland Conservation: A Statistical Study of Tokyo Metropolitan District
Authors: Ruiyi Zhang, Wanglin Yan
Abstract:
A strict land-use plan is issued under the City Planning Act to control urbanization and conserve the semi-natural landscape, and the agrarian land resource in the suburbs has indispensable socio-economic value, contributing to the sustainability of the regional environment. However, the agrarian hinterland of the metropolis is witnessing severe farmland conversion and abandonment, while the contribution of land-use planning to farmland conservation in these areas remains unclear. We hypothesize that the current land-use plan contributes to farmland loss. This research therefore investigated the relationship between farmland loss and land-use planning at the municipality level, to provide base data for zoning in the metropolitan suburbs and to help develop a sustainable land-use plan that conserves the agrarian hinterland. Data and methods: 1) Farmland data from the Census of Agriculture and Forestry for 2005 to 2015 and population data for 2015 and 2018 were used to investigate the spatial distribution features of farmland loss in the Tokyo Metropolitan District (TMD) for two periods: 2005-2010 and 2010-2015. 2) The samples were divided by four urbanization factors. 3) DID data and zoning data for 2006 to 2018 were used to specify the urbanization level of zones describing the land-use plan. 4) We then conducted multiple regression between farmland loss, covering both abandonment and conversion amounts, and the described land-use plan in each urbanization scenario and in each period. The results reveal that the land-use plan has a non-negligible relationship with farmland loss in the metropolitan suburbs at the ward-city-town-village level. 1) Urban promotion areas planned larger than necessary, together with unregulated urbanization, promote both farmland conversion and abandonment, and this effect weakens from the inner to the outer suburbs. 2) The effect of the land-use plan on farmland abandonment is more pronounced than that on farmland conversion.
The study advocates that optimizing the land-use plan will help farmland conservation in metropolitan suburbs, contributing to sustainable regional policy making. Keywords: agrarian land resource, land-use planning, urbanization level, multiple regression
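The municipality-level multiple regression described above can be sketched with ordinary least squares; the variable names and simulated data below are purely illustrative, not the study's actual zoning variables or census figures.

```python
import random

def ols(X, y):
    """OLS coefficients via the normal equations (X'X) b = X'y,
    solved with Gaussian elimination. X must include an intercept column."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for col in range(k):                      # forward elimination with pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k                          # back substitution
    for r in range(k - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, k))) / A[r][r]
    return coef

# Hypothetical municipality records: farmland loss regressed on the share of
# urban promotion area and a DID share (names illustrative only).
random.seed(42)
rows, y = [], []
for _ in range(60):
    promo = random.uniform(0.0, 1.0)
    did = random.uniform(0.0, 0.5)
    rows.append([1.0, promo, did])            # intercept + two predictors
    y.append(5.0 + 30.0 * promo + 12.0 * did + random.gauss(0.0, 0.3))

intercept, b_promo, b_did = ols(rows, y)
print(round(intercept, 1), round(b_promo, 1), round(b_did, 1))
```

Recovering the known simulated coefficients (roughly 5, 30, and 12) confirms the fitting step; in the actual study the sign and size of such coefficients would indicate how planning variables relate to conversion and abandonment.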
Procedia PDF Downloads 150
22911 Using a Robot Companion to Detect and Visualize the Indicators of Dementia Progression and Quality of Life of People Aged 65 and Older
Authors: Jeoffrey Oostrom, Robbert James Schlingmann, Hani Alers
Abstract:
This paper describes research into indicators of dementia progression, the automation of quality-of-life assessments, and their visualization. To this end, the Smart Teddy project was initiated to build a smart companion that both monitors a senior citizen and processes the captured data into an insightful dashboard. With around 50 million diagnoses worldwide, dementia proves again and again to be a heavy strain on the lives of many individuals, their relatives, and society as a whole; in 2015, dementia care was estimated to cost 818 billion U.S. dollars globally. The Smart Teddy project aims to take away a portion of the burden from caregivers by automating the collection of certain data, such as movement, geolocation, and sound levels. This paper shows that the Smart Teddy has the potential to become a useful tool for caregivers, though not a complete solution. The Smart Teddy still faces some problems in terms of emotional privacy, but its non-intrusive nature and versatility can make up for this. Keywords: dementia care, medical data visualization, quality of life, smart companion
Procedia PDF Downloads 140
22910 The Social Aspects of Code-Switching in Online Interaction: The Case of Saudi Bilinguals
Authors: Shirin Alabdulqader
Abstract:
This research investigates code-switching (CS) between English and Arabic and the CS practices of Saudi online users through a Translanguaging (TL) lens, for a more inclusive view of the nature of the data. It examines Digitally Mediated Communication (DMC), specifically the WhatsApp and Twitter platforms, in order to understand how users employ online resources to communicate with others on a daily basis. The project looks beyond language and considers the multimodal affordances (visual and audio means) that interlocutors utilise in their online communicative practices to shape their online social existence. This exploratory study is based on a data-driven interpretivist epistemology, as it aims to understand how meaning (reality) is created by individuals within different contexts. The project used a mixed-method approach, combining qualitative and quantitative methods: in the former, data were collected from online chats and interview responses, while in the latter a questionnaire was employed to examine the frequency of, and relations between, the participants’ linguistic and non-linguistic practices and their social behaviours. The participants were eight bilingual Saudi nationals (men and women, aged between 20 and 50 years old) who interacted with others online. These participants provided their online interactions, participated in an interview, and responded to a questionnaire. The study data comprised 194 WhatsApp chats and 122 Tweets, analysed and interpreted at three levels: conversational turn taking and CS; the linguistic description of the data; and CS and persona. This project contributes to the emerging field of systematically analysing online Arabic data, and to the fields of multimodality and bilingual sociolinguistics. The findings are reported for each of the three levels.
For conversational turn taking, the analysis revealed that CS was used to accomplish negotiation and develop meaning in the conversation. With regard to the linguistic practices in the CS data, the majority of code-switched words were content morphemes. The third level of interpretation concerns CS and its relationship with identity; two types of identity were indexed: absolute identity and contextual identity. This study contributes to the DMC literature and bridges some existing gaps. Most, if not all, of the findings support the notion of TL: multiliteracy is one’s ability to decode multimodal communication, and this multimodality contributes to meaning. Whether the online affordances in question are used by monolinguals or multilinguals, and are perceived not only by specific generations but by any online multiliterate, the study provides the linguistic features of the CS utilised by Saudi bilinguals and determines the relationship between these features and the contexts in which they appear. Keywords: social media, code-switching, translanguaging, online interaction, Saudi bilinguals
Procedia PDF Downloads 131
22909 Developing a Deep Understanding of the Immune Response in Hepatitis B Virus Infected Patients Using a Knowledge Driven Approach
Authors: Hanan Begali, Shahi Dost, Annett Ziegler, Markus Cornberg, Maria-Esther Vidal, Anke R. M. Kraft
Abstract:
Chronic hepatitis B virus (HBV) infection can be treated with nucleot(s)ide analogs (NAs), for example, which inhibit HBV replication. However, NAs have hardly any influence on the functional cure of HBV, which is defined by hepatitis B surface antigen (HBsAg) loss. NAs need to be taken life-long and are not available to all patients worldwide. Additionally, NA-treated patients are still at risk of developing cirrhosis, liver failure, or hepatocellular carcinoma (HCC). Although each patient has the same components of the immune system, immune responses vary between patients. Therefore, a deeper understanding of the immune response against HBV in different patients is necessary to understand the parameters leading to HBV cure and to use this knowledge to optimize HBV therapies. This requires seamless integration of an enormous amount of diverse and fine-grained data on viral markers, e.g., hepatitis B core-related antigen (HBcrAg) and hepatitis B surface antigen (HBsAg). The data integration system relies on the assumption that profiling human immune systems requires the analysis of various variables (e.g., demographic data, treatments, pre-existing conditions, immune cell response, or HLA typing) rather than only one. However, the values of these variables are collected independently and presented in a myriad of formats, e.g., Excel files, textual descriptions, lab book notes, and images of flow cytometry dot plots. Additionally, patients can be identified differently across these analyses. This heterogeneity complicates the integration of variables, and data management techniques are needed to create a unified view in which individual formats and identifiers are transparent when profiling human immune systems.
The proposed study (HBsRE) aims at integrating heterogeneous data sets of 87 chronically HBV-infected patients, e.g., clinical data, immune cell response, and HLA typing, with knowledge encoded in biomedical ontologies and open-source databases into a knowledge-driven framework. This technique enables us to harmonize and standardize heterogeneous datasets within the defined data integration model, which will be evaluated as a knowledge graph (KG). KGs are data structures that represent knowledge and data as factual statements using a graph data model. Finally, an analytic data model will be applied on top of the KG in order to develop a deeper understanding of the immune profiles of various patients and to evaluate the factors playing a role in a holistic profile of patients with HBsAg loss. Additionally, our objective is to utilize this unified approach to stratify patients for new effective treatments. This study is developed in the context of the project “Transforming big data into knowledge: for deep immune profiling in vaccination, infectious diseases, and transplantation (ImProVIT)”, carried out by a multidisciplinary team of computer scientists, infection biologists, and immunologists. Keywords: chronic hepatitis B infection, immune response, knowledge graphs, ontology
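The graph data model mentioned above — knowledge and data represented as factual statements — can be sketched with plain subject-predicate-object triples and a small pattern query. All identifiers and predicates below are hypothetical, not the HBsRE project's actual schema.

```python
# Facts as (subject, predicate, object) triples, the core of any KG.
triples = {
    ("patient:001", "hasTreatment", "NA"),
    ("patient:001", "hasHBsAgLevel", "declining"),
    ("patient:001", "hasHLAType", "HLA-A*02:01"),
    ("patient:002", "hasTreatment", "NA"),
    ("patient:002", "hasHBsAgLevel", "stable"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return sorted(
        t for t in triples
        if (s is None or t[0] == s)
        and (p is None or t[1] == p)
        and (o is None or t[2] == o)
    )

# Which patients show a declining HBsAg level?
decliners = [s for s, _, _ in query(p="hasHBsAgLevel", o="declining")]
print(decliners)  # → ['patient:001']
```

In practice a KG framework would use RDF and SPARQL rather than Python sets, but the triple structure and wildcard matching are the same idea that lets independently collected variables be queried through one unified view.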
Procedia PDF Downloads 108
22908 Optimization of Beneficiation Process for Upgrading Low Grade Egyptian Kaolin
Authors: Nagui A. Abdel-Khalek, Khaled A. Selim, Ahmed Hamdy
Abstract:
Kaolin is a naturally occurring ore predominantly containing the mineral kaolinite in addition to some gangue minerals. Typical impurities present in kaolin ore are quartz, iron oxides, titanoferrous minerals, mica, feldspar, organic matter, etc. The main coloring impurity, particularly in the ultrafine size range, is titanoferrous minerals. Kaolin is used in many industrial applications, such as sanitary ware, tableware, ceramics, paint, and paper, each of which has its own specifications. For most industrial applications, kaolin should be processed into refined clay that matches standard specifications; for example, kaolin used in the paper and paint industries needs high brightness and low yellowness. Egyptian kaolin is not subjected to any beneficiation process: Egyptian companies apply selective mining followed by, in some localities, crushing and size reduction only. Such low-quality kaolin can be used in refractory and pottery production but not in the white ware and paper industries. This paper studies the amenability to beneficiation of an Egyptian kaolin ore from the El-Teih locality, Sinai, to make it suitable for different industrial applications. Attrition scrubbing and classification followed by magnetic separation are applied to remove the associated impurities: attrition scrubbing and classification separate the coarse silica and feldspars, while wet high intensity magnetic separation removes colored contaminants such as iron oxide and titanium oxide. Different variables affecting the magnetic separation process, such as solids percentage, magnetic field, matrix loading capacity, and retention time, are studied.
The results indicated that a substantial decrease in iron oxide (from 1.69% to 0.61%) and TiO2 (from 3.1% to 0.83%) contents, as well as an improvement in iso-brightness (from 63.76% to 75.21%) and whiteness (from 79.85% to 86.72%) of the product, can be achieved. Keywords: kaolin, titanoferrous minerals, beneficiation, magnetic separation, attrition scrubbing, classification
Procedia PDF Downloads 361
22907 The Challenge of Characterising Drought Risk in Data Scarce Regions: The Case of the South of Angola
Authors: Natalia Limones, Javier Marzo, Marcus Wijnen, Aleix Serrat-Capdevila
Abstract:
In this research, we developed a structured approach for detecting the areas at the highest levels of drought risk that is suitable for data-scarce environments. The methodology is based on recent scientific outcomes and methods and can be easily adapted to different contexts in successive exercises. The research reviews the history of drought in the south of Angola and characterizes the hazard experienced in the episode beginning in 2012, focusing on the meteorological and hydrological drought types. Only global open data from modeling or remote sensing were used to describe the hydroclimatological variables, since there are almost no ground data in this part of the country. The study also portrays the socioeconomic vulnerabilities and the exposure to the phenomenon in the region in order to characterize the risk fully. As a result, a map of the areas at the highest risk in the south of the country is produced, which is one of the main outputs of this work. It was also possible to confirm that the set of indicators used revealed different drought vulnerability profiles in the south of Angola, and consequently several varieties of priority areas prone to distinctive impacts were recognized. The results demonstrated that most of the region experienced a severe multi-year meteorological drought that triggered an unprecedented exhaustion of the surface water resources, and that the majority of the socioeconomic impacts started soon after the identified onset of these processes. Keywords: drought risk, exposure, hazard, vulnerability
Procedia PDF Downloads 191
22906 Sustainability in Hospitality: An Inevitable Necessity in New Age with Big Environmental Challenges
Authors: Majid Alizadeh, Sina Nematizadeh, Hassan Esmailpour
Abstract:
The mutual effects of hospitality and the environment are undeniable; the tourism industry has major harmful effects on the environment, and hotels, as one of the most important pillars of the hospitality industry, have significant environmental impacts. Green marketing is a promising strategy in response to growing concerns about the environment. A green hotel marketing model was proposed using a grounded theory approach in the hotel industry, carried out as a mixed-method study. Data gathering in the qualitative phase was done through a literature review and in-depth, semi-structured interviews with 10 experts in green marketing, selected using the snowball technique. Following primary analysis, open, axial, and selective coding of the data yielded 69 concepts, 18 categories, and six dimensions, with the green hotel (green product) adopted as the core phenomenon. In the quantitative phase, data were gathered using 384 questionnaires filled out by hotel guests, and descriptive statistics and structural equation modeling (SEM) were used for data analysis. The results indicated that the mediating role of behavioral response between ecological literacy, trust, the marketing mix, and performance was significant. The green marketing mix, as a strategy, had a significant and positive effect on guests’ behavioral response, corporate green image, and the financial and environmental performance of hotels. Keywords: green marketing, sustainable development, hospitality, grounded theory, structural equation model
Procedia PDF Downloads 81