Abstracts | Computer and Information Engineering
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3580

World Academy of Science, Engineering and Technology

[Computer and Information Engineering]

Online ISSN : 1307-6892

3580 The Design Method of Artificial Intelligence Learning Picture: A Case Study of DCAI's New Teaching

Authors: Weichen Chang

Abstract:

To create a guided teaching method for AI generative drawing design, this paper develops a set of teaching models for AI generative drawing (DCAI) that combines learning modes such as problem-solving, thematic inquiry, phenomenon-based learning, task-oriented learning, and DFC. Through an information-security AI picture-book guided learning program, participatory action research (PAR) and interview methods were applied to explore how the dual knowledge of Context and ChatGPT (DCAI) guides the development of students' AI learning skills. In the interviews, the students highlighted five main learning outcomes (self-study, critical thinking, knowledge generation, cognitive development, and presentation of work) as well as the challenges of implementing the model. Through DCAI, students strengthen their shared understanding of generative drawing analysis and group cooperation, and they gain knowledge that can enhance AI capabilities in DCAI inquiry and in future life. The paper reaches three conclusions: (1) used well, DCAI can help students explore the value of their knowledge through the power of stories and find meaning in knowledge communication; (2) analyzing the integrity and coherence of a story through its context achieves the narrative tension of ‘starting and ending’; (3) ChatGPT can be used to extract inspiration, arrange story compositions, and craft prompts that communicate with people and convey emotion. New methods of knowledge construction will therefore be among the effective approaches to AI learning in the face of artificial intelligence, offering new thinking and new expression for interdisciplinary design and design-education practice.

Keywords: artificial intelligence, task-oriented, contextualization, design education

Procedia PDF Downloads 0
3579 Assessing Customer Relationship Management Practice in the Case of Wegagen Bank of Ethiopia

Authors: Rina G/Micheal

Abstract:

The objective of this study is to examine the practice of CRM application in Wegagen Bank. A quantitative approach with a descriptive design was used. Both primary and secondary sources were used to gather data based on six dimensions of CRM (customer acquisition, customer response, customer knowledge, customer information system, customer value evaluation, and customer information process). The study investigates customers' and employees' perceptions of the CRM practices of selected Wegagen Bank branches in Addis Ababa. Data were collected from a sample of 109 respondents drawn by purposive sampling from tier-1 branches (Teklhaymanot, Beklobet, Gofa, Bole, and Meskel Square) of Wegagen Bank. The study shows that the practice of CRM application in Wegagen Bank is at an average level: the customer knowledge dimension shows the highest achievement, while customer information process practices are insufficient. It is therefore suggested that Wegagen Bank keep working on customer knowledge and, for the customer information process, adopt a system that makes it easier for customers to do business with the bank by using up-to-date technologies; the bank should also use a computer system for recording customers' requests and the services rendered in order to beat the stiff competition and achieve its goals.

Keywords: CRM, customer acquisition, customer response, customer knowledge, customer information system, customer value evaluation, customer information process

Procedia PDF Downloads 0
3578 Spatial Pattern and Predictors of Malaria in Ethiopia: Application of Autologistic Spatial Regression

Authors: Melkamu A. Zeru, Yamral M. Warkaw, Aweke A. Mitku, Muluwerk Ayele

Abstract:

Introduction: Malaria is a severe health threat worldwide, mainly in Africa. It is a major cause of health problems, and the risk of malaria morbidity and mortality is characterized by spatial variation across the country. This study aimed to investigate the spatial patterns and predictors of malaria distribution in Ethiopia. Methods: A weighted sample of 15,239 individuals with rapid diagnostic test results was obtained from the Central Statistical Agency and the Ethiopia Malaria Indicator Survey of 2015. Global Moran's I and Moran scatter plots were used to determine the distribution of malaria cases, whereas the local Moran's I statistic was used to identify exposed areas. Machine learning was used for variable reduction, and the statistical software R, Stata, and Python were used for data management and analysis. An autologistic spatial binary regression model was used to investigate the predictors of malaria. Results: The final autologistic regression model reported that male clients had significantly higher odds of malaria than female clients [AOR=2.401, 95% CI: (2.125-2.713)]. The distribution of malaria differed across regions. The highest incidence was found in Gambela [AOR=52.55, 95% CI: (40.54-68.12)], followed by Beneshangul [AOR=34.95, 95% CI: (27.159-44.963)]. Individuals in Amhara [AOR=0.243, 95% CI: (0.195-0.303)], Oromiya [AOR=0.197, 95% CI: (0.158-0.244)], Dire Dawa [AOR=0.064, 95% CI: (0.049-0.082)], Addis Ababa [AOR=0.057, 95% CI: (0.044-0.075)], Somali [AOR=0.077, 95% CI: (0.059-0.097)], SNNPR [AOR=0.329, 95% CI: (0.261-0.413)], and Harari [AOR=0.256, 95% CI: (0.201-0.325)] had a lower incidence of malaria than those in Tigray. Furthermore, for a one-meter increase in altitude, the odds of a positive rapid diagnostic test (RDT) decreased by 1.6% [AOR=0.984, 95% CI: (0.984-0.984)].
The use of a shared toilet facility was associated with increased odds of malaria [AOR=1.671, 95% CI: (1.504-1.854)]. Adding the spatial autocorrelation variable changed the constant from AOR=0.471 under logistic regression to AOR=0.164 under autologistic regression. Conclusions: This study found that the incidence of malaria in Ethiopia follows a spatial pattern associated with socio-economic, demographic, and geographic risk factors. Spatial clustering of malaria cases occurred in all regions, and the risk of clustering differed across regions. The risk of malaria was higher for those living in houses with soil floors than for those living in houses with cement or ceramic floors. Similarly, households with thatched, metal-and-thin, or other roof types had a higher risk of malaria than those with ceramic-tile roofs. Moreover, using a protective anti-mosquito net reduced the risk of malaria incidence.
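The global Moran's I statistic used above to test for spatial clustering can be sketched in a few lines. This is an illustration of the statistic itself, not the authors' code; the four-unit weight matrix and values are hypothetical:

```python
def morans_i(values, weights):
    """Global Moran's I: spatial autocorrelation of `values` under `weights`.

    values  -- list of observations, one per areal unit
    weights -- n x n spatial weight matrix (e.g. 1 for neighbours, 0 otherwise)
    """
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    # Cross-products of deviations for every weighted pair of units
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / w_sum) * (num / den)

# Four units in a row with a smoothly increasing outcome: positive autocorrelation.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i([1, 2, 3, 4], w))  # positive (here 1/3); alternating values give a negative I
```

Values of I near +1 indicate clustering of similar values (as reported for malaria cases), while values near -1 indicate a checkerboard pattern.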

Keywords: malaria, Ethiopia, autologistic regression, spatial model, spatial clustering

Procedia PDF Downloads 0
3577 The Impact of ChatGPT on the Healthcare Domain: Perspectives from Healthcare Majors

Authors: Su Yen Chen

Abstract:

Extensive research on ChatGPT has revealed its capabilities and limitations across various clinical, educational, and research contexts, emphasizing crucial issues such as accuracy, transparency, and ethical usage. Studies applying the Technology Acceptance Model (TAM) and Uses and Gratifications Theory have deepened our understanding of the factors that drive user acceptance and satisfaction of ChatGPT for general use. These insights are particularly valuable for examining healthcare-specific behaviors, trust levels, and perceived risks. Despite these advancements, there remains a notable gap in our understanding of how general user perceptions of AI translate into its practical applications within the healthcare sector. This study focuses on examining the perceptions of ChatGPT's impact among 266 healthcare majors in Taiwan, exploring its implications for their career development and utility in clinical practice, medical education, and research. By employing a structured survey with precisely defined subscales, this research aims to probe the breadth of ChatGPT's applications within healthcare, assessing both the perceived benefits and the challenges it presents. The findings from the survey reveal that perceptions and usage of ChatGPT among healthcare majors vary significantly, influenced by factors such as its perceived utility, risk, novelty, and trustworthiness. Graduate students and those who perceive ChatGPT as more beneficial and less risky are particularly inclined to use it more frequently. This increased usage is closely linked to significant impacts on personal career development. Furthermore, ChatGPT's perceived usefulness contributes to its broader impact within the healthcare domain, suggesting that both innovation and practical utility are key drivers of acceptance and perceived effectiveness in professional healthcare settings. Trust emerges as an important factor, especially in clinical settings where the stakes are high. 
The trust that healthcare professionals place in ChatGPT significantly affects its integration into clinical practice and influences outcomes in medical education and research. Thus, ChatGPT's reliability and practical value are critical for its successful adoption in these areas. However, an interesting paradox arises with regard to ease of use. While making ChatGPT more user-friendly is generally seen as beneficial, it also raises concerns among users who have lower levels of trust and perceive higher risks associated with its use. This complex interplay between ease of use and safety concerns necessitates a careful balance, highlighting the need for robust security measures and clear, transparent communication about how AI systems work and their limitations. The study suggests several strategic approaches to enhance the adoption and integration of AI in healthcare. These include targeted training programs for healthcare professionals to increase familiarity with AI technologies, reduce perceived risks, and build trust. Ensuring transparency and conducting rigorous testing are also vital to foster trust and reliability. Moreover, comprehensive policy frameworks are needed to guide the implementation of AI technologies, ensuring high standards of patient safety, privacy, and ethical use. These measures are crucial for fostering broader acceptance of AI in healthcare, as the study contributes to enriching the discourse on AI's role by detailing how various factors affect its adoption and impact.

Keywords: ChatGPT, healthcare, survey study, IT adoption, behaviour, application, concerns

Procedia PDF Downloads 0
3576 Emergency Contraceptive Utilization Among Female College Students in Gondar Town, Central Gondar, Ethiopia, 2023

Authors: Anbesaw Mitiku

Abstract:

Introduction: Contraception is a method of preventing unwanted pregnancy. Emergency contraceptives (EC) are one means of preventing unwanted pregnancy after unprotected sexual intercourse or the failure of another contraceptive, such as condom breakage. Studies conducted in different countries show gaps in both utilization and knowledge. Different studies in Ethiopia indicate that awareness of EC is below 50% and utilization below 10%. Objective: This study assesses emergency contraceptive utilization among female college students of Gondar town, Northwest Ethiopia, 2023. Method: An institution-based cross-sectional study was conducted from April 28 to May 13, 2023. The study population was all students attending college in Gondar town, with a total sample size of 245. Data were entered into Epi Info version 7 and exported to SPSS version 20.0 for analysis. Result: Of the total respondents who had ever had sex, 102 (41.6%) had used EC with the intention of preventing unwanted pregnancy. Of those, 20 (8.2%) used the post-pill and 46 (18.8%) injections. The major source was pharmacies, at 51 (20.8%), followed by hospitals, at 40 (16.3%); 23 (9.4%) of the respondents took ECs only once, 29 (11.8%) took them twice, and 46 (18.8%) started a regular method after ECs. Most of the respondents, 43 (17.6%), discussed ECs with friends. Conclusions and Recommendations: College students' awareness of emergency contraceptives was very low. Emergency contraceptive education should be provided for all college and university students. Correct timing is the single most important determinant of EC effectiveness.

Keywords: practice, emergency contraceptive, unintended, Ethiopia, unwanted

Procedia PDF Downloads 6
3575 Predicting the Impact of Scope Changes on Project Cost and Schedule Using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

In the dynamic landscape of project management, scope changes are an inevitable reality that can significantly impact project performance. These changes, whether initiated by stakeholders, external factors, or internal project dynamics, can lead to cost overruns and schedule delays. Accurately predicting the consequences of these changes is crucial for effective project control and informed decision-making. This study aims to develop predictive models to estimate the impact of scope changes on project cost and schedule using machine learning techniques. The research utilizes a comprehensive dataset containing detailed information on project tasks, including the Work Breakdown Structure (WBS), task type, productivity rate, estimated cost, actual cost, duration, task dependencies, scope change magnitude, and scope change timing. Multiple machine learning models are developed and evaluated to predict the impact of scope changes on project cost and schedule. These models include Linear Regression, Decision Tree, Ridge Regression, Random Forest, Gradient Boosting, and XGBoost. The dataset is split into training and testing sets, and the models are trained using the preprocessed data. Cross-validation techniques are employed to assess the robustness and generalization ability of the models. The performance of the models is evaluated using metrics such as Mean Squared Error (MSE) and R-squared. Residual plots are generated to assess the goodness of fit and identify any patterns or outliers. Hyperparameter tuning is performed to optimize the XGBoost model and improve its predictive accuracy. The feature importance analysis reveals the relative significance of different project attributes in predicting the impact on cost and schedule. Key factors such as productivity rate, scope change magnitude, task dependencies, estimated cost, actual cost, duration, and specific WBS elements are identified as influential predictors. 
The study highlights the importance of considering both cost and schedule implications when managing scope changes. The developed predictive models provide project managers with a data-driven tool to proactively assess the potential impact of scope changes on project cost and schedule. By leveraging these insights, project managers can make informed decisions, optimize resource allocation, and develop effective mitigation strategies. The findings of this research contribute to improved project planning, risk management, and overall project success.
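The train/test split and cross-validation workflow described above can be illustrated with a minimal stdlib sketch. This is a hypothetical one-feature example (scope-change magnitude predicting cost overrun), far simpler than the study's actual pipeline with models such as XGBoost:

```python
def fit_ols(xs, ys):
    """Least-squares slope and intercept for a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def kfold_mse(xs, ys, k=3):
    """Mean squared error averaged over k held-out folds."""
    n = len(xs)
    fold_mse = []
    for f in range(k):
        test_idx = set(range(f, n, k))          # every k-th point held out
        train = [i for i in range(n) if i not in test_idx]
        slope, intercept = fit_ols([xs[i] for i in train],
                                   [ys[i] for i in train])
        errs = [(ys[i] - (slope * xs[i] + intercept)) ** 2 for i in test_idx]
        fold_mse.append(sum(errs) / len(errs))
    return sum(fold_mse) / k

# Synthetic data: cost overrun grows linearly with scope-change magnitude.
x = [0, 1, 2, 3, 4, 5, 6, 7, 8]
y = [2 * xi + 1 for xi in x]
print(kfold_mse(x, y))  # near zero for perfectly linear data
```

Averaging the error over held-out folds, rather than scoring on the training data, is what gives the robustness and generalization assessment the abstract refers to.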

Keywords: cost impact, machine learning, predictive modeling, schedule impact, scope changes

Procedia PDF Downloads 12
3574 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection

Authors: Mahshid Arabi

Abstract:

With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but simultaneously, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, T-Test and ANOVA statistical tests were employed, and results were measured using accuracy, sensitivity, and specificity indicators of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the comprehensive proposed model reduced cyber-attacks by an average of 85%. 
Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the obtained results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.
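As a purely illustrative sketch of the public-key half of such a scheme, the following shows textbook RSA with tiny, insecure primes; it is not the authors' implementation, and real deployments use far larger keys generated by a vetted library:

```python
def toy_rsa_keys(p=61, q=53, e=17):
    """Textbook RSA key generation with toy primes (illustration only)."""
    n = p * q
    phi = (p - 1) * (q - 1)
    d = pow(e, -1, phi)            # modular inverse of e (Python 3.8+)
    return (e, n), (d, n)          # public key, private key

def rsa_encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)            # c = m^e mod n

def rsa_decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)            # m = c^d mod n

pub, priv = toy_rsa_keys()
cipher = rsa_encrypt(65, pub)
print(rsa_decrypt(cipher, priv))   # round-trip recovers the message, 65
```

In practice, RSA is typically used to wrap a symmetric session key (e.g. for AES), which then encrypts the bulk data, matching the combined AES/RSA design described in the abstract.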

Keywords: data protection, digital technologies, information security, modern management

Procedia PDF Downloads 10
3573 Evaluation of the Efficiency of French Language Educational Software for Learners in Semnan Province, Iran

Authors: Alireza Hashemi

Abstract:

In recent decades, language teaching methodology has undergone significant changes due to the advent of computers and the growth of educational software. French language education has also benefited from these developments, and various software has been produced to facilitate the learning of this language. However, the question arises whether these software programs meet the educational needs of Iranian learners, particularly in Semnan Province. The aim of this study is to evaluate the efficiency and effectiveness of French language educational software for learners in Semnan Province, considering educational, cultural, and technical criteria. In this study, content analysis and performance evaluation methods were used to examine the educational software ‘Français Facile’. This software was evaluated based on criteria such as teaching methods, cultural compatibility, and technical features. To collect data, standardized questionnaires and semi-structured interviews with learners in Semnan Province were used. Additionally, the SPSS statistical software was employed for quantitative data analysis, and the thematic analysis method was used for qualitative data. The results indicated that the ‘Français Facile’ software has strengths such as providing diverse educational content and an interactive learning environment. However, some weaknesses include the lack of alignment of educational content with the learning culture of learners in Semnan Province and technical issues in software execution. Statistical data showed that 65% of learners were satisfied with the educational content, but 55% reported issues related to cultural alignment with their needs. This study indicates that to enhance the efficiency of French language educational software, there is a need to localize educational content and improve technical infrastructure. 
Producing locally adapted educational software can improve the quality of language learning and increase the motivation of learners in Semnan Province. This research emphasizes the importance of understanding the cultural and educational needs of learners in the development of educational software and recommends that developers of educational software pay special attention to these aspects.

Keywords: educational software, French language, Iran, learners in Semnan province

Procedia PDF Downloads 11
3572 Lung Disease Detection from Chest X-Ray Images Using Various Transfer Learning Models

Authors: Aicha Akrout, Amira Echtioui, Mohamed Ghorbel

Abstract:

Pneumonia remains a significant global health concern, posing a substantial threat to human lives due to its contagious nature and potentially fatal respiratory complications caused by bacteria, fungi, or viruses. The reliance on chest X-rays for diagnosis, although common, often necessitates expert interpretation, leading to delays and potential inaccuracies in treatment. This study addresses these challenges by employing transfer learning techniques to automate the detection of lung diseases, with a focus on pneumonia. Leveraging three pre-trained models, VGG-16, ResNet50V2, and MobileNetV2, we conducted comprehensive experiments to evaluate their performance. Our findings reveal that the proposed model based on VGG-16 demonstrates superior accuracy, precision, recall, and F1 score, achieving impressive results with an accuracy of 93.75%, precision of 94.50%, recall of 94.00%, and an F1 score of 93.50%. This research underscores the potential of transfer learning in enhancing pneumonia diagnosis and treatment outcomes, offering a promising avenue for improving healthcare delivery and reducing mortality rates associated with this debilitating respiratory condition.
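The accuracy, precision, recall, and F1 figures reported above are standard confusion-matrix metrics. A minimal sketch of how they are computed (illustrative, with hypothetical counts, not the paper's evaluation code):

```python
def classification_metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)        # a.k.a. sensitivity
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts for a pneumonia-vs-normal test set.
acc, prec, rec, f1 = classification_metrics(tp=90, fp=10, fn=5, tn=95)
print(acc, prec, rec, f1)
```

Reporting all four together matters because accuracy alone can look strong on an imbalanced chest X-ray dataset even when one class is poorly detected.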

Keywords: chest x-ray, lung diseases, transfer learning, pneumonia detection

Procedia PDF Downloads 12
3571 Cybersecurity Challenges in Africa

Authors: Chimmoe Fomo Michelle Larissa

Abstract:

The challenges of cybersecurity in Africa are increasingly significant as the continent undergoes rapid digital transformation. With the rise of internet connectivity, mobile phone usage, and digital financial services, Africa faces unique cybersecurity threats. The significance of this study lies in understanding these threats and the multifaceted challenges that hinder effective cybersecurity measures across the continent. The methodologies employed in this study include a comprehensive analysis of existing cybersecurity frameworks in various African countries, surveys of key stakeholders in the digital ecosystem, and case studies of cybersecurity incidents. These methodologies aim to provide a detailed understanding of the current cybersecurity landscape, identify gaps in existing policies, and evaluate the effectiveness of implemented security measures. Major findings of the study indicate that Africa faces numerous cybersecurity challenges, including inadequate regulatory frameworks, insufficient cybersecurity awareness, and a shortage of skilled professionals. Additionally, the prevalence of cybercrime, such as financial fraud, data breaches, and ransomware attacks, exacerbates the situation. The study also highlights the role of international cooperation and regional collaboration in addressing these challenges and improving overall cybersecurity resilience. In conclusion, addressing cybersecurity challenges in Africa requires a multifaceted approach that involves strengthening regulatory frameworks, enhancing public awareness, and investing in cybersecurity education and training. The study underscores the importance of regional and international collaboration in building a robust cybersecurity infrastructure capable of mitigating the risks associated with the continent's digital growth.

Keywords: Africa, cybersecurity, challenges, digital infrastructure, cybercrime

Procedia PDF Downloads 11
3570 Green Crypto Mining: A Quantitative Analysis of the Profitability of Bitcoin Mining Using Excess Wind Energy

Authors: John Dorrell, Matthew Ambrosia, Abilash

Abstract:

This paper employs econometric analysis to quantify the potential profit wind farms can earn by allocating excess wind energy to power bitcoin mining machines. Cryptocurrency mining consumes a substantial amount of electricity worldwide, and wind farms lose a significant amount of the energy they produce because of the intermittent nature of the resource: supply does not always match consumer demand. By pairing the weaknesses of these two technologies, we can improve efficiency and create a sustainable path for mining cryptocurrencies. This paper uses historical wind energy data from the ERCOT network in Texas and cryptocurrency data from 2000-2021 to create 4-year return-on-investment projections. Our research model incorporates the price of bitcoin, the price of the miner, the hash rate of the miner relative to the network hash rate, the block reward, the bitcoin transaction fees awarded to miners, the mining pool fees, the cost of electricity, and the percentage of time the miner will be running, to demonstrate that wind farms generate enough excess energy to mine bitcoin profitably. Excess wind energy can be used as a financial battery, converting electricity that would otherwise be wasted into economic value. Our findings indicate that wind energy producers can earn a profit while taking little, if any, electricity from the grid. According to our results, Bitcoin mining using wind farm curtailment could yield as much as a 1347% and an 805% return on investment with starting dates of November 1, 2021, and November 1, 2022, respectively. This paper is helpful to policymakers and investors in determining efficient and sustainable ways to power our economic future. It proposes a practical solution to the problem of crypto-mining energy consumption and a more sustainable energy future for Bitcoin.
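The profitability inputs listed above combine roughly as follows. This is an illustrative sketch with made-up numbers, not the authors' econometric model:

```python
BLOCKS_PER_DAY = 144  # roughly one Bitcoin block every 10 minutes

def daily_mining_profit(miner_hashrate, network_hashrate, block_reward,
                        fees_per_block, pool_fee, btc_price,
                        power_kw, electricity_price_kwh, uptime=1.0):
    """Expected daily profit (USD) for one miner.

    Hash rates must share a unit; pool_fee and uptime are fractions in [0, 1].
    """
    share = miner_hashrate / network_hashrate          # expected share of blocks
    btc_mined = (share * BLOCKS_PER_DAY * (block_reward + fees_per_block)
                 * (1 - pool_fee) * uptime)
    revenue = btc_mined * btc_price
    electricity_cost = power_kw * 24 * electricity_price_kwh * uptime
    return revenue - electricity_cost

# With curtailed wind power priced at zero, profit equals gross revenue.
profit = daily_mining_profit(miner_hashrate=100, network_hashrate=100_000_000,
                             block_reward=6.25, fees_per_block=0.25,
                             pool_fee=0.02, btc_price=30_000,
                             power_kw=3.0, electricity_price_kwh=0.0)
print(round(profit, 4))
```

Setting the electricity price to zero captures the curtailment scenario: energy that would otherwise be wasted carries no marginal cost, which is what makes the "financial battery" framing work.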

Keywords: bitcoin, mining, economics, energy

Procedia PDF Downloads 13
3569 Energy Efficiency and Sustainability Analytics for Reducing Carbon Emissions in Oil Refineries

Authors: Gaurav Kumar Sinha

Abstract:

The oil refining industry, significant in its energy consumption and carbon emissions, faces increasing pressure to reduce its environmental footprint. This article explores the application of energy efficiency and sustainability analytics as crucial tools for reducing carbon emissions in oil refineries. Through a comprehensive review of current practices and technologies, this study highlights innovative analytical approaches that can significantly enhance energy efficiency. We focus on the integration of advanced data analytics, including machine learning and predictive modeling, to optimize process controls and energy use. These technologies are examined for their potential to not only lower energy consumption but also reduce greenhouse gas emissions. Additionally, the article discusses the implementation of sustainability analytics to monitor and improve environmental performance across various operational facets of oil refineries. We explore case studies where predictive analytics have successfully identified opportunities for reducing energy use and emissions, providing a template for industry-wide application. The challenges associated with deploying these analytics, such as data integration and the need for skilled personnel, are also addressed. The paper concludes with strategic recommendations for oil refineries aiming to enhance their sustainability practices through the adoption of targeted analytics. By implementing these measures, refineries can achieve significant reductions in carbon emissions, aligning with global environmental goals and regulatory requirements.

Keywords: energy efficiency, sustainability analytics, carbon emissions, oil refineries, data analytics, machine learning, predictive modeling, process optimization, greenhouse gas reduction, environmental performance

Procedia PDF Downloads 10
3568 Cybersecurity Strategies for Protecting Oil and Gas Industrial Control Systems

Authors: Gaurav Kumar Sinha

Abstract:

The oil and gas industry is a critical component of the global economy, relying heavily on industrial control systems (ICS) to manage and monitor operations. However, these systems are increasingly becoming targets for cyber-attacks, posing significant risks to operational continuity, safety, and environmental integrity. This paper explores comprehensive cybersecurity strategies for protecting oil and gas industrial control systems. It delves into the unique vulnerabilities of ICS in this sector, including outdated legacy systems, integration with IT networks, and the increased connectivity brought by the Industrial Internet of Things (IIoT). We propose a multi-layered defense approach that includes the implementation of robust network security protocols, regular system updates and patch management, advanced threat detection and response mechanisms, and stringent access control measures. We illustrate the effectiveness of these strategies in mitigating cyber risks and ensuring the resilient and secure operation of oil and gas industrial control systems. The findings underscore the necessity for a proactive and adaptive cybersecurity framework to safeguard critical infrastructure in the face of evolving cyber threats.

Keywords: cybersecurity, industrial control systems, oil and gas, cyber-attacks, network security, IoT, threat detection, system updates, patch management, access control, cybersecurity awareness, critical infrastructure, resilience, cyber threats, legacy systems, IT integration, multi-layered defense, operational continuity, safety, environmental integrity

Procedia PDF Downloads 15
3567 Blockchain Technology for Secure and Transparent Oil and Gas Supply Chain Management

Authors: Gaurav Kumar Sinha

Abstract:

The oil and gas industry, characterized by its complex and global supply chains, faces significant challenges in ensuring security, transparency, and efficiency. Blockchain technology, with its decentralized and immutable ledger, offers a transformative solution to these issues. This paper explores the application of blockchain technology in the oil and gas supply chain, highlighting its potential to enhance data security, improve transparency, and streamline operations. By leveraging smart contracts, blockchain can automate and secure transactions, reducing the risk of fraud and errors. Additionally, the integration of blockchain with IoT devices enables real-time tracking and monitoring of assets, ensuring data accuracy and integrity throughout the supply chain. Case studies and pilot projects within the industry demonstrate the practical benefits and challenges of implementing blockchain solutions. The findings suggest that blockchain technology can significantly improve trust and collaboration among supply chain participants, ultimately leading to more efficient and resilient operations. This study provides valuable insights for industry stakeholders considering the adoption of blockchain technology to address their supply chain management challenges.
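The immutability property that makes blockchain attractive for custody tracking comes from hash chaining: each block commits to the hash of its predecessor. A minimal stdlib sketch (hypothetical records, not any production ledger):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, record):
    """Link a new record to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})
    return chain

def chain_is_valid(chain):
    """Recomputing every link detects any tampered record."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
append_block(ledger, {"shipment": "crude-001", "barrels": 5000})
append_block(ledger, {"shipment": "crude-001", "custody": "refinery-A"})
print(chain_is_valid(ledger))            # the untouched chain verifies
ledger[0]["record"]["barrels"] = 4000    # tamper with an early record
print(chain_is_valid(ledger))            # verification now fails
```

Real deployments add consensus and smart-contract layers on top, but this linkage is the mechanism that makes after-the-fact edits to supply-chain records detectable by all participants.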

Keywords: blockchain technology, oil and gas supply chain, data security, transparency, smart contracts, IoT integration, real-time tracking, asset monitoring, fraud reduction, supply chain efficiency, data integrity, case studies, industry implementation, trust, collaboration

Procedia PDF Downloads 13
3566 AI-Driven Forecasting Models for Anticipating Oil Market Trends and Demand

Authors: Gaurav Kumar Sinha

Abstract:

The volatility of the oil market, influenced by geopolitical, economic, and environmental factors, presents significant challenges for stakeholders in predicting trends and demand. This article explores the application of artificial intelligence (AI) in developing robust forecasting models to anticipate changes in the oil market more accurately. We delve into various AI techniques, including machine learning, deep learning, and time series analysis, that have been adapted to analyze historical data and current market conditions to forecast future trends. The study evaluates the effectiveness of these models in capturing complex patterns and dependencies in market data, which traditional forecasting methods often miss. Additionally, the paper discusses the integration of external variables such as political events, economic policies, and technological advancements that influence oil prices and demand. By leveraging AI, stakeholders can achieve a more nuanced understanding of market dynamics, enabling better strategic planning and risk management. The article concludes with a discussion on the potential of AI-driven models in enhancing the predictive accuracy of oil market forecasts and their implications for global economic planning and strategic resource allocation.
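For readers unfamiliar with the classical baselines such AI models are compared against, here is a deliberately simple example of one: simple exponential smoothing, a standard time-series baseline (our illustrative choice; the abstract does not name a specific classical method). The smoothing constant and the prices are invented:

```python
# Toy baseline: simple exponential smoothing, where the next-period forecast
# is a geometrically weighted average of past observations. Alpha and the
# price series are hypothetical.
def ses_forecast(series, alpha=0.5):
    """Return the sequence of smoothed levels; the last entry is the
    one-step-ahead forecast for the next, unseen period."""
    level = series[0]
    levels = [level]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level  # update smoothed level
        levels.append(level)
    return levels

prices = [70.0, 72.0, 71.0, 75.0]   # invented oil prices
f = ses_forecast(prices, alpha=0.5)
next_period_forecast = f[-1]
```

AI models earn their keep precisely where such fixed-weight rules fail: nonlinear regime shifts driven by geopolitical or policy events.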

Keywords: AI forecasting, oil market trends, machine learning, deep learning, time series analysis, predictive analytics, economic factors, geopolitical influence, technological advancements, strategic planning

Procedia PDF Downloads 12
3565 A Case Study on Machine Learning-Based Project Performance Forecasting for an Urban Road Reconstruction Project

Authors: Soheila Sadeghi

Abstract:

In construction projects, predicting project performance metrics accurately is essential for effective management and successful delivery. However, conventional methods often depend on fixed baseline plans, disregarding the evolving nature of project progress and external influences. To address this issue, we introduce a distinct approach based on machine learning to forecast key performance indicators, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category within an urban road reconstruction project. Our proposed model leverages time series forecasting techniques, namely Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance by analyzing historical data and project progress. Additionally, the model incorporates external factors, including weather patterns and resource availability, as features to improve forecast accuracy. By harnessing the predictive capabilities of machine learning, our performance forecasting model enables project managers to proactively identify potential deviations from the baseline plan and take timely corrective measures. To validate the effectiveness of the proposed approach, we conduct a case study on an urban road reconstruction project, comparing the model's predictions with actual project performance data. The outcomes of this research contribute to the advancement of project management practices in the construction industry by providing a data-driven solution for enhancing project performance monitoring and control.
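The ARIMA family the model builds on reduces, in its simplest autoregressive form, to fitting a linear recurrence on past values. The sketch below is a hypothetical AR(1) baseline with invented numbers, not the authors' model:

```python
# Minimal sketch: fit y[t] = a*y[t-1] + b to a toy cost-variance series by
# ordinary least squares, then iterate the recurrence to forecast ahead.
def fit_ar1(series):
    x = series[:-1]
    y = series[1:]
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    a = cov / var
    b = my - a * mx
    return a, b

def forecast(series, a, b, steps):
    out, last = [], series[-1]
    for _ in range(steps):
        last = a * last + b       # iterate the fitted recurrence
        out.append(last)
    return out

cost_variance = [1.0, 2.0, 4.0, 8.0]   # hypothetical WBS-level series
a, b = fit_ar1(cost_variance)           # a ~= 2, b ~= 0 for this doubling series
preds = forecast(cost_variance, a, b, 2)
```

Full ARIMA adds differencing and moving-average terms, and LSTMs replace the linear recurrence with a learned nonlinear one; exogenous features such as weather enter as additional regressors.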

Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, schedule variance, earned value management

Procedia PDF Downloads 15
3564 Enhancing Project Performance Forecasting using Machine Learning Techniques

Authors: Soheila Sadeghi

Abstract:

Accurate forecasting of project performance metrics is crucial for successfully managing and delivering urban road reconstruction projects. Traditional methods often rely on static baseline plans and fail to consider the dynamic nature of project progress and external factors. This research proposes a machine learning-based approach to forecast project performance metrics, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category in an urban road reconstruction project. The proposed model utilizes time series forecasting techniques, including Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance based on historical data and project progress. The model also incorporates external factors, such as weather patterns and resource availability, as features to enhance the accuracy of forecasts. By applying the predictive power of machine learning, the performance forecasting model enables proactive identification of potential deviations from the baseline plan, which allows project managers to take timely corrective actions. The research aims to validate the effectiveness of the proposed approach using a case study of an urban road reconstruction project, comparing the model's forecasts with actual project performance data. The findings of this research contribute to the advancement of project management practices in the construction industry, offering a data-driven solution for improving project performance monitoring and control.

Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, earned value management

Procedia PDF Downloads 17
3563 Generating High-Frequency Risk Factor Collections with Transformer

Authors: Wenyan Xu, Rundong Wang, Chen Li, Yonghong Hu, Zhonghua Lu

Abstract:

In the field of quantitative trading, it is common to find patterns in short-term volatile trends of the market. These patterns are known as High-Frequency (HF) risk factors, serving as effective indicators of future stock price volatility. In the past, however, these risk factors were usually generated by traditional financial models, and their validity rested heavily on manually added domain-specific knowledge rather than on extensive market data. Inspired by symbolic regression (SR), the task of inferring mathematical laws from existing data, we treat the extraction of formulaic risk factors from high-frequency trading (HFT) market data as an SR task. In this paper, we challenge the procedure of manually constructing risk factors and propose an end-to-end methodology, the Intraday Risk Factor Transformer (IRFT), to directly predict full formulaic factors, constants included. Specifically, we utilize a hybrid symbolic-numeric vocabulary where symbolic tokens denote operators/stock features and numeric tokens denote constants. We then train a Transformer model on the HFT dataset to directly generate complete formulaic HF risk factors without relying on a skeleton, i.e., a parametric function built from a pre-defined list of operators, typically the math operations (+, ×, /) and functions (√x, log x, cos x). A skeleton determines the general shape of the stock volatility law up to a choice of constants, e.g., f(x) = tan(ax + b), where x is the stock price. We further refine the predicted constants (a, b) using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, treating them as informed initial guesses to mitigate non-linear issues. Compared to the 10 approaches in SRBench, a living benchmark for SR, IRFT gains a 30% excess investment return on the HS300 and S&P 500 datasets, with inference times orders of magnitude faster in HF risk factor mining tasks.
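The hybrid symbolic-numeric vocabulary can be illustrated with a toy encoder. The token names and the constant-encoding scheme below are our assumptions for illustration; the paper's actual vocabulary may differ:

```python
# Hypothetical sketch of a hybrid symbolic-numeric vocabulary: operators and
# stock features become symbolic tokens, while constants are split into
# sign / mantissa / exponent numeric tokens. The scheme is illustrative.
def encode_constant(c):
    """Encode a float as numeric tokens (sign, 4-significant-digit mantissa,
    base-10 exponent)."""
    sign = "+" if c >= 0 else "-"
    c = abs(c)
    exp = 0
    while c >= 10:          # normalize mantissa into [1, 10)
        c /= 10
        exp += 1
    while 0 < c < 1:
        c *= 10
        exp -= 1
    mantissa = int(round(c * 1000))
    return [f"N_SIGN{sign}", f"N_MANT{mantissa}", f"N_EXP{exp}"]

def encode_formula(prefix_tokens):
    """Map a prefix-notation formula to model tokens."""
    out = []
    for tok in prefix_tokens:
        if isinstance(tok, (int, float)):
            out.extend(encode_constant(tok))
        else:
            out.append(f"SYM_{tok}")   # operator or stock-feature token
    return out

# f(x) = tan(1.5*price - 0.2) in prefix notation:
tokens = encode_formula(["tan", "add", "mul", 1.5, "price", -0.2])
```

A skeleton-based method would emit only the SYM_ tokens with constant placeholders; emitting the numeric tokens too is what lets the model predict "full formulaic factors, constants included", before BFGS refinement.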

Keywords: transformer, factor-mining language model, high-frequency risk factor collections

Procedia PDF Downloads 3
3562 Enhancing Hyperledger Fabric: A Scalable Framework for Optimized Blockchain Performance

Authors: Ankan Saha, Sourav Majumder, Md. Motaleb Hossen Manik, M. M. A. Hashem

Abstract:

Hyperledger Fabric (HF), one of the private blockchain architectures, has gained popularity for enterprise use cases such as supply chain management, finance, and healthcare, owing to features that users demand, including privacy, scalability, throughput, and a modular architecture. However, enhancing performance remains a crucial focus in the ever-changing field of blockchain technology, particularly for private blockchains like HF. This paper addresses the inherent difficulties related to scalability, throughput, and efficiency in handling large transaction volumes. Our framework establishes a solid network architecture with two organizations, each having two types of peers (i.e., endorsing and anchor peers), and three Raft orderers. It brings innovation to the chaincode, addresses functionalities like registration and transaction management via CouchDB, and integrates transaction management and block retrieval. Additionally, it includes a distributed consensus mechanism to obtain maximum performance in a large architecture. The findings show a clear enhancement in scalability, transaction speed, and system responsiveness, highlighting the effectiveness of our framework in optimizing the HF architecture.

Keywords: hyperledger fabric, private blockchain, scalability, transaction throughput, latency, consensus mechanism

Procedia PDF Downloads 10
3561 The Role of Cryptocurrency in Facilitating Cross-Border Payments: Case Study Bangladesh

Authors: Mohammad Abdul Matin

Abstract:

The use of cryptocurrency in cross-border transactions has gained significant attention due to its potential to increase efficiency and reduce costs. This paper aims to explore the role of cryptocurrency in facilitating cross-border payments, with a focus on the case of Bangladesh, where millions of Bangladeshi nationals reside abroad. The research will investigate the current cross-border payment landscape in Bangladesh and analyze the potential benefits and challenges of using cryptocurrency for remittances. Furthermore, the study will assess the regulatory environment and the adoption of cryptocurrency in Bangladesh, considering its impact on the broader financial system and economy.

Keywords: cross-border payments, cryptocurrency, regulation, benefits

Procedia PDF Downloads 12
3560 Identification of Rice Quality Using Gas Sensors and Neural Networks

Authors: Moh Hanif Mubarok, Muhammad Rivai

Abstract:

Public demand for high-quality rice is very high, so minimum standards are needed for checking rice quality. Most rice quality measurements still use manual methods, which are prone to error due to the limits of human vision and the subjectivity of testers. A gas detection system can therefore be a highly effective and objective solution to these problems. The use of gas sensors in testing rice quality must take several parameters into account; the parameters measured in this research are the percentage of rice water content, gas concentration, output voltage, and measurement time. This research was therefore carried out to identify carbon dioxide (CO₂), nitrous oxide (N₂O), and methane (CH₄) gases for assessing rice quality using an array of gas sensors and a neural network method.
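As a toy illustration of the classification step (not the authors' network), a single-neuron model trained on invented, normalized sensor readings already separates two quality classes:

```python
# Illustrative only: a single-neuron logistic model trained on made-up
# normalized gas-sensor readings (CO2, N2O, CH4 channels) to separate
# fresh (0) from spoiled (1) rice. Real systems use calibrated sensors
# and larger networks.
import math

def train(samples, labels, lr=0.5, epochs=200):
    w = [0.0, 0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid activation
            g = p - y                            # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0                     # 1 = spoiled, 0 = fresh

# Hypothetical normalized sensor voltages: spoiled grain emits more gas.
X = [[0.1, 0.2, 0.1], [0.2, 0.1, 0.2], [0.8, 0.7, 0.9], [0.9, 0.8, 0.7]]
y = [0, 0, 1, 1]
w, b = train(X, y)
```

A multi-layer network generalizes this by stacking such units, letting it learn nonlinear combinations of the gas channels.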

Keywords: carbon dioxide, nitrous oxide, methane, semiconductor gas sensor, neural network

Procedia PDF Downloads 15
3559 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier

Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh

Abstract:

This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
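The textual-feature extraction step can be illustrated with a toy example. The finding terms, the negation rule, and the report text below are hypothetical simplifications of what a real radiology-NLP pipeline would use:

```python
# Toy sketch of NLP preprocessing: turn free-text radiology report sentences
# into binary indicator features that can be merged with image-derived
# features before classification. Term lists are hypothetical.
import re

FINDING_TERMS = {
    "opacity": ["opacity", "opacities", "consolidation"],
    "effusion": ["effusion"],
    "cardiomegaly": ["cardiomegaly", "enlarged heart"],
}

def extract_features(report):
    text = report.lower()
    feats = {}
    for name, terms in FINDING_TERMS.items():
        present = any(t in text for t in terms)
        # Crude negation check: "no ... <term>" within the same sentence.
        negated = any(re.search(r"\bno\b[^.]*" + re.escape(t), text)
                      for t in terms)
        feats[name] = 1 if (present and not negated) else 0
    return feats

report = "Mild cardiomegaly. No pleural effusion. Patchy opacities in the left base."
features = extract_features(report)
```

In a full pipeline these indicators would be concatenated with image features and passed to the Random Forest.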

Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems

Procedia PDF Downloads 15
3558 Roles of Tester in Automated World

Authors: Sagar Mahendrakar

Abstract:

Testers' roles have changed dramatically as automation continues to revolutionise the software development lifecycle. There's a general belief that manual testing is becoming outdated with the introduction of advanced testing frameworks and tools. This abstract, however, disproves that notion by examining the complex and dynamic role that testers play in automated environments. In this work, we explore the complex duties that testers have when everything is automated. We contend that although automation increases productivity and simplifies monotonous tasks, it cannot completely replace the cognitive abilities and subject-matter knowledge of human testers. Rather, testers shift their focus to higher-value tasks like creating test strategies, designing test cases, and delving into intricate scenarios that are difficult to automate. We also emphasise the critical role that testers play in guaranteeing the precision, thoroughness, and dependability of automated testing. Testers verify the efficacy of automated scripts and pinpoint areas for improvement through rigorous test planning, execution, and result analysis. They play the role of quality defenders, using their analytical and problem-solving abilities to find minute flaws that computerised tests might miss. Furthermore, the abstract emphasises how testing in automated environments is a collaborative process. In order to match testing efforts with business objectives, improve test automation frameworks, and rank testing tasks according to risk, testers work closely with developers, automation engineers, and other stakeholders. Finally, we discuss how testers in the era of automation need to possess a growing skill set. To stay current, testers need to develop skills in scripting languages, test automation tools, and emerging technologies in addition to traditional testing competencies. 
Soft skills like teamwork, communication, and flexibility are also essential for productive cooperation in cross-functional teams. This abstract clarifies the ongoing importance of testers in automated settings. Testers can use automation to improve software quality and provide outstanding user experiences by accepting their changing role as strategic partners and advocates for quality.

Keywords: testing, QA, automation, leadership

Procedia PDF Downloads 16
3557 Times2D: A Time-Frequency Method for Time Series Forecasting

Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan

Abstract:

Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduces significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive Integrated Moving Average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with the more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) networks and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often suffer from locality issues, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle to capture relationships between distant time points due to the locality of one-dimensional convolution kernels.
Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Finally, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success; despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that integrates, in parallel, 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021), under the same modeling conditions. The initial results demonstrate that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.
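The two 2D views can be illustrated with toy pure-Python versions (our reading of the description; the authors' implementation details may differ):

```python
# Rough sketch of the two 2D views described: a frequency-oriented view that
# folds the series by a period into a 2D grid, and a time-domain "derivative"
# view that stacks first and second differences to expose turning points.
def fold_by_period(series, period):
    """Reshape a 1D series into rows of length `period` (a 2D grid where
    columns align phases of the cycle)."""
    rows = len(series) // period
    return [series[r * period:(r + 1) * period] for r in range(rows)]

def derivative_heatmap(series):
    """Stack first and second differences; large |d2| marks turning points."""
    d1 = [b - a for a, b in zip(series, series[1:])]
    d2 = [b - a for a, b in zip(d1, d1[1:])]
    return [d1[:len(d2)], d2]

load = [1, 3, 1, 3, 1, 3, 1, 3]        # toy periodic signal, period 2
grid = fold_by_period(load, 2)          # every row identical: pure periodicity
d1, d2 = derivative_heatmap(load)
```

Once the series is in 2D form, standard convolutional vision backbones can be applied to both grids, which is the key enabler the abstract describes.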

Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation

Procedia PDF Downloads 21
3556 Facial Recognition of University Entrance Exam Candidates using FaceMatch Software in Iran

Authors: Mahshid Arabi

Abstract:

In recent years, remarkable advancements in the fields of artificial intelligence and machine learning have led to the development of facial recognition technologies. These technologies are now employed in a wide range of applications, including security, surveillance, healthcare, and education. In the field of education, the identification of university entrance exam candidates has been one of the fundamental challenges. Traditional methods such as using ID cards and handwritten signatures are not only inefficient and prone to fraud but also susceptible to errors. In this context, utilizing advanced technologies like facial recognition can be an effective and efficient solution to increase the accuracy and reliability of identity verification in entrance exams. This article examines the use of FaceMatch software for recognizing the faces of university entrance exam candidates in Iran. The main objective of this research is to evaluate the efficiency and accuracy of FaceMatch software in identifying university entrance exam candidates to prevent fraud and ensure the authenticity of individuals' identities. Additionally, this research investigates the advantages and challenges of using this technology in Iran's educational systems. This research was conducted using an experimental method and random sampling. In this study, 1000 university entrance exam candidates in Iran were selected as samples. The facial images of these candidates were processed and analyzed using FaceMatch software. The software's accuracy and efficiency were evaluated using various metrics, including accuracy rate, error rate, and processing time. The research results indicated that FaceMatch software could accurately identify candidates with a precision of 98.5%. The software's error rate was less than 1.5%, demonstrating its high efficiency in facial recognition. 
Additionally, the average processing time for each candidate's image was less than 2 seconds, further indicating the software's efficiency. Statistical evaluation of the results using appropriate tests, including analysis of variance (ANOVA) and the t-test, showed that the observed differences were significant and that the software's accuracy in identity verification is high. The findings of this research suggest that FaceMatch software can be effectively used as a tool for identifying university entrance exam candidates in Iran. This technology not only enhances security and prevents fraud but also simplifies and streamlines the exam administration process. However, challenges such as preserving candidates' privacy and the costs of implementation must also be considered. The use of facial recognition technology with FaceMatch software in Iran's educational systems can be an effective solution for preventing fraud and ensuring the authenticity of university entrance exam candidates' identities. Given the promising results of this research, it is recommended that this technology be implemented and utilized more widely in the country's educational systems.
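The reported metrics follow directly from per-candidate outcomes; the sketch below (with invented data, not the study's records) shows how accuracy rate, error rate, and mean processing time are computed:

```python
# Illustrative metric computation for an identification experiment.
# The sample outcomes below are invented, not the study's data.
def evaluate(results):
    """results: list of (correctly_identified: bool, seconds: float)."""
    n = len(results)
    correct = sum(1 for ok, _ in results if ok)
    accuracy = correct / n
    error_rate = 1.0 - accuracy
    mean_time = sum(t for _, t in results) / n
    return accuracy, error_rate, mean_time

sample = [(True, 1.2), (True, 1.5), (True, 1.1), (False, 1.8)]
acc, err, mean_t = evaluate(sample)
```

With 1000 candidates, the study's figures (98.5% accuracy, under 1.5% error, under 2 s mean time) would come out of exactly this kind of tally.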

Keywords: facial recognition, FaceMatch software, Iran, university entrance exam

Procedia PDF Downloads 20
3555 Exploring Cybersecurity and Phishing Attacks within Healthcare Institutions in Saudi Arabia: A Narrative Review

Authors: Ebtesam Shadadi, Rasha Ibrahim, Essam Ghadafi

Abstract:

Phishing poses a significant cybercrime threat by tricking end users into revealing their confidential and sensitive information, with attackers often manipulating victims to achieve their malicious goals. The increasing prevalence of phishing has led to extensive research on the issue, including studies focusing on phishing attempts against healthcare institutions in the Kingdom of Saudi Arabia. This paper explores the importance of analyzing phishing attacks, specifically those targeting the healthcare industry. The study delves into the tactics, obstacles, and remedies associated with these attacks, all while considering the implications for Saudi Vision 2030.

Keywords: phishing, cybersecurity, cyber threat, social engineering, vision 2030

Procedia PDF Downloads 14
3554 Instance Selection for MI-Support Vector Machines

Authors: Amy M. Kwon

Abstract:

The support vector machine (SVM) is a well-known algorithm in machine learning due to its superior performance, and it also functions well in multiple-instance (MI) problems. Our study proposes a schematic algorithm to select instances based on the Hausdorff distance, which can be fed to SVMs as input vectors under the MI setting. In experiments on five benchmark datasets, our representation-adaptation strategy outperformed the original approach. In addition, task execution times (TETs) were reduced by more than 80% relative to MissSVM. Hence, this representation adaptation is worth considering for SVMs under the MI setting.
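The bag-level Hausdorff distance underlying the selection scheme can be sketched directly from its definition (the classic max-min variant; the study's exact variant may differ):

```python
# Minimal sketch of the classic (max-min) Hausdorff distance between two
# bags of instances, as used here to compare bags before instance selection.
def euclid(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def hausdorff(bag_a, bag_b):
    """Largest distance from any instance in one bag to its nearest
    neighbor in the other bag, taken symmetrically."""
    d_ab = max(min(euclid(a, b) for b in bag_b) for a in bag_a)
    d_ba = max(min(euclid(b, a) for a in bag_a) for b in bag_b)
    return max(d_ab, d_ba)

bag1 = [(0.0, 0.0), (1.0, 0.0)]
bag2 = [(0.0, 1.0), (1.0, 1.0)]
d = hausdorff(bag1, bag2)   # every point is exactly 1.0 from the other bag
```

Instances chosen by such bag-to-bag distances can then be passed to a standard single-instance SVM, which is what makes the MI problem tractable with off-the-shelf solvers.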

Keywords: support vector machine, margin, Hausdorff distance, representation selection, multiple-instance learning, machine learning

Procedia PDF Downloads 13
3553 Development of an Automatic Monitoring System Based on the Open Architecture Concept

Authors: Andrii Biloshchytskyi, Serik Omirbayev, Alexandr Neftissov, Sapar Toxanov, Svitlana Biloshchytska, Adil Faizullin

Abstract:

Kazakhstan has adopted a carbon neutrality strategy running until 2060. In accordance with this strategy, various tools must be introduced to maintain environmental safety. The use of IoT, in combination with the characteristics and requirements of Kazakhstan's environmental legislation, makes it possible to develop a modern environmental monitoring system. This article proposes a solution for developing an example of an automated system, based on an open architecture, for the continuous collection of data on the concentration of pollutants in the atmosphere. An Arduino-based device acts as the microcontroller, and the measured values are transmitted via an open wireless communication protocol. The architecture of the system, which was used to build a prototype based on sensors, an Arduino microcontroller, and a wireless data transmission module, is presented. The selection of components may change depending on the requirements of the system; the introduction of new units is limited only by the number of ports. The openness of the solution allows the configuration to be changed depending on the conditions. Its advantages are openness, low cost, versatility, and mobility. However, the proposed solution has not yet been compared in operation with traditional ones.
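The prototype's data path can be sketched in Python (field names, the smoothing window, and the values are illustrative assumptions; the actual device firmware targets an Arduino):

```python
# Hypothetical sketch of the data path such a prototype implies: sample a
# pollutant sensor, smooth the raw reading, and package it for the wireless
# link. All names and thresholds are illustrative only.
import json
import time

def moving_average(readings, window=3):
    """Smooth noisy sensor readings with a trailing moving average."""
    if len(readings) < window:
        return sum(readings) / len(readings)
    return sum(readings[-window:]) / window

def make_packet(station_id, pollutant, readings):
    return json.dumps({
        "station": station_id,
        "pollutant": pollutant,
        "value": round(moving_average(readings), 2),
        "ts": int(time.time()),
    }, sort_keys=True)

raw_co2_ppm = [412.0, 415.0, 413.0, 500.0]   # last sample is a noise spike
packet = make_packet("st-01", "CO2", raw_co2_ppm)
```

The open-protocol transmission the article describes would carry a compact payload of this sort, and swapping a sensor only changes the sampling side, which is the practical meaning of the open architecture.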

Keywords: environmental monitoring, greenhouse gas emissions, environmental pollution, Industry 4.0, IoT, microcontroller, automated monitoring system

Procedia PDF Downloads 17
3552 Modeling Pronunciations of Arab Broca’s Aphasics Using Mosstalk Words Technique

Authors: Sadeq Al Yaari, Fayza Alhammadi, Ayman Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Adham Al Yaari, Sajedah Al Yaari, Saleh Al Yami

Abstract:

Background: There has been a debate in the literature over the years as to whether or not the MossTalk Words program fits Arab Broca's aphasics (BAs), owing to language differences and to the fact that the technique has not yet been used for aphasics with semantic dementia (SD aphasics). Aims: Simplifying this debate slightly for purposes of exposition, the purpose of the present study is to investigate the usability of this program, together with pictures and community support, as therapeutic techniques for both Arab BAs and SD aphasics. Method: The subjects of this study are two Saudi aphasics (53 and 57 years old, respectively). The former suffers from Broca's aphasia due to a stroke, while the latter suffers from semantic dementia. Both aphasics speak English and have used the MossTalk Words program, in addition to intensive picture-naming therapeutic sessions, for two years. They were tested by one of the researchers four times (once per six months). The families of the two subjects, as well as their relatives and friends, played a major part in all therapeutic sessions. Conclusion: Results show that, on average across the entire set of therapeutic sessions, the MossTalk Words program was clearly more effective in modeling the BAs' pronunciation than the SD aphasic's. Furthermore, intensive picture-naming exercises, together with the positive role of community members, contributed substantially to the progress of the two subjects' performance.

Keywords: moss talk words, program, technique, Broca’s aphasia, semantic dementia, subjects, picture, community

Procedia PDF Downloads 19
3551 The Impact of Bitcoin and Cryptocurrency on the Development of Community

Authors: Felib Ayman Shawky Salem

Abstract:

Nowadays, cryptocurrency has become a global phenomenon known to most people, who use this alternative digital money for transactions in many ways (e.g., online shopping, wealth management, and fundraising). However, this digital asset is also widely used in criminal activities, since its decentralized control, as opposed to centralized electronic money and central banking systems, can make a user of the currency effectively invisible. The high-value exchange of these digital currencies has also been a target of criminal activity. Cryptocurrency crimes have become a challenge for law enforcement, which must analyze and prove the evidence recovered from criminal devices. In this paper, our focus is on the Bitcoin cryptocurrency and the possible artifacts that can be obtained from different types of digital wallets, namely software- and browser-based applications. Process memory and the physical hard disk are examined with the aim of identifying and recovering potential digital evidence. Data acquisition covers three states: the initial creation of the wallet, transactions consisting of transferring and receiving coins, and the state after the wallet has been deleted. Findings from this study suggest that process memory for both software and browser wallets is a valuable source of evidence, and many of the artifacts found in process memory are also available from the application and wallet files on the client computer's storage.
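One concrete step in such an examination, scanning a raw memory dump for address-shaped strings, can be sketched as follows (the pattern covers only legacy Base58 addresses and is illustrative; real forensic tools also validate the address checksum):

```python
# Illustrative memory-forensics step: scan a raw process-memory dump for
# strings shaped like legacy Bitcoin addresses. The regex will also match
# lookalike strings; production tooling verifies the Base58Check checksum.
import re

# Legacy P2PKH/P2SH addresses: start with 1 or 3, Base58 alphabet
# (no 0, O, I, or l), 26-35 characters total.
ADDR_RE = re.compile(rb"[13][a-km-zA-HJ-NP-Z1-9]{25,34}")

def find_candidate_addresses(dump):
    return sorted({m.group().decode() for m in ADDR_RE.finditer(dump)})

# Simulated dump fragment; the embedded address is a well-known example.
dump = (b"\x00\x10wallet.dat\x001BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2\x00"
        b"padding\x00not-an-address\x00")
candidates = find_candidate_addresses(dump)
```

The same scan applied to acquired wallet files or unallocated disk space yields the candidate artifacts that the three acquisition states are then compared on.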

Keywords: cryptocurrency, bitcoin, blockchain, money laundering, digital wallet, digital forensics

Procedia PDF Downloads 12