Search results for: information systems success model
31410 Enhancing Robustness in Federated Learning through Decentralized Oracle Consensus and Adaptive Evaluation
Authors: Peiming Li
Abstract:
This paper presents an innovative blockchain-based approach to enhance the reliability and efficiency of federated learning systems. By integrating a decentralized oracle consensus mechanism into the federated learning framework, we address key challenges of data and model integrity. Our approach utilizes a network of redundant oracles, functioning as independent validators within an epoch-based training system in the federated learning model. In federated learning, data is decentralized, residing on various participants' devices. This scenario often leads to concerns about data integrity and model quality. Our solution employs blockchain technology to establish a transparent and tamper-proof environment, ensuring secure data sharing and aggregation. The decentralized oracles, a concept borrowed from blockchain systems, act as unbiased validators. They assess the contributions of each participant using a Hidden Markov Model (HMM), which is crucial for evaluating the consistency of participant inputs and safeguarding against model poisoning and malicious activities. Our methodology's distinct feature is its epoch-based training. An epoch here refers to a specific training phase where data is updated and assessed for quality and relevance. The redundant oracles work in concert to validate data updates during these epochs, enhancing the system's resilience to security threats and data corruption. The effectiveness of this system was tested using the MNIST dataset, a standard machine learning benchmark. Results demonstrate that our blockchain-oriented federated learning approach significantly boosts system resilience, addressing the common challenges of federated environments. This paper aims to make these advanced concepts accessible, even to those with a limited background in blockchain or federated learning.
We provide a foundational understanding of how blockchain technology can revolutionize data integrity in decentralized systems and explain the role of oracles in maintaining model accuracy and reliability.
Keywords: federated learning system, blockchain, decentralized oracles, Hidden Markov Model
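The abstract does not publish the oracle validation algorithm itself, so the sketch below is an illustrative stand-in: each redundant oracle flags client updates that deviate too far from the coordinate-wise median, and an update survives an epoch only if a majority of oracles accept it. The distance rule and threshold are assumptions made for exposition, not the paper's HMM-based scoring.

```python
# Hypothetical sketch of redundant-oracle validation in one federated epoch.
# The deviation-from-median rule and the threshold are assumed for
# illustration; the paper's actual validators use an HMM.

def oracle_flags(updates, threshold):
    """One oracle flags client updates that sit too far from the
    coordinate-wise median of all submitted updates."""
    dim = len(updates[0])
    medians = [sorted(u[i] for u in updates)[len(updates) // 2]
               for i in range(dim)]
    return [sum(abs(a - b) for a, b in zip(u, medians)) > threshold
            for u in updates]

def consensus_aggregate(updates, oracles):
    """Keep an update only if a majority of redundant oracles accept it,
    then average the surviving updates (one epoch's aggregation)."""
    votes = [oracle(updates) for oracle in oracles]  # one flag list per oracle
    kept = [u for i, u in enumerate(updates)
            if sum(v[i] for v in votes) <= len(oracles) // 2]
    dim = len(updates[0])
    return [sum(u[i] for u in kept) / len(kept) for i in range(dim)]
```

With three honest updates near [1, 1] and one poisoned update [100, -100], the poisoned contribution is rejected by all oracles and the aggregate stays near [1, 1].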
Procedia PDF Downloads 63
31409 Random Access in IoT Using Naïve Bayes Classification
Authors: Alhusein Almahjoub, Dongyu Qiu
Abstract:
This paper deals with the random access procedure in next-generation networks and presents a solution to reduce total service time (TST), one of the most important performance metrics in current and future Internet of Things (IoT) based networks. The proposed solution focuses on calculating the optimal transmission probability, which maximizes the success probability and reduces TST. It uses the number of idle preambles observed in each time slot to estimate the number of backlogged IoT devices via Naïve Bayes estimation, a type of supervised learning in the machine learning domain. Estimating the number of backlogged devices is necessary because the optimal transmission probability depends on it and the eNodeB has no direct knowledge of it. Simulations carried out in MATLAB verify that the proposed solution gives excellent performance.
Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation
Procedia PDF Downloads 145
31408 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis
Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera
Abstract:
Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry provides detailed information in areas where conventional sounding data are lacking and conventional surveys are inaccessible. Two empirical approaches, a log-linear bathymetric inversion model and a non-linear bathymetric inversion model, are applied to derive bathymetry from high-resolution multispectral satellite imagery. This study compares the two approaches by means of geographical error analysis for the Kankesanturai site using WorldView-2 satellite imagery. The parameters of the non-linear inversion model were calibrated using the Levenberg-Marquardt method, and multiple linear regression was applied to calibrate the log-linear inversion model. Single Beam Echo Sounding (SBES) data from the study area were used as reference points to calibrate both models. Residuals were calculated as the difference between the derived depth values and the validation echo-sounder bathymetry data, and the geographical distribution of model residuals was mapped. Spatial autocorrelation was calculated to compare the performance of the bathymetric models, and the results show the geographic errors for both models. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation. This spatial error model generates more reliable estimates of bathymetry by quantifying the autocorrelation of model error and incorporating it into an improved regression model. The log-linear model (R²=0.846) performs better than the non-linear model (R²=0.692). Finally, the spatial error models improved the bathymetric estimates derived from the linear and non-linear models to R²=0.854 and R²=0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges.
The magnitude of the prediction error increases with depth for both the log-linear and the non-linear inversion models. The overall RMSE for the log-linear and non-linear inversion models was ±1.532 m and ±2.089 m, respectively.
Keywords: log-linear model, multispectral, residuals, spatial error model
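The calibration step described above can be sketched with ordinary least squares against echo-sounding reference points. The paper regresses depth on several WorldView-2 band terms; the simplified version below uses a single log band ratio (a Stumpf-style predictor) and invented reflectance values, purely to show the calibrate-then-score-RMSE workflow.

```python
import math

# Simplified one-predictor sketch of calibrating a log-linear inversion
# model against SBES reference depths. The single log band ratio and the
# synthetic reflectance values are assumptions for illustration; the paper
# uses multiple WorldView-2 bands.

def calibrate(reflect_pairs, depths):
    """Ordinary least squares for z = m1 * x + m0,
    with x = ln(R_blue) / ln(R_green) (scaled radiances > 1)."""
    xs = [math.log(rb) / math.log(rg) for rb, rg in reflect_pairs]
    n = len(xs)
    mx = sum(xs) / n
    mz = sum(depths) / n
    m1 = (sum((x - mx) * (z - mz) for x, z in zip(xs, depths))
          / sum((x - mx) ** 2 for x in xs))
    m0 = mz - m1 * mx
    return m1, m0

def rmse(model, reflect_pairs, depths):
    """Root mean square error of the calibrated model at reference points."""
    m1, m0 = model
    xs = [math.log(rb) / math.log(rg) for rb, rg in reflect_pairs]
    errs = [m1 * x + m0 - z for x, z in zip(xs, depths)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))
```

On synthetic points generated exactly from z = 2x + 1, the fit recovers the coefficients and the RMSE is essentially zero.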
Procedia PDF Downloads 297
31407 Interrogation of the Role of First Year Student Experiences in Student Success at a University of Technology in South Africa
Authors: Livingstone Makondo
Abstract:
This ongoing research explores what the components of a comprehensive First-Year Student Experience (FYSE) at the Durban University of Technology (DUT) could be, and the preferred implementation modalities. In light of the Siyaphumelela project, this interrogation is premised on the need to glean institutional data that could be used to ascertain the role of FYSE in enhancing student success. The research proceeds by examining prevalent models from other South African universities and beyond in its quest to arrive at a pragmatic, comprehensive FYSE programme for DUT. As DUT is a student-centred institution and, amid an ever-shrinking economy, this research would help higher education practitioners ascertain whether hard-earned finances are being channelled into a worthy academic venture. This research seeks inputs from a) students who participated in FYSE and are now in second and third year at DUT, b) students currently participating in FYSE, c) former and present tutors, d) departmental coordinators, and e) academics and support staff working with the participating students. This exploratory approach is preferred because, since 2010, DUT has grappled with how to implement an integrated institution-wide FYSE. The findings of this research could provide the much-needed data to ascertain whether the current FYSE package is pivotal to the attainment of DUT Strategic Focus Area 1: Building sustainable student communities of living and learning. The ideal is to have the DUT FYSE programme become an institution-wide programme that lays the foundation for consolidated and focused student development programmes at subsequent undergraduate and postgraduate levels of study. Armed with data from this research, DUT could also develop the capacity and systems to ensure that all students get diverse, on-time support to enhance their retention and academic success in their tertiary studies.
In essence, the preferred FYSE curriculum, woven around DUT graduate attributes, should contribute to a reduction in first-year students’ dropout rates and, subsequently, in undergraduate studies. This ongoing research will therefore feed into the Siyaphumelela project and help position the 2018-2020 FYSE initiatives at DUT.
Keywords: challenges, comprehensive, dropout, transition
Procedia PDF Downloads 161
31406 A Comparative Analysis of Classification Models with Wrapper-Based Feature Selection for Predicting Student Academic Performance
Authors: Abdullah Al Farwan, Ya Zhang
Abstract:
In today’s educational arena, it is critical to understand educational data and be able to evaluate important aspects, particularly data on student achievement. Educational Data Mining (EDM) is a research area that focuses on uncovering patterns and information in data from educational institutions. If teachers are able to predict their students' class performance, they can use this information to improve their teaching. EDM has evolved into valuable knowledge that can be used for a wide range of objectives; for example, it can inform a strategic plan for generating high-quality education. Based on historical data, this paper recommends employing data mining techniques to forecast students' final grades. In this study, five data mining methods, Decision Tree, JRip, Naive Bayes, Multi-layer Perceptron, and Random Forest, with wrapper-based feature selection, were applied to two datasets relating to Portuguese language and mathematics classes. The results showed the effectiveness of using data mining methodologies in predicting student academic success. The classification accuracy achieved with the selected algorithms lies in the range of 80-94%. Among the selected classification algorithms, the lowest accuracy is achieved by the Multi-layer Perceptron algorithm, close to 70.45%, and the highest accuracy is achieved by the Random Forest algorithm, close to 94.10%. This work can assist educational administrators in identifying poorly performing students at an early stage and perhaps implementing motivational interventions to improve their academic success and prevent educational dropout.
Keywords: classification algorithms, decision tree, feature selection, multi-layer perceptron, Naïve Bayes, random forest, students’ academic performance
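Wrapper-based feature selection treats the classifier as a black box and searches feature subsets by re-evaluating the model on each candidate. A minimal greedy forward-selection sketch is below; the `evaluate` callable (cross-validated accuracy of whichever wrapped classifier) is abstracted away, since the paper's five classifiers are not reproduced here.

```python
# Sketch of wrapper-based (greedy forward) feature selection. evaluate(subset)
# abstracts the wrapped classifier's cross-validated accuracy; the paper wraps
# Decision Tree, JRip, Naive Bayes, MLP and Random Forest.

def forward_select(all_features, evaluate):
    """Repeatedly add the single feature that most improves the wrapped
    model's score; stop as soon as no addition improves it."""
    selected, best = [], evaluate([])
    while True:
        gains = [(evaluate(selected + [f]), f)
                 for f in all_features if f not in selected]
        if not gains:
            break
        score, feat = max(gains)
        if score <= best:            # no candidate improves the score
            break
        selected.append(feat)
        best = score
    return selected, best
```

With a toy scoring function where features "a" and "b" help and "c" hurts, the wrapper selects exactly ["a", "b"].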
Procedia PDF Downloads 166
31405 Further Investigation of α+12C and α+16O Elastic Scattering
Authors: Sh. Hamada
Abstract:
The current work aims to study the rainbow-like structure observed in the elastic scattering of alpha particles on both 12C and 16O nuclei. We reanalyzed the experimental elastic scattering angular distribution data for the α+12C and α+16O nuclear systems at different energies using both the optical model and double folding potentials built from different interaction models: CDM3Y1, DDM3Y1, CDM3Y6 and BDM3Y1. The potential created by the BDM3Y1 interaction model has the shallowest depth, which reflects the necessity of using a higher renormalization factor (Nr). Both the optical model and the double folding potentials of the different interaction models fairly reproduce the experimental data.
Keywords: density distribution, double folding, elastic scattering, nuclear rainbow, optical model
Procedia PDF Downloads 237
31404 Artificial Neural Networks and Geographic Information Systems for Coastal Erosion Prediction
Authors: Angeliki Peponi, Paulo Morgado, Jorge Trindade
Abstract:
Artificial Neural Networks (ANNs) and Geographic Information Systems (GIS) are applied as a robust tool for modeling and forecasting erosion changes in Costa Caparica, Lisbon, Portugal, for 2021. ANNs present noteworthy advantages compared with other methods used for prediction and decision making in urban coastal areas. A multilayer perceptron ANN was used. Sensitivity analysis was conducted on the natural and social forces and dynamic relations in the dune-beach system of the study area. Variations in the network’s parameters were performed in order to select the optimum topology of the network. The developed methodology appears fitted to reality; however, further steps would make it better suited.
Keywords: artificial neural networks, backpropagation, coastal urban zones, erosion prediction
Procedia PDF Downloads 392
31403 Assessment Methodology of E-government Projects for the Regions of Georgia
Authors: Tina Melkoshvili
Abstract:
The drastic development of information and communication technologies in Georgia has led to the necessity of launching a conceptually new, effective, flexible, transparent and society-oriented form of government: e-government. By applying information technologies, an electronic system makes it possible to raise the efficacy of state governance and increase citizens’ participation in the process. Focusing on e-government allows us to analyze success stories and attributed benefits and, at the same time, observe challenges hampering the development process. A number of methodologies have been elaborated to study conditions in the field of electronic governance. They enable us to find out whether a government is ready to apply the broad opportunities of information and communication technologies and whether it is apt to improve the accessibility and quality of delivering, mainly, social services. This article provides a comparative analysis of widely used methodologies for assessing e-government projects. It is concluded that applying current assessment methods in Georgia is difficult due to inaccessible data and the necessity of involving a number of experts. The article presents new indicators for assessing e-government development that reflect the efficacy of realizing the e-government conception in the regions of Georgia and enable a quantitative evaluation of regional e-government projects covering all significant aspects of development.
Keywords: development methodology, e-government in Georgia, information and communication technologies, regional government
Procedia PDF Downloads 275
31402 Personal Information Classification Based on Deep Learning in Automatic Form Filling System
Authors: Shunzuo Wu, Xudong Luo, Yuanxiu Liao
Abstract:
Recently, the rapid development of deep learning has allowed artificial intelligence (AI) to penetrate many fields, replacing manual work. In particular, AI systems have become a research focus in the field of office automation. To meet real needs in office automation, in this paper we develop an automatic form filling system. Specifically, it uses two classical neural network models and several word embedding models to classify various relevant information elicited from the Internet. When training the neural network models, we use less noisy and balanced data. We conduct a series of experiments to test our system, and the results show that it achieves good classification results.
Keywords: artificial intelligence and office, NLP, deep learning, text classification
Procedia PDF Downloads 200
31401 Controlling the Expense of Political Contests Using a Modified N-Players Tullock’s Model
Abstract:
This work introduces a generalization of the classical Tullock model of one-stage contests under complete information with an unlimited number of contestants. In the classical Tullock model, the contest winner is not necessarily the highest bidder. Instead, the winner is determined by a draw in which the winning probabilities are the contestants’ relative efforts. Tullock’s modeling fits political contests well, in which the winner is not necessarily the contestant who exerts the highest effort. This work presents a modified model which uses a simple non-discriminating rule, namely a parameter through which the contest designer can influence the total costs planned for an election and thereby control the contestants’ efforts. The winner pays a fee, and the losers are reimbursed the same amount. Our proposed model includes a mechanism that controls the efforts exerted and balances competition, creating a tighter, less predictable and more interesting contest. Additionally, the proposed model satisfies a fairness criterion in the sense that it does not alter the contestants’ probabilities of winning compared to the classic Tullock model. We provide an analytic solution for the contestants’ optimal effort and expected reward.
Keywords: contests, Tullock's model, political elections, control expenses
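The symmetric benchmark the paper generalizes has a well-known closed form: with n identical contestants, prize V, linear effort cost, and win probability x_i / Σx_j, the equilibrium effort is x* = (n-1)V/n². A small check of that benchmark (not of the paper's modified fee-and-reimbursement mechanism) is below.

```python
# Closed-form check of the classical symmetric n-player Tullock contest:
# win probability x_i / sum(x), prize V, linear cost of effort.

def tullock_equilibrium(n, prize):
    """Symmetric equilibrium effort x* = (n-1)V/n^2, plus win probability
    and expected payoff at that equilibrium."""
    effort = (n - 1) * prize / n ** 2
    win_prob = 1.0 / n                 # identical efforts -> equal odds
    payoff = win_prob * prize - effort  # expected prize minus cost
    return effort, win_prob, payoff

def best_response_check(n, prize, effort, eps=1e-6):
    """No single contestant should gain by deviating from the candidate
    equilibrium effort while the other n-1 keep it."""
    others = (n - 1) * effort
    def payoff(x):
        return (x / (x + others)) * prize - x if x + others > 0 else 0.0
    base = payoff(effort)
    return (payoff(effort + eps) <= base + 1e-9
            and payoff(max(effort - eps, 0.0)) <= base + 1e-9)
```

For n = 4 and V = 100 this gives effort 18.75, win probability 0.25, and payoff 6.25, and small deviations in either direction do not pay.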
Procedia PDF Downloads 145
31400 A Literature Review on the Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster
Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon
Abstract:
In a disaster event, sharing patient information between pre-hospital Emergency Medical Services (EMS) and hospital Emergency Departments (ED) is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and ED professionals through the use of Information and Communication Technology (ICT). This study followed a systematic approach: six electronic databases, CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library, were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analysed according to the key themes which emerged from the literature search. Twenty-two studies were included: eleven employed quantitative methods, seven used qualitative methods, and four used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. This theme reported that disaster plans are in place in hospitals and, in some cases, there are interagency agreements with pre-hospital services and relevant stakeholders. However, the findings showed that the disaster plans highlighted in these studies lacked information regarding coordinated communication within and between the pre-hospital and hospital settings. (2) Communication systems used in disasters.
This theme highlighted that although various communication systems are used between and within hospitals and pre-hospital services, technical issues have influenced communication between teams during disasters. (3) Integrated information management systems. This theme suggested the need for an integrated health information system which can help pre-hospital and hospital staff to record patient data and ensure the data are shared. (4) Disaster training and drills. While some studies analysed disaster drills and training, the majority focused on hospital departments other than EMTs. These studies suggest the need for simulation-based disaster training and drills that include EMTs. This review demonstrates that considerable gaps remain in the understanding of communication between EMS and hospital ED staff in disaster response. The review shows that although different types of ICT are used, various issues remain which affect coordinated communication among the relevant professionals.
Keywords: communication, emergency communication services, emergency medical teams, emergency physicians, emergency nursing, paramedics, information and communication technology, communication systems
Procedia PDF Downloads 86
31399 The Effectiveness of a Hybrid Diffie-Hellman-RSA-Advanced Encryption Standard Model
Authors: Abdellahi Cheikh
Abstract:
With the emergence of quantum computers with very powerful capabilities, and given the rapid growth of computing power and speed, the security of shared-key exchange between two interlocutors poses a serious problem. The Diffie-Hellman (DH) algorithm is therefore more vulnerable than ever: no mechanism guarantees the security of the key exchange, so if an intermediary manages to intercept it, the key is easily compromised. In this regard, several studies have been conducted to improve the security of key exchange between two interlocutors, which has led to interesting results. We modify our Diffie-Hellman-RSA-AES (DRA) model, which encrypts the information exchanged between two users using the three encryption algorithms DH, RSA and AES, by using steganographic images to hide the contents of the p, g and ClesAES values that the DRA model otherwise sends unencrypted in order to calculate each user's public key. This work includes a comparative study between the DRA model and existing solutions, as well as the modification made to the model, with an emphasis on reliability in terms of security. The study presents a simulation to demonstrate the effectiveness of the modification. The obtained results show that our model has a security advantage over the existing solutions, so we made these changes to reinforce the security of the DRA model.
Keywords: Diffie-Hellman, DRA, RSA, advanced encryption standard
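To make the key-establishment half of such a hybrid pipeline concrete, here is a toy sketch: a modular-exponentiation DH exchange whose shared secret is hashed into a symmetric key, with an XOR stream standing in for the AES stage. Everything here is illustrative only: the modulus and generator are placeholders rather than vetted parameters, the XOR "cipher" offers no security, and none of it reproduces the DRA model's RSA or steganography stages.

```python
import hashlib

# Toy sketch of DH key establishment plus a symmetric stage. The modulus
# (2^64 - 59) and generator are illustrative placeholders, and the XOR
# stream is a stand-in for AES; real code would use vetted primitives.

P = 0xFFFFFFFFFFFFFFC5  # small demo modulus, NOT a vetted safe prime
G = 5

def dh_public(secret):
    """Each side publishes g^secret mod p."""
    return pow(G, secret, P)

def dh_shared_key(secret, other_public):
    """Both sides derive the same 32-byte key from the DH shared secret."""
    shared = pow(other_public, secret, P)
    return hashlib.sha256(shared.to_bytes(8, "big")).digest()

def xor_stream(key, data):
    """Symmetric placeholder for the AES stage (same call encrypts and
    decrypts, since XOR is its own inverse)."""
    stream = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(a ^ b for a, b in zip(data, stream))
```

Both parties derive identical keys from each other's public values, and applying the stream twice restores the plaintext.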
Procedia PDF Downloads 93
31398 Developing Integrated Model for Building Design and Evacuation Planning
Authors: Hao-Hsi Tseng, Hsin-Yun Lee
Abstract:
In the building design process, designers have to complete the spatial design and consider evacuation performance at the same time. It is usually difficult to combine the two planning processes, which results in a gap between spatial design and evacuation performance, and designers therefore cannot arrive at an integrated optimal design solution. In addition, the evacuation routing models proposed by previous researchers differ from the practical evacuation decisions made in the field. On the other hand, more and more building design projects are executed with Building Information Modeling (BIM), in which the design content is formed in an object-oriented framework. Thus, the integration of BIM and evacuation simulation can make a significant contribution for designers. This research therefore establishes a model that integrates spatial design and evacuation planning. The proposed model supports spatial design modifications and optimizes evacuation planning, allowing designers to complete the integrated design solution in BIM. The research also improves the evacuation routing method to make the simulation results more practical. The proposed model will be applied in a building design project for evaluation and validation, where it will provide near-optimal design suggestions. By applying the proposed model, the integration and efficiency of the design process are improved, the evacuation plan is more useful, and the quality of the building spatial design is better.
Keywords: building information modeling, evacuation, design, floor plan
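The routing core that any BIM-coupled evacuation model needs can be sketched as a shortest-path search over a graph of rooms and corridors. The room graph, travel times, and exit names below are invented for illustration; in the integrated model they would be extracted from the BIM geometry.

```python
import heapq

# Minimal sketch of evacuation routing: Dijkstra over a graph of rooms and
# corridors. The graph, edge weights (travel times) and exit names used in
# the example are hypothetical; a real system derives them from the BIM.

def shortest_exit(graph, start, exits):
    """Dijkstra from start; returns (travel_time, exit_node) of the
    nearest reachable exit, or (inf, None) if no exit is reachable."""
    dist = {start: 0.0}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node in exits:
            return d, node
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return float("inf"), None
```

On a tiny floor graph, an occupant in "room" is routed through the stairs to the closer of two exits.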
Procedia PDF Downloads 456
31397 Modeling Driving Distraction Considering Psychological-Physical Constraints
Authors: Yixin Zhu, Lishengsa Yue, Jian Sun, Lanyue Tang
Abstract:
Modeling driving distraction in microscopic traffic simulation is crucial for enhancing simulation accuracy. Current driving distraction models are mainly derived from the physical motion constraints of distracted states, in which distraction-related error terms are added to existing microscopic driver models. However, model accuracy is not very satisfying due to a lack of modeling of the cognitive mechanism underlying distraction. This study models driving distraction based on the Queueing Network-Model Human Processor (QN-MHP), utilizing its queuing structure to perform task invocation and switching for distracted operation and control of the vehicle. Following the QN-MHP assumption about the cognitive sub-network, server F is a structural bottleneck: later information must wait for earlier information to leave server F before it can be processed there, so the waiting time for task switching needs to be calculated. Since the QN-MHP model has different information processing paths for auditory and visual information, this study divides driving distraction into two types: auditory distraction and visual distraction. For visual distraction, both the visual distraction task and the driving task need to go through the visual perception sub-network, and their stimuli are asynchronous, which is called stimulus onset asynchrony (SOA) and must be considered when calculating the waiting time for switching tasks. For auditory distraction, the auditory distraction task and the driving task do not compete for the server resources of the perceptual sub-network, and their stimuli can be synchronized without considering the time difference in receiving them. Following the Theory of Planned Behavior (TPB) for drivers, this study uses risk entropy as the decision criterion for driver task switching.
A logistic regression model with risk entropy as the independent variable is used to determine whether the driver performs a distraction task, explaining the relationship between perceived risk and distraction. Furthermore, to model a driver’s perception characteristics, a neurophysiological model of visual distraction tasks is incorporated into the QN-MHP, which then executes the classical Intelligent Driver Model (IDM). The proposed driving distraction model integrates the psychological cognitive process of a driver with physical motion characteristics, resulting in both high accuracy and interpretability. This paper uses 773 segments of distracted car-following from the Shanghai Naturalistic Driving Study (SH-NDS) data to classify the patterns of distracted behavior on different road facilities, obtaining three distraction patterns: numbness, delay, and aggressiveness. The model was calibrated and verified by simulation. The results indicate that the model can effectively simulate distracted car-following behavior of different patterns on various roadway facilities, and its performance is better than the traditional IDM with distraction-related error terms. The proposed model overcomes the limitations of physical-constraints-based models in replicating dangerous driving behaviors and the internal characteristics of an individual. Moreover, the model is demonstrated to effectively generate more dangerous distracted driving scenarios, which can be used to construct high-value automated driving test scenarios.
Keywords: computational cognitive model, driving distraction, microscopic traffic simulation, psychological-physical constraints
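The switching rule described above, a logistic regression on risk entropy, can be sketched in a few lines. The coefficients here are placeholder values chosen only so that higher perceived risk lowers the odds of engaging the distraction task; they are not the values calibrated on the SH-NDS data.

```python
import math

# Sketch of the task-switching decision: a logistic model on risk entropy.
# The coefficients b0, b1 are illustrative placeholders, not the paper's
# calibrated SH-NDS estimates.

def switch_probability(risk_entropy, b0=2.0, b1=-4.0):
    """Probability of engaging the distraction task; b1 < 0 means higher
    perceived risk (entropy) makes looking away less likely."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * risk_entropy)))

def engages_distraction(risk_entropy, threshold=0.5):
    """Binary switching decision used when driving the simulation."""
    return switch_probability(risk_entropy) > threshold
```

With these placeholder coefficients, a driver engages the distraction at low risk entropy and stays on the driving task at high risk entropy.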
Procedia PDF Downloads 91
31396 A System for Visual Management of Research Resources Focusing on Accumulation of Polish Processes
Authors: H. Anzai, H. Nakayama, H. Kaminaga, Y. Morimoto, Y. Miyadera, S. Nakamura
Abstract:
Various research resources, such as papers and presentation slides, are handled in the process of research activities. Skillfully managing those research resources and utilizing them for further investigations is extremely important for the smooth progress of research. However, the number of research resources keeps increasing. Moreover, each kind of research resource differs in usage and accumulation style, so it is actually difficult to satisfactorily manage and use the accumulated research resources, and a lack of tidiness of the resources causes problems such as overlooking issues to be polished. Although there have been research projects on supporting the management of research resources and the sharing of know-how, most existing systems have not been effective enough, since they have not sufficiently considered the polish process. This paper mainly describes a system that enables the strategic management of research resources together with their polish processes and their practical use.
Keywords: research resource, polish process, information sharing, knowledge management, information visualization
Procedia PDF Downloads 389
31395 Virtual Team Performance: A Transactive Memory System Perspective
Authors: Belbaly Nassim
Abstract:
Virtual team (VT) initiatives, in which teams are geographically dispersed and communicate via modern computer-driven technologies, have attracted increasing attention from researchers and professionals. The growing need to examine how to balance and optimize VTs is particularly important given the exposure experienced by companies when their employees encounter globalization and decentralization pressures, and the resulting need to monitor VT performance. Organizations are regularly limited by misalignment between the behavioral capabilities of a team's dispersed competences and knowledge capabilities, by how trust issues interplay with and influence these VT dimensions, and by the effects of such exchanges. In fact, the future success of business depends on the extent to which VTs efficiently manage their dispersed expertise, skills and knowledge to stimulate VT creativity. A transactive memory system (TMS) may enhance VT creativity through its three dimensions: knowledge specialization, credibility and knowledge coordination. TMS can be understood as a composition of both a structural component residing in individual knowledge and a set of communication processes among individuals: individual knowledge is shared while being retrieved and applied, and the learning is coordinated. TMS is driven by the central concept that the system is built on the distinction between internal and external memory encoding: a VT learns something new and catalogs it in memory for future retrieval and use. TMS uses the role of information technology to explain VT behaviors by offering VT members the possibility to encode, store, and retrieve information. TMS considers the members of a team as a processing system in which the location of expertise both enhances knowledge coordination and builds trust among members over time. We build on the TMS dimensions to hypothesize the effects of specialization, coordination, and credibility on VT creativity.
In fact, VTs consist of dispersed expertise, skills and knowledge that can positively enhance coordination and collaboration. Ultimately, this team composition may lead to recognition of both who has expertise and where that expertise is located; over time, it may also build trust among VT members, developing the ability to coordinate their knowledge, which can stimulate creativity. We also assess the reciprocal relationship between the TMS dimensions and VT creativity. We use TMS to provide researchers with a theoretically driven model that is empirically validated through survey evidence, and we propose that TMS offers a new way to enhance and balance VT creativity. This study also gives researchers insight into the use of TMS to positively influence VT creativity. In addition to our research contributions, we provide several managerial insights into how TMS components can be used to increase performance within dispersed VTs.
Keywords: virtual team creativity, transactive memory systems, specialization, credibility, coordination
Procedia PDF Downloads 174
31394 Mycophenolate Versus Methotrexate in Non-Infectious Ocular Inflammatory Disease: A Systematic Review and Meta-Analysis
Authors: Mohammad Karam, Abdulmalik Alsaif, Abdulrahman Al-Naseem, Amrit Hayre, Abdurrahman Al Jabbouri, Ahmad Aldubaikhi, Narvair Kahlar, Salem Al-Mutairi
Abstract:
Purpose: To compare the outcomes of mycophenolate mofetil (MMF) versus methotrexate (MTX) in non-infectious ocular inflammatory disease (NIOID). Methods: A systematic review and meta-analysis were performed as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, and an electronic search was conducted to identify all comparative studies of MMF versus MTX in NIOID. Treatment results and side effects were the primary outcome measures. Secondary outcome measures included visual acuity and resolution of macular oedema. Fixed- and random-effects models were used for the analysis. Results: Four studies enrolling 905 patients were identified. There was no significant difference between the MMF and MTX groups in overall treatment success (Odds Ratio [OR] = 0.97, P = 0.96) and failure (OR = 0.86, P = 0.85) of NIOID. Although treatment success of uveitis showed no significant difference for anterior and intermediate uveitis cases (OR = 2.33, P = 0.14), MTX showed a significantly improved effect in cases involving posterior uveitis and panuveitis (OR = 0.41, P = 0.003). The median dose required for treatment success was lower for MTX, whereas MMF was associated with a faster median time to treatment success. Further to this, MMF showed a reduced rate of side effects when compared to MTX, but the differences failed to reach statistical significance, most notably for liver enzyme elevation (OR = 0.65, P = 0.16), fatigue (OR = 0.84, P = 0.49) and headache (OR = 0.81, P = 0.37). For the secondary outcomes, no significant difference was noted in visual acuity or resolution of macular oedema. Conclusions: MMF is comparable to MTX in the treatment of NIOID, as there was no significant difference in treatment success or side effect profiles. Keywords: mycophenolate mofetil, methotrexate, non-infectious ocular inflammation, uveitis, scleritis
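The fixed-effects pooling behind such odds-ratio comparisons can be sketched as an inverse-variance (Woolf) average of per-study log odds ratios. The 2x2 tables below are invented for illustration only, not data from the four included studies:

```python
import math

def log_or_and_var(a, b, c, d):
    # 2x2 table: a/b = events/non-events on treatment 1, c/d = on treatment 2
    lor = math.log((a * d) / (b * c))
    var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d  # Woolf variance of log OR
    return lor, var

def fixed_effect_pooled_or(tables):
    # inverse-variance weighted average of per-study log odds ratios
    num = den = 0.0
    for table in tables:
        lor, var = log_or_and_var(*table)
        weight = 1.0 / var
        num += weight * lor
        den += weight
    return math.exp(num / den)
```

A random-effects model would additionally widen the weights by a between-study variance term (e.g. DerSimonian-Laird), which this sketch omits.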
Procedia PDF Downloads 152
31393 DBN-Based Face Recognition System Using Light Field
Authors: Bing Gu
Abstract:
Most conventional facial recognition systems are based on image features such as LBP and SIFT. Recently, some DBN-based 2D facial recognition systems have been proposed; however, there are few DBN-based 3D facial recognition systems and related studies. 3D facial images contain all of an individual's biometric information, which can be used to build more accurate features, so we present a DBN-based face recognition system using light fields. A light field can be seen as another representation of a 3D image, and a light field camera offers a way to capture one. We use a commercially available light field camera as the collector of our face recognition system, and the system achieves state-of-the-art performance while remaining as convenient as a conventional 2D face recognition system. Keywords: DBN, face recognition, light field, Lytro
Procedia PDF Downloads 464
31392 CSoS-STRE: A Combat System-of-System Space-Time Resilience Enhancement Framework
Authors: Jiuyao Jiang, Jiahao Liu, Jichao Li, Kewei Yang, Minghao Li, Bingfeng Ge
Abstract:
Modern warfare has transitioned from the paradigm of isolated combat forces to system-to-system confrontations due to advancements in combat technologies and application concepts. A combat system-of-systems (CSoS) is a combat network composed of independently operating entities that interact with one another to provide overall operational capabilities. Enhancing the resilience of a CSoS is garnering increasing attention due to its significant practical value in optimizing network architectures, improving network security and refining operational planning. Accordingly, a unified framework called CSoS space-time resilience enhancement (CSoS-STRE) has been proposed, which enhances the resilience of a CSoS by incorporating spatial features. Firstly, a multilayer spatial combat network model has been constructed, which incorporates an information layer depicting the interrelations among combat entities based on the OODA loop, along with a spatial layer that considers the spatial characteristics of equipment entities, thereby accurately reflecting the actual combat process. Secondly, building upon the combat network model, a spatiotemporal resilience optimization model is proposed, which reformulates the resilience optimization problem as a classical linear optimization model with spatial features. Furthermore, the model is extended from scenarios without obstacles to those with obstacles, thereby further emphasizing the importance of spatial characteristics. Thirdly, a resilience-oriented recovery optimization method based on an improved non-dominated sorting genetic algorithm II (R-INSGA) is proposed to determine the optimal recovery sequence for the damaged entities. This method not only considers spatial features but also provides the optimal travel path for multiple recovery teams. Finally, the feasibility, effectiveness, and superiority of CSoS-STRE are demonstrated through a case study.
Simultaneously, under deliberate attack conditions based on degree centrality and maximum operational loop performance, the proposed CSoS-STRE method is compared with six baseline recovery strategies, which are based on performance, time, degree centrality, betweenness centrality, closeness centrality, and eigenvector centrality. The comparison demonstrates that CSoS-STRE achieves faster convergence and superior performance. Keywords: space-time resilience enhancement, resilience optimization model, combat system-of-systems, recovery optimization method, no-obstacles and obstacles
Procedia PDF Downloads 15
31391 The Study of Security Techniques on Information System for Decision Making
Authors: Tejinder Singh
Abstract:
An information system (IS) moves data across different levels and in different directions to support decision making and data operations. Data can be violated in different ways, such as manual or technical errors, data tampering, or loss of integrity, and the security system (the firewall) of an IS is affected by such violations. The flow of data among the various levels of an information system is handled by the networking system, where data travels in the form of packets or frames. To protect these packets from unauthorized access and virus attacks, and to maintain their integrity, network security is an important factor. Various security techniques are used to protect the data from being pirated. This paper presents these security techniques and characterizes different harmful attacks with the help of detailed data analysis. It will be beneficial for organizations seeking to make their systems more secure, effective, and useful for future decision making. Keywords: information systems, data integrity, TCP/IP network, vulnerability, decision, data
Procedia PDF Downloads 307
31390 Landscape Classification in North of Jordan by Integrated Approach of Remote Sensing and Geographic Information Systems
Authors: Taleb Odeh, Nizar Abu-Jaber, Nour Khries
Abstract:
The southern part of the Wadi Al Yarmouk catchment area covers the north of Jordan. It lies within latitudes 32° 20’ to 32° 45’ N and longitudes 35° 42’ to 36° 23’ E and has an area of about 1426 km2. It has high-relief topography, with elevations varying between 50 and 1100 meters above sea level. These variations in topography produce different landform units, climatic zones, land covers, and plant species, and as a result different landscape units exist in the region. Spatial planning is a major challenge in such a vital area for Jordan and could not be achieved without determining these landscape units. An integrated approach of remote sensing and geographic information systems (GIS) is an optimal tool for investigating and mapping the landscape units of such a complicated area. Remote sensing has the capability to collect different land-surface data over large landscape areas, accurately and across different time periods. GIS has the ability to store these land-surface data, analyze them spatially, and present them in the form of professional maps. We generated geo-land-surface data that include land cover, rock units, soil units, plant species, and a digital elevation model using an ASTER image and Google Earth, while the spatial analysis of the geo-data was done with ArcGIS 10.2 software. We found that there are twenty-two different landscape units in the study area, which have to be considered in any spatial planning in order to avoid environmental problems. Keywords: landscape, spatial planning, GIS, spatial analysis, remote sensing
Procedia PDF Downloads 528
31389 Emerging Trends of Geographic Information Systems in Built Environment Education: A Bibliometric Review Analysis
Authors: Kiara Lawrence, Robynne Hansmann, Clive Greentsone
Abstract:
Geographic Information Systems (GIS) are used to store, analyze, visualize, capture and monitor geographic data. Built environment professionals, and urban planners specifically, need to possess GIS skills to plan spaces effectively and efficiently. GIS application extends beyond the production of map artifacts and can relate spatially referenced, real-time data to support spatial visualization, analysis, community engagement, scenario building, and so forth. Though GIS has been used in the built environment for a few decades, its use in education has not been researched enough to draw conclusions on the trends of the last 20 years. This study looks to discover current and emerging trends of GIS in built environment education. A bibliometric review analysis was carried out by exporting documents from Scopus and Web of Science using keywords around "Geographic information systems" OR "GIS" AND "built environment" OR "geography" OR "architecture" OR "quantity surveying" OR "construction" OR "urban planning" OR "town planning" AND "education" between the years 1994 to 2024. A total of 564 documents were identified and exported. The data was then analyzed using the VOSviewer software to generate network analysis and visualization maps of keyword co-occurrence, document and country co-citation, and co-author networks. By analyzing each aspect of the data, deeper insight into GIS within education can be gained. Preliminary results from Scopus indicate that GIS research focusing on built environment education seems to have peaked prior to 2014, with much focus on remote sensing, demography, land use, engineering education and so forth.
This invaluable data can help in understanding and implementing GIS in built environment education in ways that are foundational and innovative, ensuring that students are equipped with sufficient knowledge and skills to carry out tasks in their respective fields. Keywords: architecture, built environment, construction, education, geography, geographic information systems, quantity surveying, town planning, urban planning
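The keyword co-occurrence counting that underlies such VOSviewer maps can be sketched in a few lines; the records below are hypothetical examples, not the actual Scopus/Web of Science export:

```python
from collections import Counter
from itertools import combinations

def keyword_cooccurrence(records):
    # count how often each keyword pair appears together in the same record;
    # sorting the pair makes (a, b) and (b, a) the same edge
    pairs = Counter()
    for keywords in records:
        for a, b in combinations(sorted(set(keywords)), 2):
            pairs[(a, b)] += 1
    return pairs
```

VOSviewer additionally normalizes these raw counts (e.g. by association strength) before laying out the map; this sketch stops at the co-occurrence matrix itself.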
Procedia PDF Downloads 15
31388 Wolof Voice Response Recognition System: A Deep Learning Model for Wolof Audio Classification
Authors: Krishna Mohan Bathula, Fatou Bintou Loucoubar, FNU Kaleemunnisa, Christelle Scharff, Mark Anthony De Castro
Abstract:
Voice recognition algorithms such as automatic speech recognition and text-to-speech systems for African languages can play an important role in bridging the digital divide of Artificial Intelligence in Africa, contributing to the establishment of a fully inclusive information society. This paper proposes a deep learning model that can classify user responses as inputs for an interactive voice response system. A dataset with the Wolof-language words ‘yes’ and ‘no’ was collected as audio recordings. A two-stage data augmentation approach is adopted to enlarge the dataset to the size required by the deep neural network. Data preprocessing and feature engineering with Mel-Frequency Cepstral Coefficients are implemented. Convolutional Neural Networks (CNNs) have proven to be very powerful in image classification and are promising for audio processing when sounds are transformed into spectra. To perform voice response classification, the recordings are transformed into sound frequency feature spectra, and an image classification methodology is then applied using a deep CNN model. The inference model of this trained and reusable Wolof voice response recognition system can be integrated with many applications on both web and mobile platforms. Keywords: automatic speech recognition, interactive voice response, voice response recognition, Wolof word classification
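The core pipeline — turn a recording into a frequency-feature image, then classify it — can be sketched with a plain magnitude spectrogram and a nearest-centroid classifier standing in for the MFCC front end and the deep CNN. Both stand-ins are deliberate simplifications of the paper's method, and the synthetic sine "recordings" in the usage note are illustrative only:

```python
import numpy as np

def spectrogram(signal, frame_len=64, hop=32):
    # magnitude spectra of windowed, overlapping frames
    # (a simplified stand-in for the MFCC front end)
    frames = [signal[i:i + frame_len] * np.hanning(frame_len)
              for i in range(0, len(signal) - frame_len + 1, hop)]
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

def classify(spec, centroids):
    # nearest-centroid classifier standing in for the trained deep CNN
    features = spec.mean(axis=0)  # average spectrum over time
    return min(centroids, key=lambda label: np.linalg.norm(features - centroids[label]))
```

Usage: build class centroids from labelled recordings, then assign new recordings to the nearest one, e.g. a low-frequency tone as the 'yes' class and a high-frequency tone as the 'no' class.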
Procedia PDF Downloads 116
31387 Specialized Instruction: Teaching and Leading Diverse Learners
Authors: Annette G. Walters Ph.D.
Abstract:
With a global shortage of qualified educational professionals, school systems continue to struggle with adequate staffing. How might learning communities meet the needs of all students, in particular those with specialized needs? While the task may seem foreboding and certain factors may seem divergent, all are connected in the education of students. Special education has a significant impact on the teaching and learning experience of all students in an educational community. Even when there are concerted efforts at embracing learners with diverse aptitudes and abilities, many important local factors are often misaligned, overlooked, or misunderstood. Working with learners with diverse abilities often requires intentional services and supports for students to achieve success. Developing and implementing specialized instruction requires a multifaceted approach that supports the entire learning community, which includes educational providers, learners, and families, all while being mindful of fiscal and natural resources. This research explores the implications and complexities of special education instruction and specializing instruction, as well as leading and teaching diverse learners. The work is separated into three sections: the state of special education, teaching and leading diverse learners, and developing educational competencies through collaborative engagement. This structured analysis extrapolates historical and current research on special education practices and the role of educators in ensuring diverse students meet success. Keywords: diverse learners, special education, modification and supports, curriculum and instruction, classroom management, formal and informal assessments
Procedia PDF Downloads 55
31386 Enhanced Retrieval-Augmented Generation (RAG) Method with Knowledge Graph and Graph Neural Network (GNN) for Automated QA Systems
Authors: Zhihao Zheng, Zhilin Wang, Linxin Liu
Abstract:
In the research of automated knowledge question-answering systems, accuracy and efficiency are critical challenges. This paper proposes a knowledge graph-enhanced Retrieval-Augmented Generation (RAG) method, combined with a Graph Neural Network (GNN) structure, to automatically determine the correctness of knowledge competition questions. First, a domain-specific knowledge graph was constructed from a large corpus of academic journal literature, with key entities and relationships extracted using Natural Language Processing (NLP) techniques. Then, the RAG method's retrieval module was expanded to simultaneously query both text databases and the knowledge graph, leveraging the GNN to further extract structured information from the knowledge graph. During answer generation, contextual information provided by the knowledge graph and GNN is incorporated to improve the accuracy and consistency of the answers. Experimental results demonstrate that the knowledge graph and GNN-enhanced RAG method perform excellently in determining the correctness of questions, achieving an accuracy rate of 95%. Particularly in cases involving ambiguity or requiring contextual information, the structured knowledge provided by the knowledge graph and GNN significantly enhances the RAG method's performance. This approach not only demonstrates significant advantages in improving the accuracy and efficiency of automated knowledge question-answering systems but also offers new directions and ideas for future research and practical applications. Keywords: knowledge graph, graph neural network, retrieval-augmented generation, NLP
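The expanded retrieval step — querying a text store and a knowledge graph in the same pass — can be sketched as follows. The overlap scoring, the graph representation, and the example entities are all assumptions made for illustration; the authors' implementation additionally runs a GNN over the retrieved subgraph, which this sketch omits:

```python
def hybrid_retrieve(query_terms, passages, graph, top_k=2):
    # score text passages by query-term overlap, then add the 1-hop facts
    # from the knowledge graph for every entity named in the query
    scored = sorted(passages,
                    key=lambda p: len(query_terms & set(p.lower().split())),
                    reverse=True)
    facts = [(entity, relation, obj)
             for entity in query_terms if entity in graph
             for relation, obj in graph[entity]]
    return scored[:top_k], facts
```

Both the top passages and the graph facts would then be concatenated into the generator's context window.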
Procedia PDF Downloads 40
31385 Social Collaborative Learning Model Based on Proactive Involvement to Promote the Global Merit Principle in Cultivating Youths' Morality
Authors: Wera Supa, Panita Wannapiroon
Abstract:
This paper reports on the design of a social collaborative learning model based on proactive involvement to promote the global merit principle in cultivating youths’ morality. The research proceeded in two phases: the first was to design the social collaborative learning model based on proactive involvement, and the second was to evaluate it. The sample group in this study consists of 15 experts who are dominant in proactive participation, the moral merit principle, and youths’ morality cultivation, drawn from executive-level staff, lecturers, and professionals with information and communication technology expertise, selected using the purposive sampling method. Data were analyzed by arithmetic mean and standard deviation. The study found four significant factors in promoting hands-on collaboration under the global merit scheme in order to instill virtues in adolescents: 1) information and communication technology usage; 2) proactive involvement; 3) morality cultivation policy; and 4) the global merit principle. The experts agree that the social collaborative learning model based on proactive involvement is highly appropriate. Keywords: social collaborative learning, proactive involvement, global merit principle, morality
Procedia PDF Downloads 388
31384 Digital Reconstruction of Museum's Statue Using 3D Scanner for Cultural Preservation in Indonesia
Authors: Ahmad Zaini, F. Muhammad Reza Hadafi, Surya Sumpeno, Muhtadin, Mochamad Hariadi
Abstract:
The lack of information about a museum’s collection reduces the number of visits to the museum, so museum revitalization is an urgent activity to increase visits. The research roadmap is to build a web-based application that visualizes the museum in virtual form, including the reconstruction of the museum’s statues in 3D. This paper describes the implementation of a three-dimensional model reconstruction method based on a light-strip pattern, applied to a museum statue using a 3D scanner. Noise removal, alignment, meshing, and model refinement processes are implemented to obtain a better 3D object reconstruction. The model’s texture is derived from surface texture mapping between the object’s images and the reconstructed 3D model. The dimensional accuracy of the model is assessed by calculating the relative error of the virtual model’s dimensions against the original object. The result is a realistic, textured three-dimensional model with a relative error of around 4.3% to 5.8%. Keywords: 3D reconstruction, light pattern structure, texture mapping, museum
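The dimensional accuracy check can be sketched directly from its definition; the dimension values in the usage note are illustrative, not the paper's measurements:

```python
def relative_error_pct(virtual_dim, original_dim):
    # percentage deviation of one virtual-model dimension from the original object
    return abs(virtual_dim - original_dim) / original_dim * 100.0

def model_accuracy(virtual_dims, original_dims):
    # worst-case relative error over the measured dimensions (e.g. width, height, depth)
    return max(relative_error_pct(v, o) for v, o in zip(virtual_dims, original_dims))
```

For example, a reconstructed dimension of 104.3 against an original of 100.0 gives a relative error of 4.3%, matching the lower end of the reported 4.3%–5.8% range.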
Procedia PDF Downloads 465
31383 Artificial Neural Network Model Based Setup Period Estimation for Polymer Cutting
Authors: Zsolt János Viharos, Krisztián Balázs Kis, Imre Paniti, Gábor Belső, Péter Németh, János Farkas
Abstract:
The paper presents results and industrial applications of production setup period estimation based on industrial data inherited from the field of polymer cutting. The literature on polymer cutting is very limited considering the number of publications. The first polymer cutting machine has been known since the second half of the 20th century; however, the production of polymer parts with this kind of technology is still a challenging research topic. The products of the participating industrial partner must meet high technical requirements, as they are used in the medical, measurement instrumentation and painting industry branches. Typically, 20% of these parts are new work, which means that every five years almost the entire product portfolio is replaced in their low-series manufacturing environment. Consequently, a flexible production system is required, where the estimation of the lengths of the frequent setup periods is one of the key success factors. In the investigation, several (input) parameters have been studied and grouped to create an adequate training information set for an artificial neural network as a base for the estimation of the individual setup periods. In the first group, product information is collected, such as the product name and number of items. The second group contains material data like material type and colour. In the third group, surface quality and tolerance information are collected, including the finest surface and the tightest (or narrowest) tolerance. The fourth group contains the setup data, like machine type and work shift. One source of these parameters is the Manufacturing Execution System (MES), but some data were also collected from Computer Aided Design (CAD) drawings. The number of applied tools is one of the key factors on which the industrial partner's estimations were based previously. The artificial neural network model was trained on several thousand real industrial data records.
The mean estimation accuracy of the setup periods' lengths was improved by 30%, and at the same time the deviation of the prognosis was improved by 50%. Furthermore, an investigation of the mentioned parameter groups considering the manufacturing order was also carried out. The paper also highlights the experiences of the manufacturing introduction and further improvements of the proposed methods, both on the shop floor and in quotation preparation. Every week more than 100 real industrial setup events take place, and the related data are collected. Keywords: artificial neural network, low series manufacturing, polymer cutting, setup period estimation
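As a hedged illustration of the estimation task (not the authors' trained network), a least-squares linear baseline over a few invented setup records shows how grouped parameters map to a setup-length prediction; in practice the neural network replaces the linear fit and is trained on thousands of records:

```python
import numpy as np

# Hypothetical setup records: (number of items, number of tools, new-work flag)
# mapped to setup length in minutes; all values are invented for illustration.
X = np.array([[10, 3, 0], [50, 7, 1], [20, 4, 0], [80, 9, 1], [15, 3, 0]], dtype=float)
y = np.array([30.0, 95.0, 45.0, 130.0, 35.0])

def fit_baseline(X, y):
    # least-squares linear model as a simple baseline for the trained ANN
    A = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, features):
    # dot product of (features, 1) with the fitted coefficients
    return float(np.append(np.asarray(features, dtype=float), 1.0) @ coef)
```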
Procedia PDF Downloads 245
31382 Accounting Knowledge Management and Value Creation of SME in Chatuchak Market: Case Study Ceramics Product
Authors: Runglaksamee Rodkam
Abstract:
The purpose of this research was to study the influence of accountants’ potential performance on their working process, through a case study of Government Savings Bank branches in the northeast of Thailand. The independent variables included accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude, while the dependent variable was the success of the working process. A total of 155 accountants working for the Government Savings Bank were selected by random sampling. A questionnaire was used as the tool for collecting data. The statistics used in this research included percentage, mean, and multiple regression analysis. The findings revealed that the majority of accountants were female, aged between 35 and 40 years old. Most of the respondents had an undergraduate degree with ten years of experience. Moreover, the factors of accounting knowledge, accounting skill, accounting value, accounting ethics, and accounting attitude were rated at a high level. The regression analysis of the observation data revealed a causal relationship in that the observation data could explain at least 51 percent of the success in the accountants’ working process. Keywords: influence, potential performance, success, working process
Procedia PDF Downloads 256
31381 Evaluation of High Damping Rubber Considering Initial History through Dynamic Loading Test and Program Analysis
Authors: Kyeong Hoon Park, Taiji Mazuda
Abstract:
High damping rubber (HDR) bearings are dissipating devices mainly used in seismic isolation systems and offer great damping performance. Although many studies have been conducted on dynamic models of HDR bearings, few models can reflect phenomena such as the dependency of the response on the experienced shear strain introduced by the initial history. In order to develop a model that can represent this dependency of HDR, caused by the Mullins effect, a dynamic loading test was conducted using an HDR specimen. The reaction of the HDR was measured by applying a horizontal vibration with a hybrid actuator under a constant vertical load. A dynamic program analysis was also performed after the loading test. The dynamic model applied in the program analysis is a bilinear-type double-target model, modified from the typical bilinear model, which can express the nonlinear characteristics related to the initial history of HDR bearings. Based on the results of the dynamic loading test and the program analysis, the equivalent stiffness and equivalent damping ratio were calculated to evaluate the mechanical properties of the HDR, and the feasibility of the bilinear-type double-target model was examined. Keywords: base-isolation, bilinear model, high damping rubber, loading test
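The equivalent stiffness and equivalent damping ratio can be computed from a measured hysteresis loop with the standard secant-stiffness and dissipated-energy formulas. The sketch below assumes the loop is given as closed arrays of displacement and force samples; the elliptical loop in the test is synthetic, not the HDR test data:

```python
import math

def equivalent_properties(disp, force):
    # secant (equivalent) stiffness between the loop extremes
    d_max, d_min = max(disp), min(disp)
    k_eq = (max(force) - min(force)) / (d_max - d_min)
    # dissipated energy per cycle: shoelace area of the closed hysteresis loop
    n = len(disp)
    w_d = 0.5 * abs(sum(disp[i] * force[(i + 1) % n] - disp[(i + 1) % n] * force[i]
                        for i in range(n)))
    # equivalent damping ratio h_eq = W_d / (2 * pi * K_eq * amplitude^2)
    amplitude = 0.5 * (d_max - d_min)
    h_eq = w_d / (2.0 * math.pi * k_eq * amplitude ** 2)
    return k_eq, h_eq
```

For a linear viscoelastic ellipse (force = k*d + damping term) these formulas recover the expected stiffness and damping, which makes them a useful sanity check before applying them to measured HDR loops.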
Procedia PDF Downloads 123