Search results for: gravitational search algorithm
3093 Students’ Perception of E-Learning Systems at Hashemite University
Authors: Muneer Abbad
Abstract:
In the search for better ways of delivering knowledge, traditional universities have expanded their teaching methods and integrated cost-effective e-learning systems. Universities’ use of information and communication technologies has grown tremendously over the last decade. To ensure efficient use of the e-learning system, this project aimed to evaluate good and bad practices, detect errors, and determine areas for further improvements in usage. The project critically evaluated students’ perception of the e-learning system and recommended changes to improve e-learning usage, based on a questionnaire given to students with experience of the e-learning system. Results of the study indicated that, in general, students have favourable perceptions toward using the e-learning system. They seemed to value the resources tool and its contribution to building their knowledge more than other e-learning tools. However, they seemed to perceive limited value in the audio and video podcasts. The study also showed that technology acceptance is the variable that contributes most to students’ perception and satisfaction with the e-learning system. Keywords: e-learning, perception, Jordan, universities
Procedia PDF Downloads 491
3092 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models, and algorithms for reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. It is important to design systems based on structural analysis, research, and evaluation of efficiency indicators. One of the important efficiency criteria is the reliability of the system, which depends on the components of the structure. Quantifying the reliability of large-scale systems is a computationally complex process, and it is advisable to perform it with the help of a computer. Logical-probabilistic modeling is one of the effective means of describing the structure of a complex system and quantitatively evaluating its reliability, and it forms the basis of our application. The reliability assessment process included the following stages, which were reflected in the application: 1) Construction of a graphical scheme of the structural reliability of the system; 2) Transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) Description of the system operability condition with a logical function in the form of disjunctive normal form (DNF); 4) Transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) Replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) Calculation of the “weights” of the elements of the system. Using the logical-probabilistic methods, models, and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems, research, and the design of systems with an optimal structure are carried out. Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability of systems, “weights” of elements
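As a rough, self-contained illustration of stages 2–6 above, the Python sketch below computes system reliability and element “weights” (Birnbaum importance) directly from the shortest paths of successful functioning by exhaustive state enumeration; the path sets and element probabilities are invented examples, and the brute-force enumeration merely stands in for the orthogonalization algorithm, which is what makes large systems tractable.

```python
from itertools import product

def system_reliability(path_sets, p):
    """Exact reliability of a monotone system given its minimal path sets.
    path_sets: list of sets of element ids; p: element id -> probability of working."""
    elems = sorted({e for ps in path_sets for e in ps})
    rel = 0.0
    for states in product((0, 1), repeat=len(elems)):
        up = dict(zip(elems, states))
        if any(all(up[e] for e in ps) for ps in path_sets):   # is the system operable?
            prob = 1.0
            for e in elems:
                prob *= p[e] if up[e] else 1.0 - p[e]
            rel += prob
    return rel

def element_weight(path_sets, p, e):
    """Birnbaum importance ('weight') of element e: R(p_e = 1) - R(p_e = 0)."""
    hi, lo = dict(p), dict(p)
    hi[e], lo[e] = 1.0, 0.0
    return system_reliability(path_sets, hi) - system_reliability(path_sets, lo)

# Hypothetical structure with three shortest paths of successful functioning
paths = [{1, 2}, {3, 4}, {1, 4, 5}]
p = {1: 0.9, 2: 0.95, 3: 0.85, 4: 0.9, 5: 0.8}
print(round(system_reliability(paths, p), 4))
print({e: round(element_weight(paths, p, e), 4) for e in p})
```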
Procedia PDF Downloads 67
3091 Location3: A Location Scouting Platform for the Support of Film and Multimedia Industries
Authors: Dimitrios Tzilopoulos, Panagiotis Symeonidis, Michael Loufakis, Dimosthenis Ioannidis, Dimitrios Tzovaras
Abstract:
The domestic film industry in Greece has traditionally relied heavily on state support. While film productions are crucial for the country's economy, it has not fully capitalized on attracting and promoting foreign productions. The lack of motivation, organized state support for attraction and licensing, and the absence of location scouting have hindered its potential. Although recent legislative changes have addressed the first two of these issues, the development of a comprehensive location database and a search engine that would effectively support location scouting at the pre-production location scouting is still in its early stages. In addition to the expected benefits of the film, television, marketing, and multimedia industries, a location-scouting service platform has the potential to yield significant financial gains locally and nationally. By promoting featured places like cultural and archaeological sites, natural monuments, and attraction points for visitors, it plays a vital role in both cultural promotion and facilitating tourism development. This study introduces LOCATION3, an internet platform revolutionizing film production location management. It interconnects location providers, film crews, and multimedia stakeholders, offering a comprehensive environment for seamless collaboration. The platform's central geodatabase (PostgreSQL) stores each location’s attributes, while web technologies like HTML, JavaScript, CSS, React.js, and Redux power the user-friendly interface. Advanced functionalities, utilizing deep learning models, developed in Python, are integrated via Node.js. Visual data presentation is achieved using the JS Leaflet library, delivering an interactive map experience. LOCATION3 sets a new standard, offering a range of essential features to enhance the management of film production locations. Firstly, it empowers users to effortlessly upload audiovisual material enriched with geospatial and temporal data, such as location coordinates, photographs, videos, 360-degree panoramas, and 3D location models. With the help of cutting-edge deep learning algorithms, the application automatically tags these materials, while users can also manually tag them. Moreover, the application allows users to record locations directly through its user-friendly mobile application. Users can then embark on seamless location searches, employing spatial or descriptive criteria. This intelligent search functionality considers a combination of relevant tags, dominant colors, architectural characteristics, emotional associations, and unique location traits. One of the application's standout features is the ability to explore locations by their visual similarity to other materials, facilitated by a reverse image search. Also, the interactive map serves as both a dynamic display for locations and a versatile filter, adapting to the user's preferences and effortlessly enhancing location searches. To further streamline the process, the application facilitates the creation of location lightboxes, enabling users to efficiently organize and share their content via email. Going above and beyond location management, the platform also provides invaluable liaison, matchmaking, and online marketplace services. This powerful functionality bridges the gap between visual and three-dimensional geospatial material providers, local agencies, film companies, production companies, etc. 
so that those interested in a specific location can access additional material beyond what is stored on the platform, as well as access production services supporting the functioning and completion of productions in a location (equipment provision, transportation, catering, accommodation, etc.).Keywords: deep learning models, film industry, geospatial data management, location scouting
Procedia PDF Downloads 73
3090 The Use of Social Stories and Digital Technology as Interventions for Autistic Children; A State-Of-The-Art Review and Qualitative Data Analysis
Authors: S. Hussain, C. Grieco, M. Brosnan
Abstract:
Background and Aims: Autism is a complex neurobehavioural disorder, characterised by impairments in the development of language and communication skills. The study involved a state-of-art systematic review, in addition to qualitative data analysis, to establish the evidence for social stories as an intervention strategy for autistic children. An up-to-date review of the use of digital technologies in the delivery of interventions to autistic children was also carried out; to propose the efficacy of digital technologies and the use of social stories to improve intervention outcomes for autistic children. Methods: Two student researchers reviewed a range of randomised control trials and observational studies. The aim of the review was to establish if there was adequate evidence to justify recommending social stories to autistic patients. Students devised their own search strategies to be used across a range of search engines, including Ovid-Medline, Google Scholar and PubMed. Students then critically appraised the generated literature. Additionally, qualitative data obtained from a comprehensive online questionnaire on social stories was also thematically analysed. The thematic analysis was carried out independently by each researcher, using a ‘bottom-up’ approach, meaning contributors read and analysed responses to questions and devised semantic themes from reading the responses to a given question. The researchers then placed each response into a semantic theme or sub-theme. The students then joined to discuss the merging of their theme headings. The Inter-rater reliability (IRR) was calculated before and after theme headings were merged, giving IRR for pre- and post-discussion. Lastly, the thematic analysis was assessed by a third researcher, who is a professor of psychology and the director for the ‘Centre for Applied Autism Research’ at the University of Bath. Results: A review of the literature, as well as thematic analysis of qualitative data found supporting evidence for social story use. The thematic analysis uncovered some interesting themes from the questionnaire responses, relating to the reasons why social stories were used and the factors influencing their effectiveness in each case. However, overall, the evidence for digital technologies interventions was limited, and the literature could not prove a causal link between better intervention outcomes for autistic children and the use of technologies. However, they did offer valid proposed theories for the suitability of digital technologies for autistic children. Conclusions: Overall, the review concluded that there was adequate evidence to justify advising the use of social stories with autistic children. The role of digital technologies is clearly a fast-emerging field and appears to be a promising method of intervention for autistic children; however, it should not yet be considered an evidence-based approach. The students, using this research, developed ideas on social story interventions which aim to help autistic children.Keywords: autistic children, digital technologies, intervention, social stories
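As a small, concrete illustration of the inter-rater reliability (IRR) step described above, the sketch below computes Cohen's kappa for two coders assigning questionnaire responses to theme headings; the theme labels are invented for the example, and kappa is assumed here as the IRR statistic rather than taken from the paper.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes from the two student researchers, before merging headings
coder_1 = ["support", "visual", "support", "routine", "visual", "routine"]
coder_2 = ["support", "visual", "routine", "routine", "visual", "support"]
print(round(cohens_kappa(coder_1, coder_2), 2))   # 0.5 for this toy example
```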
Procedia PDF Downloads 122
3089 An Overview of Evaluations Using Augmented Reality for Assembly Training Tasks
Authors: S. Werrlich, E. Eichstetter, K. Nitsche, G. Notni
Abstract:
Augmented Reality (AR) is a strong growing research topic in different training domains such as medicine, sports, military, education and industrial use cases like assembly and maintenance tasks. AR claims to improve the efficiency and skill-transfer of training tasks. This paper gives a comprehensive overview of evaluations using AR for assembly and maintenance training tasks published between 1992 and 2017. We search in a structured way in four different online databases and get 862 results. We select 17 relevant articles focusing on evaluating AR-based training applications for assembly and maintenance tasks. This paper also indicates design guidelines which are necessary for creating a successful application for an AR-based training. We also present five scientific limitations in the field of AR-based training for assembly tasks. Finally, we show our approach to solve current research problems using Design Science Research (DSR).Keywords: assembly, augmented reality, survey, training
Procedia PDF Downloads 282
3088 Category-Base Theory of the Optimum Signal Approximation Clarifying the Importance of Parallel Worlds in the Recognition of Human and Application to Secure Signal Communication with Feedback
Authors: Takuro Kida, Yuichi Kida
Abstract:
We present, mathematically, the basis of a new trend of algorithms that treats a historical reason for continuing discrimination in the world, as well as its solution, by introducing the new concept of a parallel world that includes an invisible set of errors as its companion. With respect to a matrix operator-filter bank in which the matrix operator-analysis-filter bank H and the matrix operator-sampling-filter bank S are given, firstly, we introduce the detailed algorithm to derive the optimum matrix operator-synthesis-filter bank Z that minimizes all the worst-case measures of the matrix operator-error-signals E(ω) = F(ω) − Y(ω) between the matrix operator-input-signals F(ω) and the matrix operator-output-signals Y(ω) of the matrix operator-filter bank at the same time. Further, feedback is introduced into the above approximation theory, and it is indicated that introducing conversations with feedback is not automatically superior to the accumulation of existing knowledge of signal prediction. Secondly, the concept of category in the field of mathematics is applied to the above optimum signal approximation, and it is indicated that the category-based approximation theory applies to the set-theoretic consideration of the recognition of humans. Based on this discussion, it is shown naturally why the narrow perception that tends to create isolation shows an apparent advantage in the short term and, often, why such narrow thinking becomes intimate with discriminatory action in a human group. Throughout these considerations, it is presented that, in order to abolish easy and intimate discriminatory behavior, it is important to create a parallel world of conception where we share the set of invisible error signals, including the words and the consciousness of both worlds. Keywords: signal prediction, pseudo inverse matrix, artificial intelligence, conditional optimization
Procedia PDF Downloads 158
3087 A Strength Weaknesses Opportunities and Threats Analysis of Socialisation Externalisation Combination and Internalisation Modes in Knowledge Management Practice: A Systematic Review of Literature
Authors: Aderonke Olaitan Adesina
Abstract:
Background: The paradigm shift to knowledge, as the key to organizational innovation and competitive advantage, has made the management of knowledge resources in organizations a mandate. A key component of the knowledge management (KM) cycle is knowledge creation, which is researched to be the result of the interaction between explicit and tacit knowledge. An effective knowledge creation process requires the use of the right model. The SECI (Socialisation, Externalisation, Combination, and Internalisation) model, proposed in 1995, is attested to be a preferred model of choice for knowledge creation activities. The model has, however, been criticized by researchers, who raise their concern, especially about its sequential nature. Therefore, this paper reviews extant literature on the practical application of each mode of the SECI model, from 1995 to date, with a view to ascertaining the relevance in modern-day KM practice. The study will establish the trends of use, with regards to the location and industry of use, and the interconnectedness of the modes. The main research question is, for organizational knowledge creation activities, is the SECI model indeed linear and sequential? In other words, does the model need to be reviewed in today’s KM practice? The review will generate a compendium of the usage of the SECI modes and propose a framework of use, based on the strength weaknesses opportunities and threats (SWOT) findings of the study. Method: This study will employ the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) methodology to investigate the usage and SWOT of the modes, in order to ascertain the success, or otherwise, of the sequential application of the modes in practice from 1995 to 2019. To achieve the purpose, four databases will be explored to search for open access, peer-reviewed articles from 1995 to 2019. The year 1995 is chosen as the baseline because it was the year the first paper on the SECI model was published. The study will appraise relevant peer-reviewed articles under the search terms: SECI (or its synonym, knowledge creation theory), socialization, externalization, combination, and internalization in the title, abstract, or keywords list. This review will include only empirical studies of knowledge management initiatives in which the SECI model and its modes were used. Findings: It is expected that the study will highlight the practical relevance of each mode of the SECI model, the linearity or not of the model, the SWOT in each mode. Concluding Statement: Organisations can, from the analysis, determine the modes of emphasis for their knowledge creation activities. It is expected that the study will support decision making in the choice of the SECI model as a strategy for the management of organizational knowledge resources, and in appropriating the SECI model, or its remodeled version, as a theoretical framework in future KM research.Keywords: combination, externalisation, internalisation, knowledge management, SECI model, socialisation
Procedia PDF Downloads 358
3086 Low Overhead Dynamic Channel Selection with Cluster-Based Spatial-Temporal Station Reporting in Wireless Networks
Authors: Zeyad Abdelmageid, Xianbin Wang
Abstract:
Choosing the operational channel for a WLAN access point (AP) in WLAN networks has been a static channel assignment process initiated by the user during the deployment process of the AP, which fails to cope with the dynamic conditions of the assigned channel at the station side afterward. However, the dramatically growing number of Wi-Fi APs and stations operating in the unlicensed band has led to dynamic, distributed, and often severe interference. This highlights the urgent need for the AP to dynamically select the best overall channel of operation for the basic service set (BSS) by considering the distributed and changing channel conditions at all stations. Consequently, dynamic channel selection algorithms which consider feedback from the station side have been developed. Despite the significant performance improvement, existing channel selection algorithms suffer from very high feedback overhead. Feedback latency from the STAs, due to the high overhead, can cause the eventually selected channel to no longer be optimal for operation due to the dynamic sharing nature of the unlicensed band. This has inspired us to develop our own dynamic channel selection algorithm with reduced overhead through the proposed low-overhead, cluster-based station reporting mechanism. The main idea behind the cluster-based station reporting is the observation that STAs which are very close to each other tend to have very similar channel conditions. Instead of requesting each STA to report on every candidate channel while causing high overhead, the AP divides STAs into clusters then assigns each STA in each cluster one channel to report feedback on. With the proper design of the cluster based reporting, the AP does not lose any information about the channel conditions at the station side while reducing feedback overhead. The simulation results show equal performance and, at times, better performance with a fraction of the overhead. We believe that this algorithm has great potential in designing future dynamic channel selection algorithms with low overhead.Keywords: channel assignment, Wi-Fi networks, clustering, DBSCAN, overhead
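A minimal sketch of the cluster-based reporting idea, using DBSCAN (named in the keywords) over STA positions and an arbitrary candidate-channel list; the coordinates, eps, and min_samples values are invented, and a real implementation might cluster on measured channel conditions rather than raw positions.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical STA positions (metres) around three rooms, and candidate channels
rng = np.random.default_rng(0)
positions = np.vstack([rng.normal(c, 2.0, size=(10, 2)) for c in ((0, 0), (30, 5), (15, 40))])
channels = [1, 6, 11, 36, 40]

# Nearby STAs are assumed to see similar channel conditions, so cluster them spatially
labels = DBSCAN(eps=5.0, min_samples=3).fit_predict(positions)

# Within each cluster, spread the candidate channels across member STAs so every
# channel is still covered while each STA reports on only one (noise points,
# label -1, are simply treated as one extra group here).
assignment = {}
for cluster in set(labels):
    members = np.where(labels == cluster)[0]
    for k, sta in enumerate(members):
        assignment[int(sta)] = channels[k % len(channels)]
print(assignment)
```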
Procedia PDF Downloads 121
3085 The Impact of Physical Exercise on Gestational Diabetes and Maternal Weight Management: A Meta-Analysis
Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei
Abstract:
Physiological changes during pregnancy, such as alterations in the circulatory, respiratory, and musculoskeletal systems, can negatively impact daily physical activity. This reduced activity is often associated with an increased risk of adverse maternal health outcomes, particularly gestational diabetes mellitus (GDM) and excessive weight gain. This meta-analysis aims to evaluate the effectiveness of structured physical exercise interventions during pregnancy in reducing the risk of GDM and managing maternal weight gain. A comprehensive search was conducted across six major databases: PubMed, Cochrane Library, EMBASE, Web of Science, ScienceDirect, and ClinicalTrials.gov, covering the period from database inception until 2023. Randomized controlled trials (RCTs) that explored the effects of physical exercise programs on pregnant women with low physical activity levels were included. The search was performed using EndNote and results were managed using RevMan (Review Manager) for meta-analysis. RCTs involving healthy pregnant women with low levels of physical activity or sedentary lifestyles were selected. These RCTs must have incorporated structured exercise programs during pregnancy and reported on outcomes related to GDM and maternal weight gain. From an initial pool of 5,112 articles, 65 RCTs (involving 11,400 pregnant women) met the inclusion criteria. Data extraction was performed, followed by a quality assessment of the selected studies using the Cochrane Risk of Bias tool. The meta-analysis was conducted using RevMan software, where pooled relative risks (RR) and weighted mean differences (WMD) were calculated using a random-effects model to address heterogeneity across studies. Sensitivity analyses, subgroup analyses (based on factors such as exercise intensity, duration, and pregnancy stage), and publication bias assessments were also conducted. Structured physical exercise during pregnancy led to a significant reduction in the risk of developing GDM (RR = 0.68; P < 0.001), particularly when the exercise program was performed throughout the pregnancy (RR = 0.62; P = 0.035). In addition, maternal weight gain was significantly reduced (WMD = −1.18 kg; 95% CI −1.54 to −0.85; P < 0.001). There were no significant adverse effects reported for either the mother or the neonate, confirming that exercise interventions are safe for both. This meta-analysis highlights the positive impact of regular moderate physical activity during pregnancy in reducing the risk of GDM and managing maternal weight gain. These findings suggest that physical exercise should be encouraged as a routine part of prenatal care. However, more research is required to refine exercise recommendations and determine the most effective interventions based on individual risk factors and pregnancy stages.Keywords: gestational diabetes, maternal weight management, meta-analysis, randomized controlled trials
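To make the pooling step concrete, here is a minimal sketch of DerSimonian–Laird random-effects pooling of relative risks on the log scale, a standard approach to random-effects RR pooling in tools such as RevMan; the three trials and their event counts are invented, not data from the review.

```python
import numpy as np

def pooled_rr_random_effects(events_t, n_t, events_c, n_c):
    """DerSimonian-Laird random-effects pooling of log relative risks."""
    events_t, n_t = np.asarray(events_t, float), np.asarray(n_t, float)
    events_c, n_c = np.asarray(events_c, float), np.asarray(n_c, float)
    log_rr = np.log((events_t / n_t) / (events_c / n_c))
    var = 1 / events_t - 1 / n_t + 1 / events_c - 1 / n_c      # variance of each log RR
    w = 1 / var
    fixed = np.sum(w * log_rr) / np.sum(w)
    q = np.sum(w * (log_rr - fixed) ** 2)                       # Cochran's Q
    tau2 = max(0.0, (q - (len(w) - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1 / (var + tau2)                                   # random-effects weights
    pooled = np.sum(w_star * log_rr) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return np.exp(pooled), np.exp(pooled - 1.96 * se), np.exp(pooled + 1.96 * se)

# Three hypothetical trials: GDM events / arm size for exercise vs control
rr, lo, hi = pooled_rr_random_effects([10, 8, 15], [150, 120, 200],
                                      [18, 14, 22], [150, 118, 210])
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```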
Procedia PDF Downloads 18
3084 An Analysis on Clustering Based Gene Selection and Classification for Gene Expression Data
Authors: K. Sathishkumar, V. Thiagarasu
Abstract:
Due to recent advances in DNA microarray technology, it is now feasible to obtain gene expression profiles of tissue samples at relatively low cost. Many scientists around the world take advantage of this gene profiling to characterize complex biological circumstances and diseases. Microarray techniques used in genome-wide gene expression and genome mutation analysis help scientists and physicians in understanding the pathophysiological mechanisms, in diagnosis and prognosis, and in choosing treatment plans. DNA microarray technology has now made it possible to simultaneously monitor the expression levels of thousands of genes during important biological processes and across collections of related samples. Elucidating the patterns hidden in gene expression data offers a tremendous opportunity for an enhanced understanding of functional genomics. However, the large number of genes and the complexity of biological networks greatly increase the challenges of comprehending and interpreting the resulting mass of data, which often consists of millions of measurements. A first step toward addressing this challenge is the use of clustering techniques, which are essential in the data mining process to reveal natural structures and identify interesting patterns in the underlying data. This work presents an analysis of several clustering algorithms proposed to deal with gene expression data effectively. The existing algorithms, such as the Support Vector Machine (SVM), the K-means algorithm, and evolutionary algorithms, are analyzed thoroughly to identify their advantages and limitations. The performance evaluation of the existing algorithms is carried out to determine the best approach. In order to improve the classification performance of the best approach in terms of accuracy, convergence behavior, and processing time, a hybrid clustering-based optimization approach has been proposed. Keywords: microarray technology, gene expression data, clustering, gene selection
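As a sketch of the clustering-based gene selection idea discussed above (not the hybrid optimization approach the authors propose), the snippet below clusters the genes of a synthetic expression matrix with K-means and keeps one representative gene per cluster; the matrix, the number of clusters, and the nearest-to-centroid selection rule are all assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical expression matrix: rows = tissue samples, columns = genes
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 500))

k = 20                                   # assumed number of gene clusters
km = KMeans(n_clusters=k, n_init=10, random_state=1).fit(X.T)   # cluster the genes

# Keep one representative per cluster: the gene closest to its cluster centroid
selected = []
for c in range(k):
    idx = np.where(km.labels_ == c)[0]
    dist = np.linalg.norm(X.T[idx] - km.cluster_centers_[c], axis=1)
    selected.append(int(idx[np.argmin(dist)]))
print(sorted(selected))                  # indices of the selected genes
```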
Procedia PDF Downloads 325
3083 Shade Effect on Photovoltaic Systems: A Comparison between String and Module-Based Solution
Authors: Iyad M. Muslih, Yehya Abdellatif
Abstract:
In general, shading will reduce the electrical power produced from PV modules and arrays in locations where shading is unavoidable or caused by dynamic moving parts. This reduction is based on the shade effect on the I-V curve of the PV module or array and how the DC/AC inverter can search and control the optimum value of power from this module or array configuration. This is a very complicated task due to different patterns of shaded PV modules and arrays. One solution presented by the inverter industry is to perform the maximum power point tracking (MPPT) at the module level rather than the series string level. This solution is supposed to reduce the shade effect on the total harvested energy. However, this isn’t necessarily the best solution to reduce the shade effect as will be shown in this study.Keywords: photovoltaic, shade effect, I-V curve, MPPT
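A deliberately simplified numerical illustration of the trade-off discussed above: it assumes a constant MPP voltage per module, MPP current proportional to irradiance, and no bypass diodes, so it only captures the nominal benefit of module-level MPPT that the study goes on to question; the module ratings and shading factors are invented.

```python
# Very simplified PV model: MPP current scales with irradiance, MPP voltage is fixed,
# bypass diodes and the true I-V curve shape are ignored.
V_MP, I_MP_STC = 30.0, 8.0              # assumed module MPP voltage (V) and current (A) at STC
irradiance = [1.0, 1.0, 0.4, 0.9]       # per-module shading factors (1.0 = unshaded)

i_mp = [g * I_MP_STC for g in irradiance]

# One MPPT per series string: a common current flows, so the most shaded module
# limits every module in the string.
string_level = len(i_mp) * V_MP * min(i_mp)

# One MPPT per module: each module sits at its own maximum power point.
module_level = sum(V_MP * i for i in i_mp)

print(f"string-level MPPT : {string_level:.0f} W")
print(f"module-level MPPT : {module_level:.0f} W")
```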
Procedia PDF Downloads 414
3082 Procedure for Recommendation of Archival Documents
Authors: Marlon J. Remedios, Maria T. Morell, Jesse D. Cano
Abstract:
Diffusion of and accessibility to historical collections are among the main objectives of the institutions that safeguard archival documents (General Archives). Several countries have web applications that try to make the large number of documents they hold accessible and public. Each of these sites has a set of features intended to facilitate access, navigability, and information search. Different sources of information include recommender systems as a way of customizing content. This paper aims at describing a procedure for recommending archival documents relevant to the user. To this end, the paper discusses the characteristics governing archival description, the elements and main techniques underlying the design of recommender systems, a set of rules to follow, how these rules operate, and the way in which they take advantage of domain knowledge. Finally, relevant issues in the design of the proposed tests are discussed, and the results obtained are shown. Keywords: archival document, recommender system, procedure, information management
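The procedure described above is rule-based and leans on archival description and domain knowledge; purely as a point of comparison, the sketch below shows a minimal content-based baseline that ranks invented archival descriptions against a user profile with TF-IDF and cosine similarity. It is not the authors' procedure.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical scope-and-content notes from archival descriptions
docs = [
    "Colonial administration correspondence, 18th century, trade licences",
    "Parish registers of baptisms and marriages, 19th century",
    "Port authority records, shipping manifests and trade licences",
    "Military service records and pension files",
]
user_profile = "trade licences and shipping records"

tfidf = TfidfVectorizer(stop_words="english")
matrix = tfidf.fit_transform(docs + [user_profile])
scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
for i in scores.argsort()[::-1]:                     # most relevant documents first
    print(f"{scores[i]:.2f}  {docs[i]}")
```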
Procedia PDF Downloads 521
3081 The Use of Platelet-rich Plasma in the Treatment of Diabetic Foot Ulcers: A Scoping Review
Authors: Kiran Sharma, Viktor Kunder, Zerha Rizvi, Ricardo Soubelet
Abstract:
Platelet rich plasma (PRP) has been recognized as a method of treatment in medicine since the 1980s. It primarily functions by releasing cytokines and growth factors that promote wound healing; these growth promoting factors released by PRP enact new processes such as angiogenesis, collagen deposition, and tissue formation that can change wound healing outcomes. Many studies recognize that PRP aids in chronic wound healing, which is advantageous for patients who suffer from chronic diabetic foot ulcers (DFUs). This scoping review aims to examine literature to identify the efficacy of PRP use in the healing of DFUs. Following PRISMA guidelines, we searched randomized-controlled trials involving PRP use in diabetic patients with foot ulcers using PubMed, Medline, CINAHL Complete, and Cochrane Database of Systematic Reviews. We restricted the search to articles published during 2005-2022, full texts in the English language, articles involving patients aged 19 years or older, articles that used PRP on specifically DFUs, articles that included a control group, articles on human subjects. The initial search yielded 119 articles after removing duplicates. Final analysis for relevance yielded 8 articles. In all cases except one, the PRP group showed either faster healing, more complete healing, or a larger percentage of healed participants. There were no situations in the included studies where the control group had a higher rate of healing or decreased wound size as compared to a group with isolated PRP-only use. Only one study did not show conclusive evidence that PRP caused accelerated healing in DFUs, and this study did not have an isolated PRP variable group. Application styles of PRP for treatment were shown to influence the level of healing in patients, with injected PRP appearing to achieve the best results as compared to topical PRP application. However, this was not conclusive due to the involvement of several other variables. Two studies additionally found PRP to be useful in healing refractory DFUs, and one study found that PRP use in patients with additional comorbidities was still more effective in healing DFUs than the standard control groups. The findings of this review suggest that PRP is a useful tool in reducing healing times and improving rates of complete wound healing in DFUs. There is room for further research in the application styles of PRP before conclusive statements can be made on the efficacy of injected versus topical PRP healing based on the findings in this study. The results of this review provide a baseline for further research in PRP use in diabetic patients and can be used by both physicians and public health experts to guide future treatment options for DFUs.Keywords: diabetic foot ulcer, DFU, platelet rich plasma, PRP
Procedia PDF Downloads 76
3080 Production of Biodiesel Using Brine Waste as a Heterogeneous Catalyst
Authors: Hilary Rutto, Linda Sibali
Abstract:
In these modern times, we constantly search for new and innovative technologies to lift the burden of our extreme energy demand. The overall purpose of biofuel production research is to find an alternative energy source to replace the normal use of fossil fuels as liquid petroleum products. This experiment looks at the basis of biodiesel production with regard to the alternative catalysts that can be used to produce biodiesel. The key factors addressed during the experiments are temperature variation, catalyst addition to the overall reaction, the methanol-to-oil ratio, and the impact of agitation on the reaction. Brine samples sourced from nearby plants are evaluated and tested thoroughly, and the key characteristics of these brine samples are analysed to verify their use as a possible catalyst in biodiesel production. The one-factor-at-a-time experimental approach was used in this experiment, and the recycling and reuse characteristics of the heterogeneous catalyst were evaluated. Keywords: brine sludge, heterogeneous catalyst, biodiesel, one factor
Procedia PDF Downloads 173
3079 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform
Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee
Abstract:
This research presents a multi-modal simulation in the reconstruction of the past and the construction of present in digital cultural heritage on mobile platform. In bringing the present life, the virtual environment is generated through a presented scheme for rapid and efficient construction of 360° panoramic view. Then, acoustical heritage model and crowd model are presented and improvised into the 360° panoramic view. For the reconstruction of past life, the crowd is simulated and rendered in an old trading port. However, the keystone of this research is in a virtual walkthrough that shows the virtual present life in 2D and virtual past life in 3D, both in an environment of virtual heritage sites in George Town through mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on mobile platform. The 2D crowd is used to portray the present life in 360° panoramic view of a virtual heritage environment based on the extension of Newtonian Laws. Secondly, the 2D crowd is animated and rendered into 3D with improved variety and incorporated into the virtual past life using Unity3D Game Engine. The behaviours of the 3D models are then simulated based on the enhancement of the classical model of Boid algorithm. Finally, a demonstration system is developed and integrated with the models, techniques and algorithms of this research. The virtual walkthrough is demonstrated to a group of respondents and is evaluated through the user-centred evaluation by navigating around the demonstration system. The results of the evaluation based on the questionnaires have shown that the presented virtual walkthrough has been successfully deployed through a multi-modal simulation and such a virtual walkthrough would be particularly useful in a virtual tour and virtual museum applications.Keywords: Boid Algorithm, Crowd Simulation, Mobile Platform, Newtonian Laws, Virtual Heritage
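For reference, here is a minimal NumPy sketch of the classical Boid rules (separation, alignment, cohesion) that the enhanced 3D crowd behaviour builds on; the weights, neighbourhood radius, and 2D setting are arbitrary choices rather than the parameters used in the walkthrough.

```python
import numpy as np

def boids_step(pos, vel, dt=0.1, radius=2.0, w_sep=1.5, w_ali=1.0, w_coh=1.0, v_max=2.0):
    """One update of the classical Boid rules: separation, alignment, cohesion."""
    new_vel = vel.copy()
    for i in range(len(pos)):
        dist = np.linalg.norm(pos - pos[i], axis=1)
        nbr = (dist < radius) & (dist > 0)
        if not nbr.any():
            continue
        sep = np.sum(pos[i] - pos[nbr], axis=0)        # steer away from close neighbours
        ali = vel[nbr].mean(axis=0) - vel[i]           # match the neighbours' velocity
        coh = pos[nbr].mean(axis=0) - pos[i]           # move toward the local centre
        new_vel[i] += dt * (w_sep * sep + w_ali * ali + w_coh * coh)
        speed = np.linalg.norm(new_vel[i])
        if speed > v_max:                              # keep the speed bounded
            new_vel[i] *= v_max / speed
    return pos + dt * new_vel, new_vel

rng = np.random.default_rng(0)
pos, vel = rng.uniform(0, 10, (30, 2)), rng.normal(0, 1, (30, 2))
for _ in range(100):
    pos, vel = boids_step(pos, vel)
```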
Procedia PDF Downloads 278
3078 Construction and Analysis of Tamazight (Berber) Text Corpus
Authors: Zayd Khayi
Abstract:
This paper deals with the construction and analysis of a Tamazight text corpus. The grammatical structure of Tamazight remains poorly understood, and the lack of a comparative grammar leads to linguistic issues. In order to fill this gap, even if only in a small way, we constructed a diachronic corpus of the Tamazight language and elaborated a program tool for it. In addition, this work is devoted to constructing that tool to analyze the different aspects of Tamazight, with its different dialects used in North Africa, specifically in Morocco. It focuses on three Moroccan dialects: Tamazight, Tarifiyt, and Tachlhit. The Latin script was a good choice because of the many sources available in it. The corpus is based on the grammatical parameters and features of the language. The text collection contains more than 500 texts that cover a long historical period. It is free, and it will be useful for further investigations. The texts were transformed into XML format with standardization as the goal. The corpus counts more than 200,000 words. Based on linguistic rules and statistical methods, the original user interface and software prototype were developed by combining web design technologies and Python. The corpus provides users with the ability to distinguish easily between feminine and masculine nouns and verbs. The interface is available in three languages: TMZ, FR, and EN. The selected texts were not initially categorized; this work was done manually. Within corpus linguistics, there is currently no commonly accepted approach to the classification of texts; here, texts are divided into ten categories. To describe and represent the texts in the corpus, we elaborated the XML structure according to the TEI recommendations. The search function returns the types of words searched for, such as feminine/masculine nouns and verbs. Nouns are divided into two groups, since gender in the corpus has two forms. The neutral form of a word corresponds to the masculine, while the feminine is indicated by a double t-t affix (the prefix t- and the suffix -t), ex: Tarbat (girl), Tamtut (woman), Taxamt (tent), and Tislit (bride). However, there are some words whose feminine form contains only the prefix t- and the suffix -a, ex: Tasa (liver), tawja (family), and tarwa (progenitors). Generally, Tamazight masculine words have prefixes that distinguish them from other words, for instance 'a', 'u', 'i', ex: Asklu (tree), udi (cheese), ighef (head). Verbs in the corpus are given for the first person singular and plural, with the suffixes 'agh', 'ex', 'egh', ex: 'ghrex' (I study), 'fegh' (I go out), 'nadagh' (I call). The program tool supports the following operations over the corpus: listing all tokens, listing unique words, computing lexical diversity, and carrying out different grammatical queries. To conclude, this corpus has so far focused on only a small group of parts of speech in the Tamazight language, namely verbs and nouns. Work is still ongoing on adjectives, pronouns, adverbs, and others. Keywords: Tamazight (Berber) language, corpus linguistics, grammar rules, statistical methods
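As a tiny illustration of the affix rules quoted above, this sketch guesses the gender of a Latin-script noun from the t-…-t / t-…-a and a-/u-/i- patterns alone; real Tamazight morphology has many exceptions, and the function is an illustration, not part of the corpus tool itself.

```python
def guess_gender(noun):
    """Heuristic gender guess for a Latin-script Tamazight noun, using only the
    affix rules quoted in the abstract; real morphology has many exceptions."""
    n = noun.lower()
    if n.startswith("t") and (n.endswith("t") or n.endswith("a")):
        return "feminine"       # t-...-t (Tarbat) or t-...-a (Tasa) patterns
    if n[:1] in ("a", "u", "i"):
        return "masculine"      # a-/u-/i- prefixes (Asklu, udi, ighef)
    return "unknown"

for word in ["Tarbat", "Tamtut", "Taxamt", "Tislit", "Tasa", "tawja", "tarwa",
             "Asklu", "udi", "ighef"]:
    print(word, "->", guess_gender(word))
```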
Procedia PDF Downloads 69
3077 The Impact of Non-Interest Banking on Economic Development in Nigeria
Authors: Oduntan Kemi Olalekan
Abstract:
Nigeria, the largest economy in Africa, is still at a developing stage, as its economy cannot be termed developed; it is still in search of an economic policy that will positively affect the lives of the majority of its citizens. Several policies have been employed to address the situation, prominent among which is the Structural Adjustment Programme (SAP) of the Babangida administration, but it could not rescue the economy. Non-interest banking, otherwise known as Islamic banking, has been suggested as a means of developing the Nigerian economy, as it will enable more Nigerians to have access to working capital and contribute positively to the growth of the economy. The paper investigated the level of Nigeria's economic development, gave an overview of economic policies since independence, traced the genesis of non-interest banking in Nigeria, and made recommendations on the adoption of the policy as an antidote to Nigeria's economic underdevelopment. Keywords: economic development, Nigerian economy, non-interest banking, working capital, Islamic banking
Procedia PDF Downloads 395
3076 Overview and Pathophysiology of Radiation-Induced Breast Changes as a Consequence of Radiotherapy Toxicity
Authors: Monika Rezacova
Abstract:
Radiation-induced breast changes are a consequence of radiotherapy toxicity to the breast tissues, related either to targeted breast cancer treatment or to the treatment of other thoracic malignancies (e.g., lung cancer). This study provides an overview of the different changes and their pathophysiology. The main conditions included were skin thickening, interstitial oedema, fat necrosis, dystrophic calcifications, skin retraction, glandular atrophy, breast fibrosis, and radiation-induced breast cancer. A focused literature search was performed across multiple databases, including PubMed, Medline, and Embase, and both English and non-English publications were reviewed. As a result of the literature search, the study provides a comprehensive overview of radiation-induced breast changes and their pathophysiology, with a brief focus on new developments and prevention. Keywords: radiotherapy toxicity, breast tissue changes, breast cancer treatment, radiation-induced breast changes
Procedia PDF Downloads 161
3075 Determination of Genetic Markers, Microsatellites Type, Linked to Milk Production Traits in Goats
Authors: Mohamed Fawzy Elzarei, Yousef Mohammed Al-Dakheel, Ali Mohamed Alseaf
Abstract:
Modern molecular techniques, such as single-marker analysis of traits linked to those markers, can provide rapid and accurate genetic results. In the last two decades of the last century, the application of molecular techniques reached an advanced stage in cattle, sheep, and pigs. In goats, especially in our region, the application of molecular techniques still lags far behind these species. As reported by many researchers, microsatellite markers are among the markers well suited to such studies. Single-marker analysis of traits of interest is one technique that allows early selection of animals without the need to map the entire genome. Its simplicity, applicability, and low cost have given this technique a wide range of applications in many areas of genetics and molecular biology. It also provides a useful approach for evaluating genetic differentiation, particularly in populations that are poorly characterized genetically. The estimated breeding value (EBV) and yield deviation (YD) are considered the main parameters for studying the linkage between quantitative characteristics and molecular markers, since these values are raw data corrected for non-genetic factors. A total of 17 microsatellite markers (from chromosomes 6, 14, 18, 20, and 23) were used in this study to search for regions that could be responsible for genetic variability in some milk traits and to search for chromosomal regions that explain part of the phenotypic variance. Results of single-marker analyses were used to identify linkage between the microsatellite markers and variation in the EBVs of the following traits: milk yield, protein percentage, fat percentage, litter size and weight at birth, and litter size and weight at weaning. In the estimates of the parameters from the forward and backward solutions of the stepwise regression procedure for the milk yield trait, only two markers, OARCP9 and AGLA29, showed a highly significant effect (p≤0.01) in both the backward and forward solutions. The forward solution for the different equations showed that the R² of these equations depended strongly on only two partial regression coefficients (βi) for these markers. For the milk protein trait, four markers showed a significant effect: BMS2361 and CSSM66 (p≤0.01), and BMS2626 and OARCP9 (p≤0.05). Similarly, four markers (MCM147, BM1225, INRA006, and INRA133) showed a highly significant effect (p≤0.01) in both the backward and forward solutions in association with the milk fat trait. For the litter size at birth and at weaning traits, only one marker each (BM143, p≤0.01, and RJH1, p≤0.05, respectively) showed a significant effect in the backward and forward solutions. In the estimates of the parameters from the forward and backward solutions of the stepwise regression procedure for the litter weight at birth (LWB) trait, only one marker (MCM147) showed a highly significant effect (p≤0.01), and two markers (ILSTS011, CSSM66) showed a significant effect (p≤0.05) in the backward and forward solutions. Keywords: microsatellite markers, estimated breeding value, stepwise regression, milk traits
Procedia PDF Downloads 94
3074 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm
Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra
Abstract:
With the development of optics, electronics, and computers, vision systems are increasingly used in various areas of life, science, and industry. Vision systems have a huge number of applications. They can be used in quality control, object detection, data reading (e.g., QR codes), etc. A large part of them is used for measurement purposes. Some of them make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data that can be acquired by active or passive methods. Due to the specific application in airfield technology, only passive methods are applicable because of other existing systems working on the site, which can be blinded at most spectral levels. Furthermore, the reconstruction is required to work over long distances, ranging from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to those requirements, HRESS (Heliport REmote Safeguard System) was developed; its main part is a rotational head with a two-camera stereovision rig gathering images around the head over 360 degrees, along with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system makes it possible to obtain increased distance measurement resolution and an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the program algorithm introduces operations enabling the filtering of erroneously collected data in the point cloud. All activities on the programming, mechanical, and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area. Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction
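The long-range accuracy claim ultimately rests on the classical stereo relation Z = f·B/d; the sketch below shows how a quarter-pixel (sub-pixel) change in disparity shifts the estimated range at the kilometre scale, using assumed focal-length and baseline values rather than the actual HRESS rig parameters.

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Classical stereo relation Z = f * B / d for a rectified camera pair."""
    return float("inf") if disparity_px <= 0 else focal_px * baseline_m / disparity_px

# Assumed rig values for illustration only (not the HRESS parameters)
focal_px, baseline_m = 8000.0, 1.0
for d in (8.0, 8.25):   # a quarter-pixel sub-pixel refinement of the disparity
    print(f"disparity {d:5.2f} px -> range {depth_from_disparity(d, focal_px, baseline_m):7.1f} m")
```

At roughly one kilometre of range, that quarter-pixel change already shifts the estimate by about 3%, which is why sub-pixel matching matters at these distances.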
Procedia PDF Downloads 127
3073 Molecular Defects Underlying Genital Ambiguity in Egyptian Patients: A Systematic Review
Authors: Y. Z. Gad
Abstract:
Disorders of Sex Development (DSD) are defined as congenital conditions in which development of chromosomal, gonadal or anatomical sex is atypical. The DSD are relatively prevalent in Egypt. In spite of that, the relative rarity of the individual disease types or their molecular pathologies frequently resulted in reporting on single or few cases. This augmented the challenging nature of phenotype-genotype correlation in this disease group and its utilization in the management of such medical emergency. Through critical assessment of the published DSD reports, the current review aims at analyzing the clinical characteristics of the various DSD forms in relation to the underlying molecular pathologies. A systematic literature search was done in Pubmed, using relevant keywords (Egypt versus DSD, genital ambiguity or ambiguous genitalia, the old terms of 'intersex, hermaphroditism and pseudohermaphroditism', and a list of the DSD entities and their related genes). The search yielded 24 reports of molecular data in Egyptian patients presenting with ambiguous genitalia. However, only 21 publications fulfilled the criteria of inclusion of detailed clinical descriptions and definitive molecular diagnoses of individual patients. Curation of the data yielded a total of 53 cases that were ascertained from 40 families. Fifty-one patients present with ambiguous genitalia only while 2 had multiple congenital anomalies. Parental consanguinity was noted in 60% of cases. Sex of rearing at initial presentation was female in 75% and 60% in 46,XY and 46,XX DSD cases, respectively. The external genital phenotype in 2/3 of the 46,XY DSD cases showed moderate undermasculinization [Quigley scores 3 & 4] and 1/3 had severe presentations [scores 5 & 6]. For 46,XX subjects, 1 had severe virilization of the external genitalia while 8 had moderate phenotype. Hormonal data were inconclusive or contradictory to final diagnosis in a forth of cases. Collectively, 31 families [31/40, 77.5%] with 46,XY DSD had molecular defects in the genes, 5 alpha reductase 2 (SRD5A2) [12/31], 17 beta-hydroxysteroid dehydrogenase 3 [8/31], androgen receptor [7/31], Steroidogenic factor 1 [2/31], luteinizing hormone receptor [1/31], and fibroblast growth factor receptor 1 [1/31]. In a multiethnic study, 9 families afflicted with 46,XX DSD due to 11 beta hydroxylase (CYP11B1) deficiency were documented. Two recurrent mutations, G34R and N160D, in SRD5A2 were present, respectively, in 42 and 17% of cases. Similarly, 4 recurrent mutations resulted in 89% of the CYP11B1 presentations. In conclusion, this analysis highlights the importance of autosomal recessive inheritance and inbreeding among DSD presentations, the importance of founder effect in at least 2 disorders, the difficulties in relating the genotype with the indeterminate genital phenotype, the under-reporting of some DSD subtypes, and the notion that the reported mutational profiles among Egyptian DSD cases are relatively different from those reported in other ethnic groups.Keywords: disorders of sex development, genital ambiguity, mutation, molecular diagnosis, Egypt
Procedia PDF Downloads 139
3072 Research Opportunities in Business Process Management and Performance Measurement from a Constructivist View
Authors: R.T.O. Lacerda, L. Ensslin., S.R. Ensslin, L. Knoff
Abstract:
This research paper aims to discover research opportunities in business process management and performance measurement from a constructivist view. The nature of this research is exploratory and descriptive, and the research method is qualitative. The process narrowed down 2,142 articles gathered from a search of scientific databases and identified 16 articles that were relevant to the research and highly cited. The analysis found that most of the articles use a realistic approach and that there is a need to analyze the decision-making process in a singular manner. The measurement criteria are identified from the scientific literature, in most cases using ordinal scales, without any integration process to present the results to the decision maker. Regarding management aspects, most of the articles do not have a structured process to measure the current situation and generate improvement opportunities. Keywords: performance measurement, BPM, decision, research opportunities
Procedia PDF Downloads 314
3071 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm
Authors: Annalakshmi G., Sakthivel Murugan S.
Abstract:
This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. The classification of coral reef images using texture features is difficult due to the dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed method, local directional encoded derivative binary pattern (LDEDBP). The proposed approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts the edge information using the local directional pattern (LDP) from the edge response available in a particular region, thereby achieving an extra discriminative feature value. Typically, the LDP extracts the edge details in all eight directions. The process of integrating edge responses along with the local binary pattern achieves a more robust texture descriptor than the other descriptors used in texture feature extraction methods. Finally, the proposed technique is applied to an extreme learning machine (ELM) with a meta-heuristic algorithm known as the weighted distance grey wolf optimizer (GWO) to optimize the input weights and biases of single-hidden-layer feed-forward neural networks (SLFN). In the empirical results, ELM-WDGWO demonstrated better performance in terms of accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms. The proposed method achieves the highest overall classification accuracy of 94% compared to the other state-of-the-art methods. Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization
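To make the LBP component concrete, the sketch below extracts a uniform local binary pattern histogram from a greyscale patch with scikit-image; the directional (LDP) and derivative encodings that complete the proposed LDEDBP descriptor, and the ELM-WDGWO classifier, are not reproduced here.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray, points=8, radius=1):
    """Normalised uniform-LBP histogram of a greyscale patch (the LBP part only)."""
    codes = local_binary_pattern(gray, P=points, R=radius, method="uniform")
    n_bins = points + 2                  # uniform patterns plus one 'non-uniform' bin
    hist, _ = np.histogram(codes, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Random stand-in for a coral reef image patch
patch = np.random.default_rng(0).integers(0, 256, size=(64, 64)).astype(np.uint8)
print(lbp_histogram(patch).round(3))
```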
Procedia PDF Downloads 164
3070 The Design and Implementation of an Enhanced 2D Mesh Switch
Authors: Manel Langar, Riad Bourguiba, Jaouhar Mouine
Abstract:
In this paper, we propose the design and implementation of an enhanced wormhole virtual channel on-chip router. It is the heart of a mesh NoC using the XY deterministic routing algorithm. It is characterized by its simple virtual channel allocation strategy, which reduces the area and complexity of connections without affecting performance. We implemented our router on a Tezzaron process to validate its performance. This router is a basic element that will be used later to design a 3D mesh NoC. Keywords: NoC, mesh, router, 3D NoC
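For reference, a minimal sketch of the XY deterministic routing rule used by the router: a packet first travels along the X dimension until it reaches the destination column and only then along Y, which keeps routing simple and deadlock-free on a 2D mesh; the coordinates are arbitrary.

```python
def xy_route(src, dst):
    """Deterministic XY routing on a 2D mesh NoC: resolve the X offset first, then Y."""
    (x, y), (dst_x, dst_y) = src, dst
    path = [(x, y)]
    while x != dst_x:                    # hop along X until the destination column
        x += 1 if dst_x > x else -1
        path.append((x, y))
    while y != dst_y:                    # then hop along Y to the destination row
        y += 1 if dst_y > y else -1
        path.append((x, y))
    return path

print(xy_route((0, 0), (2, 3)))
# [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (2, 3)]
```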
Procedia PDF Downloads 569
3069 Investigating the Algorithm to Maintain a Constant Speed in the Wankel Engine
Authors: Adam Majczak, Michał Bialy, Zbigniew Czyż, Zdzislaw Kaminski
Abstract:
Increasingly stringent emission standards for passenger cars require us to find alternative drives. The share of electric vehicles in the sale of new cars increases every year. However, their performance and, above all, range cannot be today successfully compared to those of cars with a traditional internal combustion engine. Battery recharging lasts hours, which can be hardly accepted due to the time needed to refill a fuel tank. Therefore, the ways to reduce the adverse features of cars equipped with electric motors only are searched for. One of the methods is a combination of an electric engine as a main source of power and a small internal combustion engine as an electricity generator. This type of drive enables an electric vehicle to achieve a radically increased range and low emissions of toxic substances. For several years, the leading automotive manufacturers like the Mazda and the Audi together with the best companies in the automotive industry, e.g., AVL have developed some electric drive systems capable of recharging themselves while driving, known as a range extender. An electricity generator is powered by a Wankel engine that has seemed to pass into history. This low weight and small engine with a rotating piston and a very low vibration level turned out to be an excellent source in such applications. Its operation as an energy source for a generator almost entirely eliminates its disadvantages like high fuel consumption, high emission of toxic substances, or short lifetime typical of its traditional application. The operation of the engine at a constant rotational speed enables a significant increase in its lifetime, and its small external dimensions enable us to make compact modules to drive even small urban cars like the Audi A1 or the Mazda 2. The algorithm to maintain a constant speed was investigated on the engine dynamometer with an eddy current brake and the necessary measuring apparatus. The research object was the Aixro XR50 rotary engine with the electronic power supply developed at the Lublin University of Technology. The load torque of the engine was altered during the research by means of the eddy current brake capable of giving any number of load cycles. The parameters recorded included speed and torque as well as a position of a throttle in an inlet system. Increasing and decreasing load did not significantly change engine speed, which means that control algorithm parameters are correctly selected. This work has been financed by the Polish Ministry of Science and Higher Education.Keywords: electric vehicle, power generator, range extender, Wankel engine
Procedia PDF Downloads 157
3068 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series
Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold
Abstract:
To address the global challenges of climate and environmental change, there is a need for quantifying and reducing uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, ChinaFLUX, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys, and remote sensing assessments, there are some serious concerns regarding the challenges associated with the technique, e.g., data gaps and uncertainties. To address these concerns, this research has developed an ensemble model to fill the data gaps of CO₂ flux in order to avoid the limitations of using a single algorithm and, therefore, provide lower error and reduce the uncertainties associated with the gap-filling process. In this study, the data of five towers in the OzFlux network (Alice Springs Mulga, Calperum, Gingin, Howard Springs, and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed a slight superiority over each single FFNN and over XGB used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively (3.54 provided by the best FFNN). The most significant improvement was in the estimation of the extreme diurnal values (during midday and sunrise) as well as the nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling. The towers, as well as seasonality, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity compared to Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. In addition, the performance difference between the ensemble model and its individual components was more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) than during the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher amount of plant photosynthesis, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy and robustness of CO₂ flux gap-filling. Therefore, ensemble machine learning models are potentially capable of improving data estimation and regression outcomes where there seems to be no more room for improvement using a single algorithm. Keywords: carbon flux, eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network
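A compact sketch of the two-layer idea described above, with synthetic predictors standing in for the meteorological drivers and small MLPs standing in for the five FFNN structures; a faithful reproduction would train on the flux-tower data and feed XGBoost out-of-fold first-layer predictions rather than the in-sample ones used here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from xgboost import XGBRegressor

# Synthetic stand-ins for drivers (radiation, temperature, VPD, ...) and CO2 flux
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))
y = 2 * X[:, 0] - X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(0, 0.3, 2000)
train, test = slice(0, 1500), slice(1500, None)

# Layer 1: several FFNNs with different hidden-layer structures
layer1 = [MLPRegressor(hidden_layer_sizes=h, max_iter=2000, random_state=i)
          for i, h in enumerate([(16,), (32,), (16, 8), (32, 16), (64,)])]
meta_train = np.column_stack([m.fit(X[train], y[train]).predict(X[train]) for m in layer1])
meta_test = np.column_stack([m.predict(X[test]) for m in layer1])

# Layer 2: XGBoost takes the FFNN outputs as input and produces the final estimate
xgb = XGBRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
xgb.fit(meta_train, y[train])
pred = xgb.predict(meta_test)
print(f"ensemble RMSE: {np.sqrt(np.mean((pred - y[test]) ** 2)):.3f}")
```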
Procedia PDF Downloads 1413067 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble
Authors: Jaehong Yu, Seoung Bum Kim
Abstract:
Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high-dimensional datasets, unsupervised dimensionality reduction is an important task. Unsupervised dimensionality reduction can generally be achieved by feature extraction or feature selection. In many situations, feature selection methods are more appropriate than feature extraction methods because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focused on unsupervised feature ranking methods, which evaluate features based on their importance scores. Recently, several unsupervised feature ranking methods have been developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance depending on the ensemble clustering solution, and they produce undesirable evaluation results if the clustering solutions are inaccurate. To address these limitations, we proposed an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble, and all evaluation results are combined into ensemble importance scores. Moreover, FRRM does not require the true number of clusters to be determined in advance, thanks to the multiple-k ensemble idea. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrated that the proposed FRRM outperformed the competitors.Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking
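The abstract does not spell out the scoring rule; one plausible realization of the random-subspace, multiple-k idea, in which each feature is scored by how well it separates clusters found in random feature subspaces with randomly drawn k, is sketched below. The per-feature score (an F-like ratio against cluster labels) and all constants are assumptions, not the authors' exact FRRM formulation.

```python
# Illustrative sketch of a random-subspace / multiple-k feature-ranking loop.
# The per-feature score and all constants are assumptions, not the exact FRRM method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.feature_selection import f_classif

def rank_features(X, n_rounds=50, subspace_size=5, k_range=(2, 8), seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    scores = np.zeros(n_features)
    counts = np.zeros(n_features)
    for _ in range(n_rounds):
        feats = rng.choice(n_features, size=min(subspace_size, n_features), replace=False)
        k = rng.integers(k_range[0], k_range[1] + 1)      # multiple-k: no fixed cluster count
        labels = KMeans(n_clusters=k, n_init=10,
                        random_state=int(rng.integers(10**6))).fit_predict(X[:, feats])
        f_vals, _ = f_classif(X[:, feats], labels)        # how well each feature separates the clusters
        scores[feats] += np.nan_to_num(f_vals)
        counts[feats] += 1
    return scores / np.maximum(counts, 1)                 # average score over the rounds a feature appeared in

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    informative = rng.normal(size=(300, 3)) + np.repeat([[0, 0, 0], [4, 4, 4]], 150, axis=0)
    noise = rng.normal(size=(300, 7))
    X = np.hstack([informative, noise])                   # first 3 columns carry the cluster structure
    print(np.round(rank_features(X), 2))                  # expect higher scores for columns 0-2
```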
Procedia PDF Downloads 3403066 Low-Cost Fog Edge Computing for Smart Power Management and Home Automation
Authors: Belkacem Benadda, Adil Benabdellah, Boutheyna Souna
Abstract:
The Internet of Things (IoT) is an unprecedented creation. Electronic objects are now able to interact, share, respond, and adapt to their environment on a much larger scale. The current spread of these modern means of connectivity and of solutions involving high-volume data exchange is affecting our way of life. The home is becoming an intelligent living space, suited not only to its occupants' circumstances and desires but also to system constraints, making daily life simpler and cheaper, increasing possibilities, and achieving a higher level of services and comfort (e.g., Internet access, teleworking, consumption monitoring, information search, etc.). This paper addresses the design and integration of a smart home; it also proposes an IoT solution that enables smart power consumption based on power-grid measurements and deep learning analysis.Keywords: array sensors, IoT, power grid, FPGA, embedded
Procedia PDF Downloads 1173065 From the Fields to the Concrete: Urban Development of Campo Mourão
Authors: Caio Fialho
Abstract:
The automobile incentive policies adopted in Brazil since the 1950s have created several problems in its cities, most visible in large centers such as São Paulo or Rio de Janeiro, but also strongly present in smaller cities, resulting in increased social and spatial inequality together with a drop in quality of life. The analyzed city, Campo Mourão, reflects these policies: initially planned to be compact and walkable, the city took other directions and currently suffers from poor urban mobility and social inequality in its urban environment, despite being a medium-sized Brazilian city. The research aims to understand and diagnose how these policies shaped the city and what their results are in Brazil's inland cities. Based on historical, bibliographical, and field research in the city, the result is a diagnosis of the problems faced and of how they can be reversed in pursuit of social equality and a better quality of life.Keywords: urban mobility, quality of life, social equality, substantiable
Procedia PDF Downloads 1863064 Modified Weibull Approach for Bridge Deterioration Modelling
Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight
Abstract:
State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions describing each condition state as a function of age cannot be derived directly from a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM model output, namely the Modified Weibull approach, which consists of a set of appropriate functions describing the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and a Metropolis-Hastings algorithm (MHA) based Markov chain Monte Carlo (MCMC) simulation technique for quantifying the uncertainty in model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. The inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests with a test data set. Results show that the proposed model is able not only to predict conditions at the network level accurately but also to capture the model uncertainties within a given confidence interval.Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models
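The abstract does not give the functional form, priors, or likelihood; purely as an illustration, a Metropolis-Hastings sampler for the shape and scale of a Weibull-type curve describing the fraction of elements still in a given condition state at age t might look like the following sketch. The error model, priors, proposal widths, and synthetic data are all assumptions, not the paper's model.

```python
# Illustrative Metropolis-Hastings sketch: sample the shape (beta) and scale (eta)
# of a Weibull-type curve p(t) = exp(-(t/eta)**beta) describing the fraction of
# bridge elements still in a condition state at age t. Likelihood, priors,
# proposal widths and the synthetic data are assumptions.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "inspection" data: ages and observed fractions with noise.
ages = np.arange(1, 16, dtype=float)
true_beta, true_eta = 2.0, 12.0
obs = np.exp(-(ages / true_eta) ** true_beta) + rng.normal(0, 0.03, ages.size)

def log_posterior(beta, eta, sigma=0.03):
    if beta <= 0 or eta <= 0:
        return -np.inf                      # flat priors on the positive axis (assumption)
    pred = np.exp(-(ages / eta) ** beta)
    return -0.5 * np.sum(((obs - pred) / sigma) ** 2)   # Gaussian error model (assumption)

def metropolis_hastings(n_iter=20000, step=(0.1, 0.5)):
    beta, eta = 1.0, 8.0                    # arbitrary starting point
    current = log_posterior(beta, eta)
    samples = []
    for _ in range(n_iter):
        cand_beta = beta + rng.normal(0, step[0])       # symmetric random-walk proposal
        cand_eta = eta + rng.normal(0, step[1])
        cand = log_posterior(cand_beta, cand_eta)
        if np.log(rng.uniform()) < cand - current:      # accept with prob min(1, ratio)
            beta, eta, current = cand_beta, cand_eta, cand
        samples.append((beta, eta))
    return np.array(samples[n_iter // 2:])              # discard burn-in

if __name__ == "__main__":
    chain = metropolis_hastings()
    print("posterior mean beta, eta:", chain.mean(axis=0))
    print("95% interval for beta:", np.percentile(chain[:, 0], [2.5, 97.5]))
```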
Procedia PDF Downloads 730