Search results for: neural network models
6162 Exploring the Neural Mechanisms of Communication and Cooperation in Children and Adults
Authors: Sara Mosteller, Larissa K. Samuelson, Sobanawartiny Wijeakumar, John P. Spencer
Abstract:
This study was designed to examine how humans are able to teach and learn semantic information as well as cooperate in order to jointly achieve sophisticated goals. Specifically, we are measuring individual differences in how these abilities develop from foundational building blocks in early childhood. The current study adapts a paradigm for novel noun learning developed by Samuelson, Smith, Perry, and Spencer (2011) to a hyperscanning paradigm (Cui, Bryant, and Reiss, 2012). This project measures coordinated brain activity between a parent and child using simultaneous functional near-infrared spectroscopy (fNIRS) in pairs of 2.5-, 3.5- and 4.5-year-old children and their parents. We are also separately testing pairs of adult friends. Children and parents, or adult friends, are seated across from one another at a table. The parent (in the developmental study) then teaches their child the names of novel toys. An experimenter then tests the child by presenting the objects in pairs and asking the child to retrieve one object by name. Children are asked to choose from both pairs of familiar objects and pairs of novel objects. In order to explore individual differences in cooperation with the same participants, each dyad plays a cooperative game of Jenga, in which their joint score is based on how many blocks they can remove from the tower as a team. A preliminary analysis of the noun-learning task showed that, when presented with 6 word-object mappings, children learned an average of 3 new words (50%) and that the number of objects learned by each child ranged from 2-4. Adults initially learned all of the new words but were variable in their later retention of the mappings, which ranged from 50-100%. We are currently examining differences in cooperative behavior during the Jenga game, including time spent discussing each move before it is made. Ongoing analyses are examining the social dynamics that might underlie the differences between words that were successfully learned and those that were not for each dyad, as well as the developmental differences observed in the study. Additionally, the Jenga game is being used to better understand individual and developmental differences in social coordination during a cooperative task. At a behavioral level, the analysis maps periods of joint visual attention between participants during the word learning and the Jenga game, using head-mounted eye trackers to assess each participant's first-person viewpoint during the session. We are also analyzing the coherence in brain activity between participants during novel word learning and Jenga play. The first hypothesis is that visual joint attention during the session will be positively correlated with both the number of words learned and with the number of blocks moved during Jenga before the tower falls. The next hypothesis is that successful communication of new words and success in the game will each be positively correlated with synchronized brain activity between the parent and child/the adult friends in cortical regions underlying social cognition, semantic processing, and visual processing. This study probes both the neural and behavioral mechanisms of learning and cooperation in a naturalistic, interactive and developmental context.
Keywords: communication, cooperation, development, interaction, neuroscience
Procedia PDF Downloads 258
6161 Non-Invasive Characterization of the Mechanical Properties of Arterial Walls
Authors: Bruno Ramaël, Gwenaël Page, Catherine Knopf-Lenoir, Olivier Baledent, Anne-Virginie Salsac
Abstract:
No routine technique currently exists for clinicians to measure the mechanical properties of vascular walls non-invasively. Most of the data available in the literature come from traction or dilatation tests conducted ex vivo on native blood vessels. The objective of the study is to develop a non-invasive characterization technique based on Magnetic Resonance Imaging (MRI) measurements of the deformation of vascular walls under pulsating blood flow conditions. The goal is to determine the mechanical properties of the vessels by inverse analysis, coupling imaging measurements and numerical simulations of the fluid-structure interactions. The hyperelastic properties are identified using SolidWorks and Ansys Workbench (ANSYS Inc.) through an optimization procedure. The vessel of interest targeted in the study is the common carotid artery. In vivo MRI measurements of the vessel anatomy and inlet velocity profiles were acquired along the facial vascular network on a cohort of 30 healthy volunteers:
- The time-evolution of the blood vessel contours and, thus, of the cross-section surface area was measured by 3D imaging angiography sequences of phase-contrast MRI.
- The blood flow velocity was measured using a 2D CINE MRI phase contrast (PC-MRI) method.
Reference arterial pressure waveforms were simultaneously measured in the brachial artery using a sphygmomanometer. The three-dimensional (3D) geometry of the arterial network was reconstructed by first creating an STL file from the raw MRI data using the open-source imaging software ITK-SNAP. The resulting geometry was then transformed with SolidWorks into volumes compatible with Ansys software. Tetrahedral meshes of the wall and fluid domains were built using the Ansys Meshing software, with a near-wall mesh refinement method in the case of the fluid domain to improve the accuracy of the fluid flow calculations. Ansys Structural was used for the numerical simulation of the vessel deformation and Ansys CFX for the simulation of the blood flow. The fluid-structure interaction simulations showed that the systolic and diastolic blood pressures of the common carotid artery could be taken as reference pressures to identify the mechanical properties of the different arteries of the network. The coefficients of the hyperelastic law were identified using Ansys design optimization for the common carotid artery. Under large deformations, a stiffness of 800 kPa is measured, which is of the same order of magnitude as the Young's modulus of collagen fibers. Areas of maximum deformation were highlighted near bifurcations. This study is a first step towards patient-specific characterization of the mechanical properties of the facial vessels. The method is currently applied on patients suffering from facial vascular malformations and on patients scheduled for facial reconstruction. Information on the blood flow velocity as well as on the vessel anatomy and deformability will be key to improve surgical planning in the case of such vascular pathologies.
Keywords: identification, mechanical properties, arterial walls, MRI measurements, numerical simulations
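As a hedged illustration of the inverse-analysis idea described above (not the authors' SolidWorks/Ansys workflow), the sketch below identifies an effective wall stiffness by matching measured lumen radii to a thin-walled-tube estimate; the geometry, pressures and radii are all illustrative assumptions.

```python
# Minimal sketch: identify an effective stiffness E by matching measured lumen
# radii to the thin-walled-tube estimate r(p) = r0 + p*r0^2/(E*h).
# All numbers below are illustrative, not the cohort's data.
import numpy as np
from scipy.optimize import minimize_scalar

r0, h = 3.0e-3, 0.5e-3                          # unloaded radius, wall thickness [m]
p = np.array([80.0, 100.0, 120.0]) * 133.322    # pressures [Pa] (mmHg -> Pa)
r_meas = np.array([3.10e-3, 3.13e-3, 3.15e-3])  # radii from MRI contours [m]

def misfit(E):
    r_model = r0 + p * r0**2 / (E * h)          # Laplace-law linear estimate
    return np.sum((r_model - r_meas)**2)

res = minimize_scalar(misfit, bounds=(1e4, 1e7), method="bounded")
print(f"identified effective stiffness: {res.x / 1e3:.0f} kPa")
```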
Procedia PDF Downloads 322
6160 Project Progress Prediction in Software Development Integrating Time Prediction Algorithms and Large Language Modeling
Authors: Dong Wu, Michael Grenn
Abstract:
Managing software projects effectively is crucial for meeting deadlines, ensuring quality, and managing resources well. Traditional methods often struggle with predicting project timelines accurately due to uncertain schedules and complex data. This study addresses these challenges by combining time prediction algorithms with Large Language Models (LLMs). It makes use of real-world software project data to construct and validate a model. The model takes detailed project progress data such as task completion dynamics, team interaction and development metrics as its input and outputs predictions of project timelines. To evaluate the effectiveness of this model, a comprehensive methodology is employed, involving simulations and practical applications in a variety of real-world software project scenarios. This multifaceted evaluation strategy is designed to validate the model's significant role in enhancing forecast accuracy and elevating overall management efficiency, particularly in complex software project environments. The results indicate that the integration of time prediction algorithms with LLMs has the potential to optimize software project progress management. These quantitative results suggest the effectiveness of the method in practical applications. In conclusion, this study demonstrates that integrating time prediction algorithms with LLMs can significantly improve the predictive accuracy and efficiency of software project management. This offers an advanced project management tool for the industry, with the potential to improve operational efficiency, optimize resource allocation, and ensure timely project completion.
Keywords: software project management, time prediction algorithms, large language models (LLMs), forecast accuracy, project progress prediction
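A hedged sketch of the kind of hybrid model the abstract describes; the feature names, labels, and the embed() placeholder standing in for an LLM embedding call are all assumptions, not the paper's implementation.

```python
# Sketch: combine numeric progress features with an LLM-derived embedding of
# free-text task notes, then fit a regressor for remaining project duration.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def embed(texts, dim=8):
    # placeholder for a real sentence-embedding / LLM API call
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(texts), dim))

numeric = np.array([[0.40, 12, 3],   # fraction done, commits/week, team size
                    [0.75, 20, 5],
                    [0.10, 4, 2]])
notes = ["blocked on API review", "testing phase", "requirements unclear"]
X = np.hstack([numeric, embed(notes)])
y = np.array([60.0, 20.0, 110.0])    # remaining days (illustrative labels)

model = GradientBoostingRegressor().fit(X, y)
print(model.predict(X[:1]))          # predicted remaining days for project 0
```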
Procedia PDF Downloads 84
6159 Physics Informed Deep Residual Networks Based Type-A Aortic Dissection Prediction
Abstract:
Purpose: Acute Type A aortic dissection is a well-known cause of an extremely high mortality rate. A highly accurate and cost-effective non-invasive predictor is critically needed so that the patient can be treated at an earlier stage. Although various CFD approaches have been tried to establish prediction frameworks, they are sensitive to uncertainty in both image segmentation and boundary conditions. Tedious pre-processing and demanding calibration requirements further compound the issue, thus hampering their clinical applicability. Using the latest physics-informed deep learning methods to establish an accurate and cost-effective predictor framework is among the main goals for better Type A aortic dissection treatment. Methods: By training a novel physics-informed deep residual network, with non-invasive 4D MRI displacement vectors as inputs, the trained model can cost-effectively calculate all of these biomarkers: aortic blood pressure, WSS, and OSI, which are used to predict potential Type A aortic dissection and thereby avoid high-mortality events down the road. Results: The proposed deep learning method has been successfully trained and tested with both a synthetic 3D aneurysm dataset and a clinical dataset in the aortic dissection context using the Google Colab environment. In both cases, the model has generated aortic blood pressure, WSS, and OSI results matching the expected patient health status. Conclusion: The proposed novel physics-informed deep residual network shows great potential to create a cost-effective, non-invasive predictor framework. An additional physics-based de-noising algorithm will be added to make the model more robust to clinical data noise. Further studies will be conducted in collaboration with large institutions such as the Cleveland Clinic, with more clinical samples, to further improve the model's clinical applicability.
Keywords: type-A aortic dissection, deep residual networks, blood flow modeling, data-driven modeling, non-invasive diagnostics, deep learning, artificial intelligence
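A minimal sketch of a residual network of the type named in the abstract, assuming a PyTorch implementation with dummy data; the physics-residual term is left as a placeholder comment, since the paper's loss formulation is not given.

```python
# Sketch: residual MLP mapping displacement-derived features to three
# biomarkers (pressure, WSS, OSI), trained with a data loss; a physics
# penalty would be added where indicated. Shapes and data are dummies.
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    def __init__(self, d):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d, d), nn.Tanh(), nn.Linear(d, d))
    def forward(self, x):
        return x + self.net(x)                 # identity skip connection

model = nn.Sequential(nn.Linear(12, 64), ResBlock(64), ResBlock(64), nn.Linear(64, 3))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(256, 12)                       # 4D-MRI displacement features (dummy)
y = torch.randn(256, 3)                        # pressure, WSS, OSI targets (dummy)
for _ in range(100):
    loss = nn.functional.mse_loss(model(x), y)  # + lambda * physics_residual(...)
    opt.zero_grad(); loss.backward(); opt.step()
```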
Procedia PDF Downloads 94
6158 Discrete-Event Modeling and Simulation Methodologies: Past, Present and Future
Authors: Gabriel Wainer
Abstract:
Modeling and Simulation (M&S) methods have been used to better analyze the behavior of complex physical systems, and it is now common to use simulation as a part of the scientific and technological discovery process. M&S advanced thanks to improvements in computer technology, which, in many cases, resulted in the development of simulation software using ad-hoc techniques. Formal M&S methods appeared in an effort to improve the development of very complex simulation systems. Some of these techniques proved successful in providing a sound base for the development of discrete-event simulation models, improving the ease of model definition and enhancing application development tasks, reducing costs and favoring reuse. The DEVS formalism is one of these techniques, which proved successful in providing means for modeling while reducing development complexity and costs. DEVS model development is based on a sound theoretical framework. The independence of M&S tasks made it possible to run DEVS models on different environments (personal computers, parallel computers, real-time equipment, and distributed simulators) and middleware. We will present a historical perspective of discrete-event M&S methodologies, showing different modeling techniques. We will introduce DEVS origins and general ideas, and compare it with some of these techniques. We will then show the current status of DEVS M&S, and we will discuss a technological perspective to solve current M&S problems (including real-time simulation, interoperability, and model-centered development techniques). We will show some examples of the current use of DEVS, including applications in different fields. We will finally present current open topics in the area, which include advanced methods for centralized, parallel or distributed simulation, the need for real-time modeling techniques, and our view on these fields.
Keywords: modeling and simulation, discrete-event simulation, hybrid systems modeling, parallel and distributed simulation
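To make the DEVS structure concrete, here is a didactic toy atomic model in plain Python showing the formalism's four functions (time advance, internal and external transitions, output); it is a sketch, not one of the production DEVS environments discussed above.

```python
# Toy atomic DEVS model: a processor that is idle until a job arrives,
# stays busy for a fixed service time, then emits "done" and returns to idle.
INFINITY = float("inf")

class Processor:
    def __init__(self, service_time=2.0):
        self.phase, self.sigma, self.st = "idle", INFINITY, service_time
    def time_advance(self):                    # ta(s): time until next internal event
        return self.sigma
    def ext_transition(self, elapsed, job):    # delta_ext(s, e, x): job arrives
        if self.phase == "idle":
            self.phase, self.sigma = "busy", self.st
    def int_transition(self):                  # delta_int(s): service completed
        self.phase, self.sigma = "idle", INFINITY
    def output(self):                          # lambda(s): emitted just before delta_int
        return "done" if self.phase == "busy" else None
```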
Procedia PDF Downloads 325
6157 Establishment and Validation of Correlation Equations to Estimate Volumetric Oxygen Mass Transfer Coefficient (KLa) from Process Parameters in Stirred-Tank Bioreactors Using Response Surface Methodology
Authors: Jantakan Jullawateelert, Korakod Haonoo, Sutipong Sananseang, Sarun Torpaiboon, Thanunthon Bowornsakulwong, Lalintip Hocharoen
Abstract:
Process scale-up is essential for biological processes to increase production capacity from bench-scale bioreactors to either pilot or commercial production. Scale-up based on a constant volumetric oxygen mass transfer coefficient (KLa) is most often used as a scale-up factor, since oxygen supply is one of the key limiting factors for cell growth. However, estimating the KLa of culture vessels operated under different conditions is time-consuming, since it is considerably influenced by many factors. To overcome this issue, this study aimed to establish correlation equations between KLa and operating parameters in 0.5 L and 5 L bioreactors equipped with a pitched-blade impeller and a gas sparger. Temperature, gas flow rate, agitation speed, and impeller position were selected as process parameters, and equations were created using response surface methodology (RSM) based on a central composite design (CCD). In addition, the effects of these parameters on KLa were also investigated. Based on RSM, second-order polynomial models for the 0.5 L and 5 L bioreactors were obtained with acceptable determination coefficients (R²) of 0.9736 and 0.9190, respectively. These models were validated, and experimental values showed differences of less than 10% from the predicted values. Moreover, RSM revealed that gas flow rate is the most significant parameter, while temperature and agitation speed were also found to greatly affect the KLa in both bioreactors. Nevertheless, impeller position was shown to influence KLa only in the 5 L system. To sum up, these modeled correlations can be used to accurately predict KLa within the specified range of process parameters of two different sizes of bioreactors for further scale-up application.
Keywords: response surface methodology, scale-up, stirred-tank bioreactor, volumetric oxygen mass transfer coefficient
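The RSM step can be illustrated with a short sketch: fitting a second-order polynomial in the four process parameters, as the paper does, but on made-up data rows (scikit-learn is assumed here in place of the authors' statistical package).

```python
# Sketch: quadratic response surface for KLa over temperature [deg C],
# gas flow [vvm], agitation [rpm], impeller position (coded). Data are made up.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

X = np.array([[30, 1.0, 400, 1], [37, 1.5, 600, 2], [30, 2.0, 800, 1],
              [37, 1.0, 800, 2], [30, 1.5, 400, 2], [37, 2.0, 600, 1]])
y = np.array([12.0, 21.5, 30.2, 26.8, 14.9, 25.1])   # KLa [1/h], illustrative

rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)
print("R^2 on training data:", rsm.score(X, y))
print("predicted KLa:", rsm.predict([[34, 1.2, 500, 1]]))
```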
Procedia PDF Downloads 211
6156 Transforming Maternity and Neonatal Services in a Middle Eastern Country
Authors: M. A. Brown, K. Hugill, D. Meredith
Abstract:
Since the establishment of midwifery as a professional identity in its own right in the early years of the 20th century, midwifery-led models of childbirth have prevailed in many parts of the world. However, in many locations midwives' scope of practice remains underdeveloped or absent. In Qatar, all births take place in hospital and are under the professional jurisdiction of obstetricians, predominately supported by internationally trained nurse-midwives and obstetric nurses. The strategic vision for health services in Qatar endorsed a desire to provide women with the 'Best Care Always', and the introduction of midwifery was seen as a way to achieve this. In 2015, the process of recruiting postgraduate-educated Clinical Midwife Specialists from international sources began. The midwives were brought together to initiate an in-hospital and community service transformation plan. This plan set out a series of wide-ranging actions to transform maternity and neonatal services to make care safer and give women more health choices. Change in any organization is a complex and dynamic process. This is made even more complex when multifaceted professional and cross-cultural factors are involved. This presentation reports on the motivations, challenges, and progress around introducing a multicultural midwifery model of childbirth care in the state of Qatar. The paper examines and reflects upon the drivers and unique features of childbirth in the country. Despite accomplishments, progress still needs to be made in order to fully implement sustainable changes to further improve care and ensure women and neonates get the 'Best Care Always'. The progress within the transformation plan highlights how midwifery may coexist with competing models of maternity care to create an innovative, eclectic and culturally sensitive paradigm that can best serve women's and neonates' health needs.
Keywords: culture, managing change, midwifery, neonatal, service transformation plan
Procedia PDF Downloads 150
6155 Modeling Socioeconomic and Political Dynamics of Terrorism in Pakistan
Authors: Syed Toqueer, Omer Younus
Abstract:
Terrorism, today, has emerged as a global menace, with Pakistan being among the most adversely affected states. Therefore, the motive behind this study is to empirically establish the linkage of terrorism with socio-economic factors (uneven income distribution, poverty and unemployment) and political nexuses so that a policy recommendation can be put forth to better approach this issue in Pakistan. For this purpose, the study employs two competing models, namely, the distributed lag model and OLS, so that the findings may be consolidated comprehensively over the reference period of 1984-2012. The findings of both models indicate that uneven income distribution in Pakistan is a contributing factor towards terrorism when measured through GDP per capita. This supports the hypothesis that the immiserizing modernization theory is applicable to the state of Pakistan, where the underprivileged are marginalized. Results also suggest that other socio-economic variables (poverty, unemployment and consumer confidence) can condense the brutality of terrorism once these conditions are catered to and improved. The rationale of opportunity cost is at the base of this argument: poor conditions of employment and poverty reduce the opportunity cost for individuals to be recruited by terrorist organizations, as economic returns are considerably low, thus increasing the supply of volunteers and subsequently the intensity of terrorism. The argument of political freedom as a means of lowering terrorism stands true: the more people are politically repressed, the more alternative and illegal means they will find to make their voices heard. Also, the argument that a politically transitioning economy faces more terrorism is found applicable to Pakistan. Finally, the study contributes to an ongoing debate on which of the two sets of factors is more significant in relation to terrorism by suggesting that socio-economic factors are the primary causes of terrorism for Pakistan.
Keywords: terrorism, socioeconomic conditions, political freedom, distributed lag model, ordinary least squares
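A hedged sketch of the kind of specification the study describes, on synthetic placeholder series rather than the paper's actual data: an OLS regression with a one-year distributed lag, using statsmodels.

```python
# Sketch: regress annual terrorism incidents on lagged GDP per capita and
# unemployment over 1984-2012. All series below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "incidents": rng.poisson(50, 29).astype(float),
    "gdp_pc": np.linspace(400, 1200, 29),
    "unemployment": rng.uniform(4, 9, 29),
}, index=pd.RangeIndex(1984, 2013, name="year"))

df["gdp_pc_l1"] = df["gdp_pc"].shift(1)        # one-year distributed lag
model = sm.OLS(df["incidents"].iloc[1:],
               sm.add_constant(df[["gdp_pc_l1", "unemployment"]].iloc[1:]))
print(model.fit().summary().tables[1])
```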
Procedia PDF Downloads 326
6154 Crude Oil and Stock Markets: Prices and Uncertainty Transmission Analysis
Authors: Kamel Malik Bensafta, Gervasio Semedo
Abstract:
The purpose of this paper is to investigate the relationship between oil prices and stock markets. The empirical analysis in this paper is conducted within the context of multivariate GARCH models, using a transformed version of the so-called BEKK parameterization. We show that the mean and uncertainty of the US market are transmitted to the oil market and the European market. We also identify an important transmission from WTI prices to Brent prices.
Keywords: oil volatility, stock markets, MGARCH, transmission, structural break
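For concreteness, a sketch of the BEKK(1,1) covariance recursion that underlies this class of MGARCH models, H_t = C'C + A' r_{t-1} r_{t-1}' A + B' H_{t-1} B; the parameter values and returns below are illustrative, not estimates from the oil/stock data.

```python
# Sketch: two-asset BEKK(1,1) conditional-covariance recursion.
import numpy as np

C = np.array([[0.1, 0.0], [0.05, 0.1]])       # lower-triangular intercept
A = np.diag([0.30, 0.25])                     # shock (ARCH) loadings
B = np.diag([0.90, 0.92])                     # persistence (GARCH) loadings

rng = np.random.default_rng(0)
r = rng.normal(scale=0.01, size=(500, 2))     # oil and stock returns (dummy)
H = np.cov(r.T)                               # initialize with sample covariance
for t in range(1, len(r)):
    rr = np.outer(r[t - 1], r[t - 1])
    H = C.T @ C + A.T @ rr @ A + B.T @ H @ B  # conditional covariance update
print("final conditional covariance:\n", H)
```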
Procedia PDF Downloads 528
6153 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data
Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos
Abstract:
Introduction: Despite the large share of neonatal mortality and morbidity attributable to low birth weight (LBW), predicting births with LBW for better intervention preparation is challenging. This study aims to predict LBW using a dataset encompassing 53,637 birth cohorts collected from 36 primary hospitals across seven regions in Ethiopia from February 2022 to June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics, including maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of livebirths, number of stillbirths, antenatal care frequency, and sex of the fetus, to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, outlier detection, and ensuring data integrity in birth weight records. Model performance was evaluated through metrics such as accuracy, precision, recall, F1-score, and area under the Receiver Operating Characteristic curve (ROC AUC) using 10-fold cross-validation. Results: The results demonstrated that the decision tree, J48, logistic regression, and gradient-boosted trees models achieved the highest accuracy (94.5% to 94.6%) with a precision of 93.1% to 93.3%, an F1-score of 92.7% to 93.1%, and a ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall rates achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions to achieve the Sustainable Development Goal (SDG) related to neonatal mortality.
Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia
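The paper's evaluation protocol was run in WEKA; the following scikit-learn snippet is a hedged equivalent, running 10-fold cross-validation with the reported metrics on synthetic stand-in data (features and labels are placeholders, not the Ethiopian dataset).

```python
# Sketch: 10-fold CV of a gradient-boosted classifier with accuracy,
# precision, recall, F1 and ROC AUC, mirroring the reported protocol.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))               # 10 maternal/neonatal features (dummy)
y = (rng.random(1000) < 0.15).astype(int)     # 1 = low birth weight (synthetic)

scores = cross_validate(GradientBoostingClassifier(), X, y, cv=10,
                        scoring=["accuracy", "precision", "recall", "f1", "roc_auc"])
for name, vals in scores.items():
    if name.startswith("test_"):
        print(name, round(vals.mean(), 3))
```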
Procedia PDF Downloads 32
6152 FEM Models of Glued Laminated Timber Beams Enhanced by Bayesian Updating of Elastic Moduli
Authors: L. Melzerová, T. Janda, M. Šejnoha, J. Šejnoha
Abstract:
Two finite element (FEM) models are presented in this paper to address the random nature of the response of glued timber structures made of wood segments with variable elastic moduli evaluated from 3600 indentation measurements. This total database served to create the same number of ensembles as the number of segments in the tested beam. Statistics of these ensembles were then assigned to given segments of beams, and the Latin Hypercube Sampling (LHS) method was used to perform 100 simulations, resulting in an ensemble of 100 deflections subjected to statistical evaluation. Here, the detailed geometrical arrangement of individual segments in the laminated beam was considered in the construction of a two-dimensional FEM model subjected to four-point bending to comply with the laboratory tests. Since laboratory measurements of local elastic moduli may in general suffer from significant experimental error, it appears advantageous to exploit full-scale measurements of timber beams, i.e. deflections, to improve their prior distributions with the help of the Bayesian statistical method. This, however, requires an efficient computational model when simulating the laboratory tests numerically. To this end, a simplified model based on Mindlin's beam theory was established. The improved posterior distributions show that the most significant change of the Young's modulus distribution takes place in laminae in the most strained zones, i.e. in the top and bottom layers within the beam center region. Posterior distributions of moduli of elasticity were subsequently utilized in the 2D FEM model and compared with the original simulations.
Keywords: Bayesian inference, FEM, four-point bending test, laminated timber, parameter estimation, prior and posterior distribution, Young's modulus
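A sketch of the LHS step alone, under a deliberately simplified assumption: the paper's 2D FEM model is replaced here by the textbook four-point-bending deflection formula, and the moduli ranges, geometry and load are illustrative.

```python
# Sketch: draw 100 Latin Hypercube samples of per-segment moduli and propagate
# them through a simple midspan-deflection formula for four-point bending,
# delta = F*a*(3L^2 - 4a^2) / (24*E*I), with a series-spring effective modulus.
import numpy as np
from scipy.stats import qmc

n_seg, n_sim = 10, 100
sampler = qmc.LatinHypercube(d=n_seg, seed=0)
E = 8e9 + 6e9 * sampler.random(n=n_sim)            # per-segment moduli [Pa], 8-14 GPa

L_span, a, F, I = 3.0, 1.0, 10e3, 8e-5             # span [m], load offset, load, inertia
E_eff = 1.0 / np.mean(1.0 / E, axis=1)             # series homogenization per sample
defl = F * a * (3 * L_span**2 - 4 * a**2) / (24 * E_eff * I)
print(f"mean deflection {1e3 * defl.mean():.2f} mm, std {1e3 * defl.std():.2f} mm")
```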
Procedia PDF Downloads 287
6151 Linguistic Misinterpretation and the Dialogue of Civilizations
Authors: Oleg Redkin, Olga Bernikova
Abstract:
Globalization and migrations have made cross-cultural contacts more frequent and intensive. Sometimes, these contacts may lead to misunderstanding between partners in communication and misinterpretations of verbal messages, which some researchers tend to consider as the 'clash of civilizations'. In most cases, the reasons for this may be found in cultural and linguistic differences and hence in misinterpretations of intentions and behavior. The current research examines factors of verbal and non-verbal communication that should be taken into consideration in verbal and non-verbal contacts. Language is one of the most important manifestations of the cultural code, and it is often considered one of the special features of a civilization. The Arabic language, in particular, is commonly associated with Islam and the Arab-Muslim civilization. It is one of the most important markers of self-identification for more than 200 million native speakers. Arabic is the language of the Quran and hence the symbol of religious affiliation for more than one billion Muslims around the globe. Adequate interpretation of Arabic texts requires profound knowledge of its grammar and the semantics of its vocabulary. Communicating parties who belong to different cultural groups are guided by different models of behavior and hierarchies of values; besides, the vocabulary each of them uses in the dialogue may convey different semantic realities and vary in connotations. In this context, direct, literal translation in most cases cannot adequately convey the meaning of the original message. Moreover, the peculiarities and diversity of extralinguistic information, such as body language, communicative etiquette, cultural background and religious affiliations, may make the dialogue even more difficult. It is very likely that the so-called 'clash of civilizations' in most cases is due to misinterpretation of the counterpart's means of discourse, such as language, cultural codes, and models of behavior, rather than to basic contradictions between partners in communication. In the process of communication, one has to rely on universal values rather than focus on cultural or religious peculiarities, and to take into account the current linguistic and extralinguistic context.
Keywords: Arabic, civilization, discourse, language, linguistic
Procedia PDF Downloads 226
6150 Development of an Improved Paradigm for the Tourism Sector in the Department of Huila, Colombia: A Theoretical and Empirical Approach
Authors: Laura N. Bolivar T.
Abstract:
The importance of tourism for regional development is mainly highlighted by the collaborative, cooperative and competitive relationships of the agents involved. The fostering of associativity processes, in particular the cluster approach, emphasizes the beneficial outcomes of the concentration of enterprises, where innovation and entrepreneurship flourish and shape the dynamics for tourism empowerment. The department of Huila is located in the south-west of Colombia and holds the biggest coffee production in the country, although it barely contributes to the national GDP. Hence, its economic development strategy is looking for more dynamism, and Huila could be consolidated as a leading destination for cultural, ecological and heritage tourism if, at least, the public policy-making processes for the tourism management of La Tatacoa Desert, San Agustin Park and Bambuco's National Festival were implemented in a more efficient manner. In this order of ideas, this study attempts to address the potential restrictions and beneficial factors for the consolidation of the tourism sector of Huila, Colombia as a cluster, and how this could impact its regional development. Therefore, a set of theoretical frameworks, such as the Tourism Routes Approach, the Tourism Breeding Environment and the Community-based Tourism Method, among others, together with a collection of international experiences describing tourism clustering processes and their most outstanding problems, is analyzed to draw up learning points, structures of proceedings and success-driven factors to be contrasted with the local characteristics of Huila, the region under study. This characterization involves primary and secondary information collection methods and comprises the South American and Colombian context together with the identification of the actors involved and their roles, the main interactions among them, the major tourism products and their infrastructure, the visitors' perspective on the situation and a recap of the related needs and benefits regarding the host community. Considering the umbrella concepts, the theoretical and empirical approaches, and their comparison with the local specificities of the tourism sector in Huila, an array of shortcomings is analytically constructed, and a series of guidelines are proposed as a way to overcome them and, simultaneously, raise economic development and positively impact Huila's well-being. This non-exhaustive bundle of guidelines is focused on fostering cooperative linkages in the actors' network, dealing with Information and Communication Technologies' innovations, reinforcing the supporting infrastructure, promoting the destinations considering the lesser-known places as well, designing an information system enabling the tourism network to assess the situation based on reliable data, increasing competitiveness, developing participative public policy-making processes and empowering the host community regarding its touristic richness. According to this, cluster dynamics would drive the tourism sector to achieve articulation and joint effort; the agents involved and local particularities would then be adequately assisted to cope with the current changing environment of globalization and competition.
Keywords: innovative strategy, local development, network of tourism actors, tourism cluster
Procedia PDF Downloads 145
6149 Building Information Modeling Implementation for Managing an Extra Large Governmental Building Renovation Project
Authors: Pornpote Nusen, Manop Kaewmoracharoen
Abstract:
In recent years, there was an observable shift in fully developed countries from constructing new buildings to modifying existing buildings. The issue was that although an effective instrument like BIM (Building Information Modeling) was well developed for constructing new buildings, it was not widely used to renovate old buildings. BIM was accepted as an effective means to overcome common managerial problems such as project delay, cost overrun, and poor quality throughout the project life cycle. It was recently introduced in Thailand and rarely used in renovation projects. Today, in Thailand, BIM is mostly used for creating aesthetic 3D models and for quantity takeoff purposes, though it can also be an effective project management tool for planning and scheduling. The governmental sector in Thailand is now beginning to recognize the uses of BIM in managing construction projects, but knowledge about BIM implementation in governmental construction projects is underdeveloped. Further studies need to be conducted to maximize its advantages for the governmental sector. An extra large governmental educational building of 17,000 square meters was used in this research. It is currently under construction in a two-year renovation project. BIM models of the building's exterior and interior areas were created for all five floors. Then 4D BIM, a combination of 3D BIM plus time, was created for planning and scheduling. Three focus groups were conducted with the executive committee, contractors, and officers of the building to discuss the possibility of usage and the usefulness of the BIM approach over the traditional process. Several positive aspects were discussed, especially foreseen problems such as the inadequate accessibility of ways, the altered ceiling levels, the impractical construction plan created through a traditional approach, and the lack of constructability information. However, for some parties, the cost of BIM implementation was a concern, though, this study believes, its uses outweigh the cost.
Keywords: building information modeling, extra large building, governmental building renovation, project management, renovation, 4D BIM
Procedia PDF Downloads 156
6148 In Silico Exploration of Quinazoline Derivatives as EGFR Inhibitors for Lung Cancer: A Multi-Modal Approach Integrating QSAR-3D, ADMET, Molecular Docking, and Molecular Dynamics Analyses
Authors: Mohamed Moussaoui
Abstract:
A series of thirty-one potential inhibitors targeting the epidermal growth factor receptor (EGFR) kinase, derived from quinazoline, underwent 3D-QSAR analysis using CoMFA and CoMSIA methodologies. The training and test sets of quinazoline derivatives were utilized to construct and validate the QSAR models, respectively, with dataset alignment performed using the lowest-energy conformer of the most active compound. The best-performing CoMFA and CoMSIA models demonstrated impressive determination coefficients, with R² values of 0.981 and 0.978, respectively, and leave-one-out cross-validation determination coefficients, Q², of 0.645 and 0.729, respectively. Furthermore, external validation using a test set of five compounds yielded predicted determination coefficients, R² test, of 0.929 and 0.909 for CoMFA and CoMSIA, respectively. Building upon these promising results, eighteen new compounds were designed and assessed for drug-likeness and ADMET properties through in silico methods. Additionally, molecular docking studies were conducted to elucidate the binding interactions between the selected compounds and the enzyme. Detailed molecular dynamics simulations were performed to analyze the stability, conformational changes, and binding interactions of the quinazoline derivatives with the EGFR kinase. These simulations provided deeper insights into the dynamic behavior of the compounds within the active site. This comprehensive analysis enhances the understanding of quinazoline derivatives as potential anti-cancer agents and provides valuable insights for lead optimization in the early stages of drug discovery, particularly for developing highly potent anticancer therapeutics.
Keywords: 3D-QSAR, CoMFA, CoMSIA, ADMET, molecular docking, quinazoline, molecular dynamics, EGFR inhibitors, lung cancer, anticancer
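The leave-one-out Q² statistic reported for these models can be computed as in the hedged sketch below; the descriptors and activities are synthetic stand-ins, not the CoMFA/CoMSIA fields.

```python
# Sketch: leave-one-out cross-validated Q^2 = 1 - PRESS / SS_total.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.normal(size=(31, 5))                   # 31 quinazolines, 5 descriptors (dummy)
y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=31)   # pIC50-like activity

press = 0.0                                    # predictive residual sum of squares
for train, test in LeaveOneOut().split(X):
    m = LinearRegression().fit(X[train], y[train])
    press += ((y[test] - m.predict(X[test]))**2).item()
q2 = 1 - press / np.sum((y - y.mean())**2)
print("LOO Q^2:", round(q2, 3))
```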
Procedia PDF Downloads 53
6147 Underground Coal Gasification Technology in Türkiye: A Techno-Economic Assessment
Authors: Fatma Ünal, Hasancan Okutan
Abstract:
The increasing worldwide population and technological requirements lead to an increase in energy demand every year. The demand has mainly been supplied from fossil fuels such as coal and petroleum due to insufficient natural gas resources. In recent years, the amount of coal reserves has reached almost 21 billion tons in Türkiye. These are mostly lignite (92.7%), which contains high levels of moisture and sulfur. Underground coal gasification technology is one of the most suitable methods, in comparison with direct combustion techniques, for the evaluation of such coal types. In this study, the applicability of the underground coal gasification process is investigated in the Eskişehir-Alpu lignite reserve as a pilot region, both technologically and economically. It is assumed that electricity is produced from the obtained synthesis gas in an integrated gasification combined cycle (IGCC). Firstly, an equilibrium model was developed using the thermodynamic properties of the gasification reactions. The effects of the type of oxidizing gas, the sulfur content of coal, the water vapor/air ratio, and the pressure of the system were investigated to find optimum process conditions. Secondly, the parallel and linear controlled retraction injection point (CRIP) models were implemented as drilling methods, and costs were calculated under different oxidizing agents (air and high-purity O2). In parallel CRIP (P-CRIP), the drilling cost is found to be lower than in linear CRIP (L-CRIP), since two coal seams are gasified simultaneously. It is seen that CO2 Capture and Storage (CCS) technology had the largest effect on the total cost in both models. The cost of the synthesis gas produced varies between 0.02 $/Mcal and 0.09 $/Mcal. This is a promising result when considering the selling price of natural gas in Türkiye for Q1 2023 (0.103 $/Mcal).
Keywords: energy, lignite reserve, techno-economic analysis, underground coal gasification
Procedia PDF Downloads 70
6146 Gas Flow, Time, Distance Dynamic Modelling
Authors: A. Abdul-Ameer
Abstract:
The equations governing the distance, pressure-volume flow relationships for the pipeline transportation of gaseous mixtures are considered. A derivation based on differential calculus, for an element of this system model, is addressed. Solutions yielding the input-output response following pressure changes are reviewed. The technical problems associated with these analytical results are identified. Procedures resolving these difficulties, thereby providing an attractive, simple analysis route, are outlined. Computed responses, thereby validating the calculated predictions, are presented.
Keywords: pressure, distance, flow, dissipation, models
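For reference, a standard one-dimensional isothermal form of the governing relations the abstract alludes to; the notation is the textbook one and is an assumption here, since the paper's own derivation is not reproduced.

```latex
% 1-D isothermal pipeline gas-flow equations (textbook form, notation assumed):
\begin{align}
  \frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} &= 0
  && \text{(mass continuity)} \\
  \frac{\partial (\rho u)}{\partial t}
  + \frac{\partial (\rho u^2 + p)}{\partial x}
  &= -\frac{f \rho u \lvert u \rvert}{2D}
  && \text{(momentum with wall friction)} \\
  p &= \rho\, Z R T && \text{(isothermal equation of state)}
\end{align}
```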
Procedia PDF Downloads 479
6145 Development of a Fuzzy Logic Based Model for Monitoring Child Pornography
Authors: Mariam Ismail, Kazeem Rufai, Jeremiah Balogun
Abstract:
A study was conducted to apply fuzzy logic to the development of a monitoring model for child pornography based on associated risk factors, which can be used by forensic experts or integrated into forensic systems for the early detection of child pornographic activities. A number of methods were adopted in the study: an extensive review of related works was done in order to identify the factors that are associated with child pornography, after which they were validated by an expert sex psychologist and guidance counselor, and relevant data was collected. Fuzzy membership functions were used to fuzzify the associated variables identified, alongside the risk of the occurrence of child pornography, based on the inference rules that were provided by the experts consulted, and the fuzzy logic expert system was simulated using the Fuzzy Logic Toolbox available in MATLAB Release 2016. The results showed that there were 4 categories of risk factors required for assessing the risk of a suspect committing child pornography offenses. Two and three triangular membership functions were used to formulate the risk factors, based on the two and three labels assigned, respectively. Five fuzzy logic models were formulated, such that the first four were used to assess the impact of each category on child pornography, while the last one takes the four outputs of those models as the inputs required for assessing the risk of child pornography. The following conclusion was made: there were factors related to personal traits, social traits, history of child pornography crimes, and self-regulatory deficiency traits of the suspects required for the assessment of the risk of child pornography crimes committed by a suspect. Using the values of the identified risk factors selected for this study, the risk of child pornography can be easily assessed in order to determine the likelihood of a suspect perpetrating the crime.
Keywords: fuzzy, membership functions, pornography, risk factors
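The study's system was built in MATLAB's Fuzzy Logic Toolbox; the following is a hedged Python analogue using scikit-fuzzy, with two illustrative risk-factor categories and rule labels that are assumptions rather than the study's actual variables.

```python
# Sketch: triangular memberships over two risk-factor categories feeding an
# overall risk output, mirroring the structure of the study's expert system.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

personal = ctrl.Antecedent(np.arange(0, 11), "personal_traits")
history = ctrl.Antecedent(np.arange(0, 11), "offense_history")
risk = ctrl.Consequent(np.arange(0, 101), "risk")

for var in (personal, history):
    var["low"] = fuzz.trimf(var.universe, [0, 0, 5])
    var["high"] = fuzz.trimf(var.universe, [5, 10, 10])
risk["low"] = fuzz.trimf(risk.universe, [0, 0, 50])
risk["high"] = fuzz.trimf(risk.universe, [50, 100, 100])

rules = [ctrl.Rule(personal["high"] & history["high"], risk["high"]),
         ctrl.Rule(personal["low"] & history["low"], risk["low"])]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["personal_traits"], sim.input["offense_history"] = 7, 8
sim.compute()
print("assessed risk:", sim.output["risk"])
```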
Procedia PDF Downloads 135
6144 Behavior of Common Philippine-Made Concrete Hollow Block Structures Subjected to Seismic Load Using Rigid Body Spring-Discrete Element Method
Authors: Arwin Malabanan, Carl Chester Ragudo, Jerome Tadiosa, John Dee Mangoba, Eric Augustus Tingatinga, Romeo Eliezer Longalong
Abstract:
Concrete hollow blocks (CHB) are the most commonly used masonry blocks for walls in residential houses, school buildings and public buildings in the Philippines. During the 2013 Bohol earthquake (Mw 7.2), CHB walls proved to be very vulnerable to severe external actions like strong ground motion. In this paper, a numerical model of CHB structures is proposed, and the seismic behavior of CHB houses is presented. In the modeling, the Rigid Body Spring-Discrete Element Method (RBS-DEM) is used, wherein masonry blocks are discretized into rigid elements and connected by nonlinear springs at preselected contact points. The shear and normal stiffnesses of the springs are derived from the material properties of the CHB unit, incorporating the grout and mortar fillings through volumetric transformation of the dimensions using material ratios. Numerical models of reinforced and unreinforced walls are first subjected to linearly increasing in-plane loading to observe the different failure mechanisms. These wall models are then assembled to form typical model masonry houses and subjected to the El Centro and Pacoima earthquake records. Numerical simulations show that the elastic, failure and collapse behavior of the model houses agree well with shaking table test results. The effectiveness of the method in replicating failure patterns will serve as a basis for the improvement of design and provides a good basis for strengthening the structure.
Keywords: concrete hollow blocks, discrete element method, earthquake, rigid body spring model
Procedia PDF Downloads 378
6143 Optical Character Recognition of Handwritten Hebrew Documents
Authors: Tomer Kakou, Tal BoAhron, Natalia Vanetik
Abstract:
As digital transformation accelerates, the demand for processing handwritten text images has significantly increased. The ability to convert handwritten text into a computer-readable format is crucial for enabling efficient searching, storage, editing, and interpretation, even for challenging handwriting. Organizations that need to accurately and efficiently digitize handwritten records, like educational institutions, would find this capacity very useful. Even though optical character recognition (OCR) for printed text has advanced, handwritten text poses additional difficulties that are especially challenging in low-resource languages like Hebrew. To bridge this gap, we are developing an innovative method for Hebrew handwritten OCR that leverages both traditional and cutting-edge techniques. Our approach integrates a newly curated dataset of handwritten Hebrew text images with an existing dataset of Hebrew texts called HDD for more precise image classification. The core of our methodology involves a multi-step process that first enhances image resolution to improve overall quality, followed by the extraction of individual character images using advanced image processing tools like OpenCV. Each character image is then classified into one of 27 classes, corresponding to the letters of the Hebrew alphabet. This step is crucial, as it enables the system to recognize individual characters, which are then reassembled into coherent text sequences. To achieve accurate recognition, we utilize deep learning models, including Vision Transformer (ViT) and ResNet-50, for multi-class image classification. These models have shown promising results in the domain of visual recognition tasks, and their adaptation to handwritten Hebrew text offers significant potential for improving OCR performance. Context-based word recognition will be used in the next stage, where large language models (LLMs) are used to provide contextual corrections. This increases the output's overall accuracy by resolving errors and ambiguities that occur during the character recognition process. For model evaluation, we employ several performance metrics, including Character Error Rate (CER), Word Error Rate (WER), and Normalized Levenshtein Distance (NLD). NLD has proven to be the most reliable metric in our case, as it accounts for small errors and typographical variations, making it particularly suited for evaluating OCR. This project's ultimate objective is to create a reliable, comprehensive end-to-end solution for handwritten Hebrew text digitization that may be used in a variety of contexts. Additionally, our method strives to achieve high accuracy even in situations with handwriting errors or deteriorated text by incorporating context-based adjustments, which makes it a useful tool for real-world applications.
Keywords: Hebrew, image classification, low-resource languages, OCR
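The NLD metric singled out above is easy to state precisely; the sketch below is a standard dynamic-programming implementation (not the project's own code), normalized by the longer string length as one common convention.

```python
# Sketch: normalized Levenshtein distance between recognized and reference text.
def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def nld(hyp: str, ref: str) -> float:
    return levenshtein(hyp, ref) / max(len(hyp), len(ref), 1)

print(nld("שלום עולמ", "שלום עולם"))   # one substituted final letter -> ~0.11
```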
Procedia PDF Downloads 0
6142 Strength Evaluation by Finite Element Analysis of Mesoscale Concrete Models Developed from CT Scan Images of Concrete Cube
Authors: Nirjhar Dhang, S. Vinay Kumar
Abstract:
Concrete is a non-homogeneous mix of coarse aggregates, sand, cement, air voids and the interfacial transition zone (ITZ) around aggregates. Adopting these complex structures and material properties in numerical simulation would lead to a better understanding and design of concrete. In this work, the mesoscale model of concrete has been prepared from X-ray computerized tomography (CT) images. These images are converted into a computer model and numerically simulated using commercially available finite element software. The mesoscale models are simulated under compressive displacement. The effects of the shape and distribution of aggregates, continuous and discrete ITZ thickness, voids, and variation of mortar strength have been investigated. The CT scan of the concrete cube consists of a series of two-dimensional slices. A total of 49 slices are obtained from a 150 mm cube, with an interval between slices of approximately 3 mm. The same cube can be CT scanned in a non-destructive manner, and the compression test can later be carried out in a universal testing machine (UTM) to find its strength. The image processing and extraction of mortar and aggregates from the CT scan slices are performed by programming in Python. The digital colour image consists of red, green and blue (RGB) pixels. The RGB image is converted to a black and white (BW) image, and the mesoscale constituents are identified by assigning values between 0-255. The pixel matrix is created for modeling the mortar, aggregates, and ITZ. Pixels are normalized to a 0-9 scale considering the relative strength. Here, zero is assigned to voids, 4-6 to mortar and 7-9 to aggregates. Values between 1-3 identify the boundary between aggregates and mortar. In the next step, triangular and quadrilateral elements for plane stress and plane strain models are generated depending on the option given. Properties of materials, boundary conditions, and the analysis scheme are specified in this module. Responses such as displacement, stresses, and damage are evaluated by ABAQUS after importing the input file. This simulation evaluates the compressive strengths of the 49 slices of the cube. The model is meshed with more than sixty thousand elements. The effects of the shape and distribution of aggregates, the inclusion of voids and the variation of the thickness of the ITZ layer on load-carrying capacity, stress-strain response and strain localization of concrete have been studied. The plane strain condition carried more load than the plane stress condition due to confinement. The CT scan technique can be used to obtain slices from concrete cores taken from actual structures, and digital image processing can be used to find the shape and content of aggregates in concrete. This may be further compared with test results of concrete cores and can be used as an important tool for strength evaluation of concrete.
Keywords: concrete, image processing, plane strain, interfacial transition zone
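A hedged sketch of the pixel-classification step described above: grayscale CT values are rescaled from 0-255 to the 0-9 scale and labeled as voids, boundary (ITZ), mortar, or aggregates; the demo array and the linear rescaling are illustrative assumptions, not the paper's exact thresholds.

```python
# Sketch: map grayscale CT values (0-255) to the paper's 0-9 scale, then label
# mesoscale constituents: 0 void, 1-3 ITZ boundary, 4-6 mortar, 7-9 aggregate.
import numpy as np

def classify_slice(gray: np.ndarray) -> np.ndarray:
    scaled = np.rint(gray.astype(float) / 255.0 * 9).astype(int)  # 0-255 -> 0-9
    labels = np.empty(gray.shape, dtype="<U9")
    labels[scaled == 0] = "void"
    labels[(scaled >= 1) & (scaled <= 3)] = "boundary"
    labels[(scaled >= 4) & (scaled <= 6)] = "mortar"
    labels[scaled >= 7] = "aggregate"
    return labels

demo = np.array([[0, 40, 120], [160, 200, 255]], dtype=np.uint8)
print(classify_slice(demo))
```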
Procedia PDF Downloads 243
6141 Improving the Training for Civil Engineers by Introducing Virtual Reality Technique
Authors: Manar Al-Ateeq
Abstract:
The building construction industry plays a major role in the economy of the world and the State of Kuwait. This paper evaluates the preparation of new civil site engineers, describes a new system for their improvement, and underlines the importance of prequalifying and developing new engineers. In order to have a strong base in engineering, educational institutes and workplaces should be responsible for continuously training engineers and updating them with new methods and techniques in engineering. To achieve that, schools of engineering should constantly update the computational resources used in the profession. A survey was prepared for graduate engineers, based on stated objectives, to understand their status in both the public and private sectors. Interviews were conducted with different sectors in Kuwait, and several visits were made to training centers within different workplaces in Kuwait to evaluate the training process and try to improve it. Virtual Reality (VR) technology could be applied as a complement to three-dimensional (3D) modeling, leading to better communication, whether in job training, in education or in professional practice. Techniques of 3D modeling and VR can be applied to develop the models related to the construction process. The 3D models can support rehabilitation design, as they can be considered a great tool for monitoring failure and defects in structures; they can also support decisions based on the visual analysis of alternative solutions. Therefore, teaching computer-aided design (CAD) and VR techniques in school will help engineering students prepare for site work and will also assist them in adopting these technologies as important supports in their later professional practice. This teaching technique will show how construction work develops, allow the visual simulation of the progression of each type of work, and help students learn more about the equipment needed for tasks and how it works on site.
Keywords: three dimensional modeling (3DM), civil engineers (CE), professional practice (PP), virtual reality (VR)
Procedia PDF Downloads 181
6140 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran
Authors: Fatemeh Faramarzi, Hosein Mahjoob
Abstract:
Building dams on rivers for the utilization of water resources disturbs the hydrodynamic equilibrium and results in all or part of the sediments carried by the water being left in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir, which threatens sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the sedimentation amount in dam reservoirs and to estimate their efficient lifetime. Recently, however, mathematical and computational models have been widely used in sedimentation studies in dam reservoirs as a suitable tool. These models usually solve the equations using the finite element method. This study compares the results from two software packages, GSTARS4 and HEC-6, in the prediction of the sedimentation amount in the Dez Dam, southern Iran. The models provide a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow and sediment continuity and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation), which is based on a one-dimensional mathematical model, simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme; it was used to calibrate a period of 47 years and forecast the next 47 years of sedimentation in the Dez Dam. This dam is among the highest dams in the world (203 m high), irrigates more than 125,000 hectares of downstream lands and plays a major role in flood control in the region. The input data, including geometry, hydraulic and sedimentary data, cover the period from 1955 to 2003 on a daily basis. To predict future river discharge, in this research, the time series data were assumed to repeat after 47 years. Finally, the obtained result was very satisfactory in the delta region, so that the output from GSTARS4 was almost identical to the hydrographic profile in 2003. In the Dez Dam, due to the long (65 km) and large reservoir, vertical currents are dominant, causing the calculations by the above-mentioned method to be inaccurate. To solve this problem, we used the empirical reduction method to calculate the sedimentation in the downstream area, which led to very good answers. Thus, we demonstrated that by combining these two methods, a very suitable model for sedimentation in the Dez Dam for the study period can be obtained. The present study demonstrated successfully that the outputs of both methods are the same.
Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6
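Both packages discretize, in one form or another, a sediment-continuity relation coupled to the flow equations; a standard textbook statement of that relation (notation assumed here, not taken from either manual) is:

```latex
% 1-D sediment-continuity (Exner) equation, the kind of relation both
% GSTARS4 and HEC-6 solve together with the flow equations:
\begin{equation}
  (1 - \lambda_p)\,\frac{\partial A_b}{\partial t}
  + \frac{\partial Q_s}{\partial x} = 0,
\end{equation}
% where $A_b$ is the bed cross-sectional area, $Q_s$ the volumetric sediment
% transport rate, and $\lambda_p$ the bed porosity.
```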
Procedia PDF Downloads 314
6139 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices
Authors: Fatemeh Abbasi, Sahand Daneshvar
Abstract:
Productivity is one of the essential goals of companies seeking to improve performance and, as a strategy-oriented measure, determines the basis of a company's economic growth. The history of productivity goes back centuries, but in the early twentieth century most researchers defined productivity as the relationship between output and the factors used in production. Productivity as the optimal use of available resources means that "more output using less input" can increase companies' capacity for economic growth and prosperity. Also, a good quality of life based on economic progress depends on productivity growth in a society. Therefore, productivity is a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods require no such function and are based on empirical evidence. One of the most popular non-parametric methods is Data Envelopment Analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models. This method uses multiple inputs and outputs to compare the productivity of similar DMUs such as banks, government agencies, companies, airports, etc. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time. This study will compare the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically based on DEA models and review their strengths and weaknesses.
Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index
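For reference, the (output-oriented) Malmquist index of Caves, Christensen, and Diewert (1982), in the geometric-mean form commonly estimated with DEA, where D^t denotes the period-t distance function:

```latex
% Malmquist productivity index between periods t and t+1:
\begin{equation}
  M\left(x^{t+1}, y^{t+1}, x^{t}, y^{t}\right) =
  \left[
    \frac{D^{t}\left(x^{t+1}, y^{t+1}\right)}{D^{t}\left(x^{t}, y^{t}\right)}
    \cdot
    \frac{D^{t+1}\left(x^{t+1}, y^{t+1}\right)}{D^{t+1}\left(x^{t}, y^{t}\right)}
  \right]^{1/2}
\end{equation}
% A value greater than one indicates productivity growth from t to t+1.
```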
Procedia PDF Downloads 197
6138 The Type II Immune Response in Acute and Chronic Pancreatitis Mediated by STAT6 in Murine
Authors: Hager Elsheikh
Abstract:
Context: Pancreatitis is a condition characterized by inflammation of the pancreas, which can lead to serious complications if untreated. Both acute and chronic pancreatitis are associated with immune reactions and fibrosis, which further damage the pancreas. The type 2 immune response, primarily driven by alternatively activated macrophages (AAMs), plays a significant role in the development of fibrosis. The IL-4/STAT6 pathway is a crucial signaling pathway for the activation of M2 macrophages. Pancreatic fibrosis is induced by dysregulated inflammatory responses and can result in the autodigestion and necrosis of pancreatic acinar cells. Research Aim: The aim of this study is to investigate the impact of STAT6, a crucial molecule in the IL-4/STAT6 pathway, on the severity and development of fibrosis during acute and chronic pancreatitis. The research also aims to understand the influence of the JAK/STAT6 signaling pathway on the balance between fibrosis and regeneration in the presence of different macrophage populations. Methodology: The research utilizes murine models of acute and chronic pancreatitis induced by cerulein injection. Animal models will be employed to study the effect of STAT6 knockout on disease severity and fibrosis. Isolation of acinar cells and cell culture techniques will be used to assess the impact of different macrophage populations on wound healing and regeneration. Various techniques such as PCR, histology, immunofluorescence, and transcriptomics will be employed to analyze the tissues and cells. Findings: The research aims to provide insights into the mechanisms underlying tissue fibrosis and wound healing during acute and chronic pancreatitis. By investigating the influence of the JAK/STAT6 signaling pathway and different macrophage populations, the study aims to understand their impact on tissue fibrosis, disease severity, and pancreatic regeneration. Theoretical Importance: This research contributes to our understanding of the role of specific signaling pathways, macrophage polarization, and the type 2 immune response in pancreatitis. It provides insights into the molecular mechanisms underlying tissue fibrosis and the potential for targeted therapies. Data Collection and Analysis Procedures: Data will be collected through the use of murine models, isolation and culture of acinar cells, and various experimental techniques such as PCR, histology, immunofluorescence, and transcriptomics. Data will be analyzed using appropriate statistical methods and techniques, and the findings will be interpreted in the context of the research objectives. Conclusion: By investigating the mechanisms of tissue fibrosis and wound healing during acute and chronic pancreatitis, this research aims to enhance our understanding of disease progression and potential therapeutic targets. The findings have theoretical importance in expanding our knowledge of pancreatic fibrosis and the role of macrophage polarization in the context of the type 2 immune response.
Keywords: immunity in chronic diseases, pancreatitis, macrophages, immune response
6137 On Elastic Anisotropy of Fused Filament Fabricated Acrylonitrile Butadiene Styrene Structures
Authors: Joseph Marae Djouda, Ashraf Kasmi, François Hild
Abstract:
Fused filament fabrication (FFF) is one of the most widespread additive manufacturing techniques because of its low-cost implementation. Its initial development was based on part fabrication with thermoplastic materials. The influence of manufacturing parameters such as the filament orientation through the nozzle, the deposited layer thickness, or the deposition speed on the mechanical properties of the parts has been widely investigated experimentally, and remarkable variations of the anisotropy as a function of the filament path during fabrication have been reported. However, constitutive models describing the resulting mechanical properties remain underdeveloped. In this study, integrated digital image correlation (I-DIC) is used to identify the mechanical constitutive parameters of two configurations of ABS samples: +/-45° and a so-called "oriented deposition," in which the filament was deposited so as to follow the principal strain direction of the sample. An identification scheme is developed that reduces the gap between simulation and experiment directly from images recorded on a single sample (a single-edge notched tension specimen). Macroscopic and mesoscopic analyses are conducted from images recorded on both sample surfaces during the tensile test. Elastic and elastoplastic models in isotropic and orthotropic frameworks have been established. It appears that, independently of the sample configuration (i.e., filament orientation during fabrication), the elastoplastic isotropic model correctly describes the behavior of the samples. It is worth noting that this model requires fewer constitutive parameters than the elastoplastic orthotropic model. This leads to the conclusion that the anisotropy of architectured 3D-printed ABS parts can be neglected when establishing a macroscopic description of their behavior.Keywords: elastic anisotropy, fused filament fabrication, acrylonitrile butadiene styrene, I-DIC identification
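To illustrate the gap-reduction idea behind this kind of identification, the minimal Python sketch below fits constitutive parameters by minimizing the residual between a measured kinematic field and a simulated one. The simulate_displacements function and the initial parameter values are hypothetical placeholders, not the study's actual code or data:

import numpy as np
from scipy.optimize import least_squares

def residual(params, u_measured, simulate_displacements):
    # Gap between measured and simulated displacement fields for a candidate
    # parameter set (Young's modulus, Poisson's ratio, yield stress).
    E, nu, sigma_y = params
    return (simulate_displacements(E, nu, sigma_y) - u_measured).ravel()

def identify_parameters(u_measured, simulate_displacements):
    # Assumed initial guess for an ABS-like material (E and sigma_y in MPa).
    x0 = np.array([2000.0, 0.35, 35.0])
    result = least_squares(residual, x0,
                           args=(u_measured, simulate_displacements))
    return result.x  # identified constitutive parameters

In full I-DIC, the residual is written directly on the gray levels of the recorded images, with the displacement field parameterized by the constitutive model, which is what couples measurement and identification in a single step.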
6136 Diversity in the Community - The Disability Perspective
Authors: Sarah Reker, Christiane H. Kellner
Abstract:
From the perspective of people with disabilities, inequalities can also emerge from spatial segregation, a lack of social contacts, or limited economic resources. In order to reduce or even eliminate these disadvantages and increase general well-being, community-based participation, as well as decentralisation efforts within exclusively residential homes, is essential. The new research project "Index for participation development and quality of life for persons with disabilities" (TeLe-Index, 2014-2016), anchored at the Technische Universität München in Munich and at a large residential complex and service provider for persons with disabilities on the outskirts of Munich, therefore aims to assist the development of community-based living environments. People with disabilities should be able to participate in social life beyond the confines of the institution. Since a diverse society is one in which different individual needs and wishes can emerge and be catered to, the ultimate goal of the project is to create an environment for all citizens, regardless of disability, age or ethnic background, that accommodates their daily activities and requirements. The UN Convention on the Rights of Persons with Disabilities, which Germany has ratified, postulates the necessity of user-centered design, especially when it comes to evaluating the individual needs and wishes of all citizens; a multidimensional approach is therefore required. Based on this insight, the structure of the town-like center will be remodeled to open up the community to all people. This strategy should lead to more equal opportunities and pave the way for a much more diverse community. Macro-level research questions were accordingly inspired by quality-of-life theory and formulated as follows for the different dimensions: •The user dimension: what needs and necessities can we identify? Are needs person-related? Are there any options to choose from? What type of quality of life can we identify? •The economic dimension: what resources (both material and staff-related) are available in the region? (How) are they used? What costs (can) arise and what effects do they entail? •The environment dimension: what "environmental factors," such as access (mobility and absence of barriers), prove beneficial or impedimental? In this context, we have provided academic supervision and support for three projects (the construction of a new school, inclusive housing for children and teenagers with disabilities, and the professionalization of employees through person-centered thinking). Since we cannot present all the issues of the umbrella project within the conference framework, we will focus in more depth on one project, namely "Outpatient Housing Options for Children and Teenagers with Disabilities". The insights obtained so far enable us to present the intermediate results of our evaluation. The central questions pertaining to this part of the research were the following: •How have the existing network relations been designed? •What meaning (or significance) do the existing service offers and structures have for the everyday life of an external residential group? These issues underpinned the environmental analyses as well as the qualitative guided interviews and qualitative network analyses we carried out.Keywords: decentralisation, environmental analyses, outpatient housing options for children and teenagers with disabilities, qualitative network analyses
6135 The Novelty of Mobile Money Solution to Ghana’s Cashless Future: Opportunities, Challenges and Way Forward
Authors: Julius Y Asamoah
Abstract:
Mobile money has seen rapid adoption over the past decade. Its emergence serves as an essential driver of financial inclusion and an innovative financial service delivery channel, especially for the unbanked population. The rising importance of mobile money services has caught the attention of policymakers and regulators, who seek to understand the many issues emerging in this context while unlocking the potential of this new technology. Regulatory responses and support are essential and require significant changes to current regulatory practices in Ghana. The article aims to answer the following research questions: "What risk does an unregulated mobile money service pose to consumers and the financial system?" and "What factors stimulate and hinder the introduction of mobile payments in developing countries?" The sample size used was 250 respondents selected from the study area. The study adopted an analytical approach combining qualitative and quantitative data collection methods. Actor-network theory (ANT) is used as an interpretive lens to analyse this process; ANT helps analyse how actors form alliances and enrol other actors, including non-human actors (i.e., technology), to secure their interests. The study revealed that government regulatory policies are critical to mobile money services in developing countries. The regulatory environment should balance the need to advance access to finance with the stability of the financial system, drawing extensively on Kenya's experience for the best strategies for the system's players. Regulators therefore need to address issues related to the enhancement of supportive regulatory frameworks. It is recommended that the government involve various stakeholders, such as mobile phone operators, and that the national regulatory authority create a regulatory environment that promotes fair practices and competition, raising revenues to support the key pillars of a business-enabling environment such as infrastructure.Keywords: actor-network theory (ANT), cashless future, developing countries, Ghana, mobile money
6134 Virtual Science Hub: An Open Source Platform to Enrich Science Teaching
Authors: Enrique Barra, Aldo Gordillo, Juan Quemada
Abstract:
This paper presents the Virtual Science Hub platform, an open source platform that combines a social network, an e-learning authoring tool, a video conference service, and a learning object repository to enrich science teaching. These four main functionalities fit together very well. The platform was released in April 2012 and has not stopped growing since. Finally, we present the results of the surveys conducted and the statistics gathered to validate this approach.Keywords: e-learning, platform, authoring tool, science teaching, educational sciences
6133 Application of Argumentation for Improving the Classification Accuracy in Inductive Concept Formation
Authors: Vadim Vagin, Marina Fomina, Oleg Morosin
Abstract:
This paper describes an argumentation approach to the problem of inductive concept formation. It is proposed to use argumentation, based on defeasible reasoning with justification degrees, to improve the quality of classification models obtained by generalization algorithms. Experimental results on both clean and noisy data are also presented.Keywords: argumentation, justification degrees, inductive concept formation, noise, generalization
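As a rough illustration of how justification degrees might arbitrate between conflicting induced rules, the Python sketch below discounts each argument by its strongest defeater and keeps the best-supported class label. The rules, degrees, and defeat relation are invented for illustration and are not the authors' actual algorithm:

from dataclasses import dataclass

@dataclass
class Argument:
    label: str      # class label the argument supports
    degree: float   # justification degree in [0, 1]
    defeats: tuple  # ids of the arguments this one attacks
    id: str

def justified_label(arguments):
    # Discount each argument's degree by its strongest attacker, then
    # return the label with the highest surviving justification.
    surviving = {}
    for a in arguments:
        attackers = [b.degree for b in arguments if a.id in b.defeats]
        strength = a.degree - max(attackers, default=0.0)
        surviving[a.label] = max(surviving.get(a.label, 0.0), strength)
    return max(surviving, key=surviving.get)

# Usage: two induced rules support "positive"; one defeater supports "negative".
args = [
    Argument("positive", 0.8, (), "r1"),
    Argument("negative", 0.5, ("r1",), "r2"),
    Argument("positive", 0.6, (), "r3"),
]
print(justified_label(args))  # -> "positive"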