Search results for: data driven and knowledge driven
Paper Count: 29203

29023 Software Assessment Using Ant Colony Optimization Algorithm

Authors: Saad M. Darwish

Abstract:

Recently, software quality issues have come to be seen as an important subject, given the enormous growth in the number of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification is an extension of quality assurance in the sense that quality needs to be measured prior to the certification granting process. This research contributes to solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the proposed model is to improve the compactness and interpretability of the model’s fuzzy rules by employing an ant colony optimization (ACO) algorithm, which searches for good rule descriptions by building compound rules from the traditional single rules in which they are initially expressed. The model has been tested in a case study, and the results have demonstrated its feasibility and practicability in a real environment.
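The abstract does not give implementation details, so the following is only a minimal sketch of the general idea: an ant-colony-style search that builds compound rules out of single fuzzy rules and rewards compact rule sets. All rule names, the scoring function, and the parameters are illustrative assumptions, not taken from the paper.

```python
import random

# Hypothetical single-condition fuzzy rules (attribute, fuzzy label); in the paper's
# model these would come from process- and application-driven quality metrics.
SINGLE_RULES = [
    ("defect_density", "low"), ("test_coverage", "high"),
    ("documentation", "complete"), ("response_time", "fast"),
]

def rule_set_score(rule_subset, pheromone):
    """Illustrative fitness: favour compact compound rules (interpretability),
    weighted by accumulated pheromone (a stand-in for rule accuracy)."""
    if not rule_subset:
        return 0.0
    coverage = sum(pheromone[r] for r in rule_subset)
    compactness = 1.0 / len(rule_subset)
    return coverage * compactness

def aco_compact_rules(n_ants=20, n_iter=50, evaporation=0.1):
    pheromone = {r: 1.0 for r in SINGLE_RULES}
    best, best_score = None, -1.0
    for _ in range(n_iter):
        for _ in range(n_ants):
            # Each ant builds a compound rule by probabilistically picking single rules.
            subset = tuple(r for r in SINGLE_RULES
                           if random.random() < pheromone[r] / (1.0 + pheromone[r]))
            score = rule_set_score(subset, pheromone)
            if score > best_score:
                best, best_score = subset, score
        # Evaporate pheromone everywhere, then reinforce the rules in the best solution.
        for r in SINGLE_RULES:
            pheromone[r] *= (1.0 - evaporation)
        for r in (best or ()):
            pheromone[r] += best_score
    return best, best_score

if __name__ == "__main__":
    print(aco_compact_rules())
```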

Keywords: optimization technique, quality assurance, software certification model, software assessment

Procedia PDF Downloads 459
29022 An Exploratory Sequential Design: A Mixed Methods Model for the Statistics Learning Assessment with a Bayesian Network Representation

Authors: Zhidong Zhang

Abstract:

This study established a mixed methods model for assessing statistics learning with Bayesian network models. There are three variants of exploratory sequential designs. One of these designs consists of three linked steps: qualitative data collection and analysis; a quantitative measure, instrument, or intervention; and quantitative data collection and analysis. The study used an analysis of variance (ANOVA) scoring model as the content domain. The research examines students’ learning in both semantic and performance aspects at a fine-grained level. The ANOVA score model, y = α + βx1 + γx2 + ε, served as a cognitive task to collect data during the student learning process. When the learning processes were decomposed into multiple steps in both semantic and performance aspects, a hierarchical Bayesian network was established. This is a theory-driven process: the hierarchical structure was obtained from qualitative cognitive analysis. The data from students’ ANOVA score model learning were used to provide evidence to the hierarchical Bayesian network model through the evidential variables. Finally, the assessment results of students’ ANOVA score model learning were reported. In brief, this was a mixed methods research design applied to statistics learning assessment. Mixed methods designs open up more possibilities for researchers to establish advanced quantitative models that begin with a theory-driven qualitative model.
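The structure and probabilities of the network are not given in the abstract; a minimal sketch of the underlying idea, a latent mastery node updated from evidential variables via Bayes' rule under the usual conditional-independence assumption, is shown below. All probability values and variable names are illustrative assumptions.

```python
# Two-layer sketch: latent node "Understanding" (of the ANOVA score model) with
# two observable evidential variables, one semantic step and one performance step.
p_understand = 0.5                               # prior P(U = mastered)
p_semantic_given = {True: 0.85, False: 0.30}     # P(semantic step correct | U)
p_perform_given  = {True: 0.80, False: 0.25}     # P(performance step correct | U)

def posterior(semantic_correct: bool, performance_correct: bool) -> float:
    """P(U = mastered | evidence), assuming the evidential variables are
    conditionally independent given U (the standard Bayesian network assumption)."""
    def likelihood(u: bool) -> float:
        ps = p_semantic_given[u] if semantic_correct else 1 - p_semantic_given[u]
        pp = p_perform_given[u] if performance_correct else 1 - p_perform_given[u]
        return ps * pp
    num = p_understand * likelihood(True)
    den = num + (1 - p_understand) * likelihood(False)
    return num / den

print(round(posterior(True, True), 3))   # evidence from both steps raises the estimate
print(round(posterior(True, False), 3))  # mixed evidence gives a weaker update
```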

Keywords: exploratory sequential design, ANOVA score model, Bayesian network model, mixed methods research design, cognitive analysis

Procedia PDF Downloads 131
29021 Familial Exome Sequencing to Decipher the Complex Genetic Basis of Holoprosencephaly

Authors: Artem Kim, Clara Savary, Christele Dubourg, Wilfrid Carre, Houda Hamdi-Roze, Valerie Dupé, Sylvie Odent, Marie De Tayrac, Veronique David

Abstract:

Holoprosencephaly (HPE) is a rare congenital brain malformation resulting from the incomplete separation of the two cerebral hemispheres. It is characterized by a wide phenotypic spectrum and a high degree of locus heterogeneity. Genetic defects in 16 genes have already been implicated in HPE, but account for only 30% of cases, suggesting that a large part of genetic factors remains to be discovered. HPE has recently been redefined as a complex multigenic disorder, requiring the joint effect of multiple mutational events in genes belonging to one or several developmental pathways. The onset of HPE may result from the accumulation of the effects of multiple rare variants in functionally-related genes, each conferring a moderate increase in the risk of HPE onset. In order to decipher the genetic basis of HPE, unconventional patterns of inheritance involving multiple genetic factors need to be considered. The primary objective of this study was to uncover possible disease-causing combinations of multiple rare variants underlying HPE by performing trio-based Whole Exome Sequencing (WES) of familial cases where no molecular diagnosis could be established. 39 families were selected with no fully-penetrant causal mutation in a known HPE gene, no chromosomal aberrations or copy number variants, and no implication of environmental factors. As the main challenge was to identify disease-related variants among the large number of nonpathogenic polymorphisms detected by the classical WES scheme, a novel variant prioritization approach was established. It combined WES filtering with complementary gene-level approaches: transcriptome-driven (RNA-Seq data) and clinically-driven (public clinical data) strategies. Briefly, a filtering approach was performed to select variants compatible with disease segregation, population frequency, and pathogenicity prediction, in order to identify an exhaustive list of rare deleterious variants. The exome search space was then reduced by restricting the analysis to candidate genes identified by either the transcriptome-driven strategy (genes sharing highly similar expression patterns with known HPE genes during cerebral development) or the clinically-driven strategy (genes associated with phenotypes of interest overlapping with HPE). Deeper analyses of candidate variants were then performed on a family-by-family basis. These included the exploration of clinical information, expression studies, variant characteristics, recurrence of mutated genes, and available biological knowledge. A novel bioinformatics pipeline was designed. Applied to the 39 families, this final integrated workflow identified an average of 11 candidate variants per family. Most of the candidate variants were inherited from asymptomatic parents, suggesting a multigenic inheritance pattern requiring the association of multiple mutational events. The manual analysis highlighted 5 new strong HPE candidate genes showing recurrence in distinct families. Functional validations of these genes are foreseen.
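The pipeline itself is not published in the abstract; a minimal sketch of the filtering step, keeping rare, predicted-deleterious variants restricted to a candidate gene list, could look like the following. The table columns, thresholds, and gene names are illustrative assumptions, not the study's actual annotations.

```python
import pandas as pd

# Illustrative variant table; real WES annotations (population frequency,
# pathogenicity scores, segregation flags) would come from an upstream annotation tool.
variants = pd.DataFrame({
    "gene":       ["GENE_A", "GENE_B", "GENE_C"],
    "pop_freq":   [0.0001, 0.02, 0.0005],   # population allele frequency
    "patho_score": [28.0, 10.0, 25.0],      # pathogenicity prediction score
    "in_proband": [True, True, True],
    "in_parent":  [True, False, True],      # inherited variants are kept, not discarded
})

# Candidate genes from the transcriptome-driven or clinically-driven strategies (assumed list).
candidate_genes = {"GENE_A", "GENE_C"}

filtered = variants[
    (variants["pop_freq"] < 0.001)            # rare in the population
    & (variants["patho_score"] >= 20)         # predicted deleterious
    & (variants["in_proband"])                # compatible with segregation in the affected child
    & (variants["gene"].isin(candidate_genes))
]
print(filtered)
```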

Keywords: complex genetic disorder, holoprosencephaly, multiple rare variants, whole exome sequencing

Procedia PDF Downloads 173
29020 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

A grid of computing nodes has emerged as a representative means of connecting distributed computers or resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to the availability of resources in a decentralized grid environment, checkpointing can be used in combination with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the OMNeT++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.
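The protocol's details are not given in the abstract; the sketch below only illustrates the basic hierarchical idea, a cluster of grid processes that takes a coordinated cluster-level checkpoint and rolls every member back to it on failure (a simplification of Chandy-Lamport-style coordinated snapshots). Class and method names are illustrative.

```python
import copy

class Process:
    def __init__(self, pid):
        self.pid = pid
        self.state = 0            # stand-in for application state

    def step(self):
        self.state += 1

class Cluster:
    """A cluster of grid nodes whose processes checkpoint together."""
    def __init__(self, processes):
        self.processes = processes
        self.checkpoint = None

    def take_checkpoint(self):
        # Coordinated cluster-level snapshot of all member processes.
        self.checkpoint = [copy.deepcopy(p.state) for p in self.processes]

    def rollback(self):
        # Restore every process in the cluster from the last cluster checkpoint.
        if self.checkpoint is None:
            raise RuntimeError("no checkpoint available")
        for p, saved in zip(self.processes, self.checkpoint):
            p.state = saved
        return len(self.processes)   # number of processes involved in the rollback

cluster = Cluster([Process(i) for i in range(3)])
for _ in range(5):
    for p in cluster.processes:
        p.step()
cluster.take_checkpoint()
for p in cluster.processes:
    p.step()                         # progress made after the checkpoint is lost on failure
print("rolled back", cluster.rollback(), "processes")
```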

Keywords: data grids, fault tolerance, clustering, Chandy-Lamport

Procedia PDF Downloads 302
29019 Evolving Software Assessment and Certification Models Using Ant Colony Optimization Algorithm

Authors: Saad M. Darwish

Abstract:

Recently, software quality issues have come to be seen as an important subject, given the enormous growth in the number of agencies involved in the software industry. However, these agencies cannot guarantee the quality of their products, thus leaving users in uncertainty. Software certification is an extension of quality assurance in the sense that quality needs to be measured prior to the certification granting process. This research contributes to solving the problem of software assessment by proposing a model for the assessment and certification of software products that uses a fuzzy inference engine to integrate both process-driven and application-driven quality assurance strategies. The key idea of the proposed model is to improve the compactness and interpretability of the model’s fuzzy rules by employing an ant colony optimization (ACO) algorithm, which searches for good rule descriptions by building compound rules from the traditional single rules in which they are initially expressed. The model has been tested in a case study, and the results have demonstrated its feasibility and practicability in a real environment.

Keywords: software quality, quality assurance, software certification model, software assessment

Procedia PDF Downloads 492
29018 Medical Knowledge Management from the Integration of Heterogeneous Data to Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management consists of acquiring and representing knowledge relevant to a domain, a task, or a specific organization in order to facilitate its access, reuse, and evolution. This usually means building, maintaining, and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is, to disseminate it so as to enable its effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share, and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous data in the medical field by creating a data warehouse, a technique for extracting knowledge from medical data by choosing a data mining technique, and finally a technique for exploiting that knowledge in a case-based reasoning system.
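The case-based reasoning component is only named in the abstract; a minimal sketch of its retrieval step, finding the most similar past case with a weighted nearest-neighbour similarity, is shown below. The attributes, weights, and cases are illustrative assumptions.

```python
import math

# Illustrative case base, e.g., produced by the data warehouse / data mining steps.
case_base = [
    {"age": 54, "systolic_bp": 150, "glucose": 1.4, "decision": "treatment A"},
    {"age": 38, "systolic_bp": 120, "glucose": 0.9, "decision": "treatment B"},
]
weights = {"age": 0.2, "systolic_bp": 0.4, "glucose": 0.4}   # assumed attribute weights

def similarity(case, query):
    """Weighted inverse-distance similarity between a stored case and a new query."""
    dist = math.sqrt(sum(weights[k] * (case[k] - query[k]) ** 2 for k in weights))
    return 1.0 / (1.0 + dist)

def retrieve(query):
    return max(case_base, key=lambda c: similarity(c, query))

new_patient = {"age": 50, "systolic_bp": 145, "glucose": 1.3}
print(retrieve(new_patient)["decision"])   # reuse the decision of the closest past case
```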

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 358
29017 The Implementation of a Nurse-Driven Palliative Care Trigger Tool

Authors: Sawyer Spurry

Abstract:

Problem: Palliative care providers at an academic medical center in Maryland stated that medical intensive care unit (MICU) patients are often referred late in their hospital stay. The MICU has performed well below the hospital quality performance metric, which requires that 80% of patients who expire with expected outcomes should have received a palliative care consult within 48 hours of admission. Purpose: The purpose of this quality improvement (QI) project is to increase palliative care utilization in the MICU through the implementation of a Nurse-Driven Palliative Trigger Tool to prompt the need for a specialty palliative care consult. Methods: MICU nursing staff and providers received education concerning the implications of underused palliative care services and the literature supporting the use of nurse-driven palliative care tools as a means of increasing utilization of palliative care. A MICU population-specific set of palliative trigger criteria (the Palliative Care Trigger Tool) was formulated by the QI implementation team, the palliative care team, and the patient care services department. Nursing staff were asked to assess patients daily for the presence of palliative triggers using the Palliative Care Trigger Tool and to present findings during bedside rounds. MICU providers were asked to consult palliative medicine, given the presence of palliative triggers, following interdisciplinary rounds. Rates of palliative consult, given the presence of triggers, were collected via electronic medical record e-data pull, de-identified, and recorded in the data collection tool. Preliminary Results: Over 140 MICU registered nurses were educated on the palliative trigger initiative, along with 8 nurse practitioners, 4 intensivists, 2 pulmonary critical care fellows, and 2 palliative medicine physicians. Over 200 patients were admitted to the MICU and screened for palliative triggers during the 15-week implementation period. Primary outcomes showed an increase in palliative care consult rates for patients presenting with triggers, a decreased mean time from admission to palliative consult, and increased recognition of unmet palliative care needs by MICU nurses and providers. Conclusions: Anticipatory findings of this QI project would suggest a positive correlation between utilizing palliative care trigger criteria and decreased time to palliative care consult. The direct outcomes of effective palliative care include decreased length of stay, healthcare costs, and moral distress, as well as improved symptom management and quality of life (QOL).

Keywords: palliative care, nursing, quality improvement, trigger tool

Procedia PDF Downloads 158
29016 Overcoming the Impacts of Covid-19 Outbreak Using Value Integrated Project Delivery Model

Authors: G. Ramya

Abstract:

Value engineering is a systematic approach, widely used to optimize the design, process, or product at the design stage. It is used to meet the client's requirements by increasing functionality and attaining the targeted cost in cost planning. The effectiveness and benefits of value engineering decrease as the project progresses, since changes in the scope of work and design account for more cost over the lifecycle of the project. Integrating value engineering with other project management activities promotes cost minimization and client satisfaction and ensures early completion of the project. Previous research studies suggested that value engineering can be integrated with other project delivery activities, but they were unable to frame a model that combines project management activities with the job plan of the value engineering approach. I analyzed various project management activities and the synergies between them. Project management activities and processes such as a) risk analysis, b) lifecycle cost analysis, c) lean construction, d) facility management, e) building information modelling, and f) contract administration were brought together, and a project delivery model was planned along with the RIBA plan of work. The key outcome of the research is a value-driven project delivery model, which will succeed in dealing with the economic impact, constraints, and conflicts arising from the COVID-19 outbreak in the Indian construction sector. The benefits associated with the structured framework, namely a construction project delivery approach that ensures early contractor involvement and mutual risk sharing and brings a project suffering cost overrun and delay back on track, are discussed.

Keywords: value-driven project delivery model, integration, RIBA plan of work

Procedia PDF Downloads 95
29015 Sun-Driven Evaporation Enhanced Forward Osmosis Process for Application in Wastewater Treatment and Pure Water Regeneration

Authors: Dina Magdy Abdo, Ayat N. El-Shazly, E. A. Abdel-Aal

Abstract:

Forward osmosis (FO) is one of the important processes in wastewater treatment systems for environmental remediation and fresh water regeneration. Both Egypt and China are burdened with millions of tons of wastewater every year, including domestic and industrial wastewater. However, the traditional FO process in wastewater treatment usually suffers from low efficiency and high energy consumption because the draw solution is continuously diluted. An additional concentration process is necessary to keep the FO separation running, causing energy waste. Based on a previous study on photothermal membranes, a sun-driven evaporation process is integrated into the draw solution side of the FO system. During the sun-driven evaporation, not only can the draw solution be concentrated to maintain a stable and sustainable FO system, but fresh water can also be directly separated for regeneration. Solar energy is the ultimate energy source of everything we have on Earth and is, without any doubt, the most renewable and sustainable energy source available to us. Additionally, the FO membrane process is rationally designed to limit concentration polarization and fouling. The FO membrane’s structure and surface properties will be further optimized by adjusting the doping ratio of controllable nano-materials, the membrane formation conditions, and the selection of functional groups. A novel kind of nano-composite functional separation membrane with bi-interception layers and high hydrophilicity will be developed for application in wastewater treatment. We therefore aim to design a new wastewater treatment system that includes forward osmosis with high-efficiency energy recovery via the integration of a photothermal membrane.

Keywords: forward osmosis, membrane, solar, water treatment

Procedia PDF Downloads 69
29014 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model

Authors: Amit R. Bhende, G. K. Awari

Abstract:

Remaining useful life (RUL) prediction is one of the key technologies for realizing prognostics and health management, which is being widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing degradation data from run to failure under three different conditions are used. A RUL prediction model is built separately for each condition. Feed-forward back-propagation neural network models are developed for prediction modeling.
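The network architecture and features are not specified in the abstract; a minimal sketch of the general approach, a feed-forward network trained with back-propagation on health-indicator features to regress RUL, is shown below using scikit-learn. The synthetic data, features, and layer sizes are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for run-to-failure data under one operating condition:
# each row is a vibration-derived health feature vector, the target is RUL in hours.
rng = np.random.default_rng(0)
X = rng.random((200, 4))                      # e.g., RMS, kurtosis, crest factor, energy
rul = 500.0 * (1.0 - X[:, 0]) + rng.normal(0, 5, 200)

# Feed-forward network trained with back-propagation (one model per operating condition).
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X, rul)
print("predicted RUL:", model.predict(X[:1])[0])
```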

Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis

Procedia PDF Downloads 408
29013 Numerical Study on Self-Confined Plasmoid Transport Phenomena in an Electrodeless Plasma Thruster for Space Propulsion

Authors: Xiaodong Wen, Lijuan Liu, Xinfeng Sun

Abstract:

A high-power electrodeless plasma thruster is being developed at the Lanzhou Institute of Physics. In this thruster, a rotating magnetic field (RMF), driven by two radio-frequency coils dephased by 90 degrees, is applied both for propellant ionization and for plasma acceleration. In the ionization stage, a very high azimuthal current can be driven by the RMF, which makes the plasma form a field-reversed configuration, namely a self-confined plasmoid. A profound understanding of the transport characteristics of the plasmoid in the subsequent acceleration stage is the key to improving the thruster performance. In this paper, a 3D MHD model is established, and the influences of the RMF and an applied magnetic field on the self-confined plasmoid acceleration are investigated. The simulation results show that, by applying an RMF with a strength of 250 G and a frequency of 370 kHz, the plasmoid can be accelerated to an average velocity of 17 km/s at the exit of the thruster.

Keywords: electric space propulsion, field reversed configuration, rotating magnetic field, transport phenomena

Procedia PDF Downloads 103
29012 Time Driven Activity Based Costing Capability to Improve Logistics Performance: Application in Manufacturing Context

Authors: Siham Rahoui, Amr Mahfouz, Amr Arisha

Abstract:

In a highly competitive environment characterised by uncertainty and disruptions, such as the recent COVID-19 outbreak, supply chains (SC) face the challenge of maintaining their costs at minimum levels while continuing to provide customers with high-quality products and services. More importantly, businesses in such an economic context strive to survive by keeping the cost of undertaken activities (such as logistics) low and in-house. To do so, managers need to understand the costs associated with different products and services in order to have a clear vision of SC performance, maintain profitability levels, and make strategic decisions. In this context, the SC literature has explored different costing models that seek to determine the costs of undertaking supply chain-related activities. While some cost accounting techniques have been extensively explored in the SC context, more contributions are needed to explore the potential of time-driven activity-based costing (TDABC). More specifically, more applications are needed in the manufacturing context of the SC, where the debate is ongoing. The aim of the study is to evaluate the capability of the technique to assess the operational performance of the logistics function. Through a case study methodology applied to a manufacturing company operating in the automotive industry, TDABC is used to evaluate the efficiency of the current configuration and its logistics processes. The study shows that monitoring process efficiency and cost efficiency leads to strategic decisions that contributed to improving the overall efficiency of the logistics processes.
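TDABC generally rests on two estimated parameters, the capacity cost rate and the unit time per activity; the worked sketch below illustrates that calculation for a logistics process. All cost, time, and volume figures are illustrative assumptions, not data from the case study.

```python
# Time-driven ABC in its standard two-parameter form (figures are illustrative).
capacity_cost = 56_000.0          # cost of resources supplied to the logistics process, per quarter
practical_capacity_min = 70_000   # practical capacity of those resources, minutes per quarter
cost_rate = capacity_cost / practical_capacity_min   # cost per minute of capacity (0.80)

# Unit time estimates per logistics activity (minutes) and quarterly volumes.
activities = {
    "receive_order": {"unit_time": 5,  "volume": 2_000},
    "pick_and_pack": {"unit_time": 12, "volume": 1_500},
    "ship":          {"unit_time": 8,  "volume": 1_500},
}

used_minutes = sum(a["unit_time"] * a["volume"] for a in activities.values())
for name, a in activities.items():
    print(name, "cost per order:", round(cost_rate * a["unit_time"], 2))
print("capacity utilisation:", round(used_minutes / practical_capacity_min, 2))  # unused capacity is made visible
```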

Keywords: efficiency, operational performance, supply chain costing, time driven activity based costing

Procedia PDF Downloads 121
29011 The Evaluation of Current Pile Driving Prediction Methods for Driven Monopile Foundations in London Clay

Authors: John Davidson, Matteo Castelletti, Ismael Torres, Victor Terente, Jamie Irvine, Sylvie Raymackers

Abstract:

The current industry approach to pile driving predictions consists of developing a model of the hammer-pile-soil system which simulates the relationship between soil resistance to driving (SRD) and blow counts (or pile penetration per blow). The SRD methods traditionally used are broadly based on static pile capacity calculations. The SRD is used in combination with a one-dimensional wave equation model to indicate the anticipated blow counts with depth for specific hammer energy settings. This approach has predominantly been calibrated on the relatively long, slender piles used in the oil and gas industry but is now being extended to allow calculations to be undertaken for relatively short, rigid, large-diameter monopile foundations. This paper evaluates the accuracy of current industry practice when applied to a site where large-diameter monopiles were installed in predominantly stiff fissured clay. Actual geotechnical and pile installation data, including pile driving records and signal matching analysis (based upon pile driving monitoring techniques), were used for the assessment of the case study site.

Keywords: driven piles, fissured clay, London clay, monopiles, offshore foundations

Procedia PDF Downloads 194
29010 Advances in Mathematical Sciences: Unveiling the Power of Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid advancements in data collection, storage, and processing capabilities have led to an explosion of data in various domains. In this era of big data, mathematical sciences play a crucial role in uncovering valuable insights and driving informed decision-making through data analytics. The purpose of this abstract is to present the latest advances in mathematical sciences and their application in harnessing the power of data analytics. This abstract highlights the interdisciplinary nature of data analytics, showcasing how mathematics intersects with statistics, computer science, and other related fields to develop cutting-edge methodologies. It explores key mathematical techniques such as optimization, mathematical modeling, network analysis, and computational algorithms that underpin effective data analysis and interpretation. The abstract emphasizes the role of mathematical sciences in addressing real-world challenges across different sectors, including finance, healthcare, engineering, social sciences, and beyond. It showcases how mathematical models and statistical methods extract meaningful insights from complex datasets, facilitating evidence-based decision-making and driving innovation. Furthermore, the abstract emphasizes the importance of collaboration and knowledge exchange among researchers, practitioners, and industry professionals. It recognizes the value of interdisciplinary collaborations and the need to bridge the gap between academia and industry to ensure the practical application of mathematical advancements in data analytics. The abstract highlights the significance of ongoing research in mathematical sciences and its impact on data analytics. It emphasizes the need for continued exploration and innovation in mathematical methodologies to tackle emerging challenges in the era of big data and digital transformation. In summary, this abstract sheds light on the advances in mathematical sciences and their pivotal role in unveiling the power of data analytics. It calls for interdisciplinary collaboration, knowledge exchange, and ongoing research to further unlock the potential of mathematical methodologies in addressing complex problems and driving data-driven decision-making in various domains.

Keywords: mathematical sciences, data analytics, advances, unveiling

Procedia PDF Downloads 61
29009 Ontology-Driven Generation of Radiation Protection Procedures

Authors: Chamseddine Barki, Salam Labidi, Hanen Boussi Rahmouni

Abstract:

In this article, we present the principle and a suitable methodology for the design of a medical ontology that captures radiological and dosimetric knowledge applied in diagnostic radiology and radiation therapy. Our ontology, which we named «Onto.Rap», addresses radiation protection in medical and radiology centers by providing standardized regulatory oversight. Thanks to its added value in knowledge sharing, reuse, and ease of maintenance, this ontology tends to solve many problems, among them the confusion between radiological procedures a practitioner might face while performing a patient radiological exam, as well as the difficulties they might have in interpreting applicable patient radioprotection standards. Here, the ontology, thanks to its concept simplification and expressiveness capabilities, can ensure an efficient classification of radiological procedures. It also provides an explicit representation of the relations between the different components of the studied concept. In fact, an ontology-based radioprotection expert system, when used in a radiological center, could implement systematic radioprotection best practices during patient exams and a regulatory compliance auditing service afterwards.
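The published ontology itself is not reproduced in the abstract; the sketch below only illustrates how such concepts and relations could be expressed in RDF/OWL with the rdflib library. The namespace, class names, and property are illustrative assumptions, not the actual Onto.Rap vocabulary.

```python
from rdflib import Graph, Namespace, RDF, RDFS
from rdflib.namespace import OWL

ONTO = Namespace("http://example.org/onto-rap#")   # placeholder namespace
g = Graph()
g.bind("ontorap", ONTO)

# Illustrative classes: radiological procedures and the dose constraints governing them.
g.add((ONTO.RadiologicalProcedure, RDF.type, OWL.Class))
g.add((ONTO.ChestCT, RDF.type, OWL.Class))
g.add((ONTO.ChestCT, RDFS.subClassOf, ONTO.RadiologicalProcedure))
g.add((ONTO.DoseConstraint, RDF.type, OWL.Class))

# A relation linking a procedure to the radioprotection rule it must satisfy.
g.add((ONTO.hasDoseConstraint, RDF.type, OWL.ObjectProperty))
g.add((ONTO.hasDoseConstraint, RDFS.domain, ONTO.RadiologicalProcedure))
g.add((ONTO.hasDoseConstraint, RDFS.range, ONTO.DoseConstraint))

print(g.serialize(format="turtle"))
```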

Keywords: knowledge, ontology, radiation protection, radiology

Procedia PDF Downloads 283
29008 Sun-Driven Evaporation Enhanced Forward Osmosis Process for Application in Wastewater Treatment and Pure Water Regeneration

Authors: Dina Magdy Abdo, Ayat N. El-Shazly, Hamdy Maamoun Abdel-Ghafar, E. A. Abdel-Aal

Abstract:

Forward osmosis (FO) is one of the important processes in wastewater treatment systems for environmental remediation and fresh water regeneration. Both Egypt and China are burdened with millions of tons of wastewater every year, including domestic and industrial wastewater. However, the traditional FO process in wastewater treatment usually suffers from low efficiency and high energy consumption because the draw solution is continuously diluted. An additional concentration process is necessary to keep the FO separation running, causing energy waste. Based on a previous study on photothermal membranes, a sun-driven evaporation process is integrated into the draw solution side of the FO system. During the sun-driven evaporation, not only can the draw solution be concentrated to maintain a stable and sustainable FO system, but fresh water can also be directly separated for regeneration. Solar energy is the ultimate energy source of everything we have on Earth and is, without any doubt, the most renewable and sustainable energy source available to us. Additionally, the FO membrane process is rationally designed to limit concentration polarization and fouling. The FO membrane’s structure and surface properties will be further optimized by adjusting the doping ratio of controllable nano-materials, the membrane formation conditions, and the selection of functional groups. A novel kind of nano-composite functional separation membrane with bi-interception layers and high hydrophilicity will be developed for application in wastewater treatment. We therefore aim to design a new wastewater treatment system that includes forward osmosis with high-efficiency energy recovery via the integration of a photothermal membrane.

Keywords: forward osmosis, membrane, solar, water treatment

Procedia PDF Downloads 55
29007 Applying Program Theory-Driven Approach to Design and Evaluate a Teacher Professional Development Program

Authors: S. C. Lin, M. S. Wu

Abstract:

Japanese scholar Manabu Sato has been advocating the Learning Community, which changed Japanese fundamental education during the last three decades. It was also called a “Quiet Revolution.” Manabu Sato criticized traditional education for focusing only on individual competition, exams, teacher-centered instruction, and memorization, leaving students lacking learning motivation. Therefore, Manabu Sato proclaimed that learning should be a sustainable process of “constantly weaving the relationship and the meanings” by having dialogues with learning materials, with peers, and with oneself. For a long time, secondary school education in Taiwan has been focused on exams and has emphasized reciting and memorizing, and the phenomenon of “giving up learning” has appeared among some students. Manabu Sato’s learning community program has been implemented very successfully in Japan. It is worth exploring whether the learning community can resolve the “escape from learning” phenomenon among secondary school students in Taiwan. This study was the first year of a two-year project. The project applied a program theory-driven approach to evaluating the impact of teachers’ professional development interventions on students’ learning by using a mix of methods, qualitative inquiry, and quasi-experimental design. The current study reports the results of using a theory-driven approach to program planning in order to design and evaluate a teacher professional development program (TPDP). Manabu Sato’s learning community theory was applied to structure all components of a 54-hour workshop. The participants consisted of seven secondary school science teachers from two schools. The research procedure comprised: 1) defining the problem and assessing participants’ needs; 2) selecting the theoretical framework; 3) determining theory-based goals and objectives; 4) designing the TPDP intervention; 5) implementing the TPDP intervention; 6) evaluating the TPDP intervention. Data were collected from a number of different sources, including a TPDP checklist, activity responses from the workshop, an LC subject matter test, teachers’ e-portfolios, course design documents, and a teachers’ belief survey. The major findings indicated that the program design was suitable for the participants. More than 70% of the participants were satisfied with the program implementation. They revealed that the TPDP was beneficial to their instruction and promoted their professional capacities. However, due to heavy teaching loads during the project, some participants were unable to attend all workshops. To resolve this problem, the author offered them the options of watching DVDs or reading articles provided by the research team. This study also established a communication platform for participants to share their thoughts and learning experiences. The TPDP had marked impacts on participants’ teaching beliefs. They believe that learning should be a sustainable process of “constantly weaving the relationship and the meanings” by having dialogues with learning materials, with peers, and with oneself. Having learned from the TPDP, they applied a “learner-centered” approach and instructional strategies, such as learning by doing, collaborative learning, and reflective learning, to design their courses. To conclude, participants’ beliefs, knowledge, and skills were promoted by the program.

Keywords: program theory-driven approach, learning community, teacher professional development program, program evaluation

Procedia PDF Downloads 287
29006 A Data-Driven Agent Based Model for the Italian Economy

Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio

Abstract:

We develop a data-driven agent based model (ABM) for the Italian economy. We calibrate the model for the initial condition and parameters. As a preliminary step, we replicate the Monte-Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of disequilibrium patterns arising in the search and matching process for final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial condition approach. We perform a robustness analysis perturbing the system for different parameter setups. We explore the empirical properties of the model using a rolling window forecast exercise from 2010 to 2022 to observe the model’s forecasting ability in the wake of the COVID-19 pandemic. We perform an analysis of the properties of the model with a different number of agents, that is, with different scales of the model compared to the real economy. The model generally displays transient dynamics that properly fit macroeconomic data regarding forecasting ability. We stress the model with a large set of shocks, namely interest policy, fiscal policy, and exogenous factors, such as external foreign demand for export. In this way, we can explore the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence to stress the economy and observe the long-run projections. In this way, we can include in the model the generation of endogenous crisis due to the implied structural change, technological unemployment, and potential lack of aggregate demand creating the condition for cyclical endogenous crises reproduced in this artificial economy.
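The authors' model is far richer than the abstract can convey; the sketch below only illustrates the generic ABM building block it rests on, a search-and-matching goods market between heterogeneous households and firms. All agent attributes, parameters, and market rules are illustrative assumptions, not the calibrated Italian model.

```python
import random

random.seed(1)

class Household:
    def __init__(self):
        self.income = random.uniform(1_000, 3_000)
        self.consumption = 0.0

class Firm:
    def __init__(self):
        self.price = random.uniform(0.9, 1.1)
        self.sales = 0.0

def goods_market_step(households, firms, propensity=0.8):
    """One period of a search-and-matching goods market:
    each household visits a few random firms and buys from the cheapest."""
    for h in households:
        budget = propensity * h.income
        visited = random.sample(firms, k=3)
        seller = min(visited, key=lambda f: f.price)
        h.consumption = budget / seller.price
        seller.sales += budget

households = [Household() for _ in range(1_000)]
firms = [Firm() for _ in range(50)]
for period in range(4):
    for f in firms:
        f.sales = 0.0
    goods_market_step(households, firms)
print("aggregate demand:", round(sum(f.sales for f in firms), 2))
```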

Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data

Procedia PDF Downloads 33
29005 Conceptualizing the Knowledge to Manage and Utilize Data Assets in the Context of Digitization: Case Studies of Multinational Industrial Enterprises

Authors: Martin Böhmer, Agatha Dabrowski, Boris Otto

Abstract:

The trend of digitization significantly changes the role of data for enterprises. Data turn from an enabler to an intangible organizational asset that requires management and qualifies as a tradeable good. The idea of a networked economy has gained momentum in the data domain as collaborative approaches for data management emerge. Traditional organizational knowledge consequently needs to be extended by comprehensive knowledge about data. The knowledge about data is vital for organizations to ensure that data quality requirements are met and data can be effectively utilized and sovereignly governed. As this specific knowledge has been paid little attention to so far by academics, the aim of the research presented in this paper is to conceptualize it by proposing a “data knowledge model”. Relevant model entities have been identified based on a design science research (DSR) approach that iteratively integrates insights of various industry case studies and literature research.

Keywords: data management, digitization, industry 4.0, knowledge engineering, metamodel

Procedia PDF Downloads 326
29004 Sales Patterns Clustering Analysis on Seasonal Product Sales Data

Authors: Soojin Kim, Jiwon Yang, Sungzoon Cho

Abstract:

As a seasonal product is only in demand for a short time, inventory management is critical to profits. Both markdowns and stockouts decrease the return on perishable products; therefore, researchers have been interested in the distribution of seasonal products with the aim of maximizing profits. In this study, we propose a data-driven seasonal product sales pattern analysis method for individual retail outlets based on observed sales data clustering; the proposed method helps in determining distribution strategies.
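The paper's features and cluster count are not given in the abstract; a minimal sketch of the general approach, clustering per-outlet seasonal sales curves by their shape with k-means, is shown below on synthetic data. The curve shapes and the choice of two clusters are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Synthetic weekly sales curves for 60 outlets over a 12-week season:
# half of the outlets peak early in the season, half peak late.
rng = np.random.default_rng(0)
weeks = np.arange(12)
early = np.exp(-0.5 * ((weeks - 3) / 2.0) ** 2)
late = np.exp(-0.5 * ((weeks - 9) / 2.0) ** 2)
sales = np.vstack([early + rng.normal(0, 0.05, 12) for _ in range(30)] +
                  [late + rng.normal(0, 0.05, 12) for _ in range(30)])

# Normalise each outlet's curve so clusters reflect the *shape* of demand, not its volume.
X = StandardScaler().fit_transform(sales.T).T
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(labels))   # sizes of the two sales-pattern clusters
```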

Keywords: clustering, distribution, sales pattern, seasonal product

Procedia PDF Downloads 568
29003 A Grounded Theory on Marist Spirituality/Charism from the Perspective of the Lay Marists in the Philippines

Authors: Nino M. Pizarro

Abstract:

To the author’s knowledge, despite the written documents about Marist spirituality/charism, no clear theoretical framework has been developed that highlights Marist spirituality/charism from the perspective or lived experience of the lay Marists of St. Marcellin Champagnat. The participants of the study are lay Marist educators from Marist schools in the Philippines. Since the study seeks to uncover the respondents’ own concepts and meanings of Marist spirituality/charism, a qualitative methodology is the approach used in the study. In particular, the study will use the qualitative methods of Barney Glaser. The theory will be generated systematically from data collection, coding, and analysis through memoing, theoretical sampling, sorting, and writing, using the constant comparative method. The data collection method employed in this grounded theory research is the semi-structured, participant-driven in-depth interview. Data collection will be done through purposive snowball sampling. The study aims to arrive at a theoretical framework that will help lay Marists deepen their understanding of the Marist spirituality/charism and their vocation as lay partners of the Marist Brothers of the Schools.

Keywords: grounded theory, Lay Marists, lived experience, Marist spirituality/charism

Procedia PDF Downloads 277
29002 Conceptualising Queercide: A Quantitative Desktop Exploration of the Technical Frames Used in Online Reports of Lesbian Killings in South Africa

Authors: Marchant Van Der Schyff

Abstract:

South Africa remains one of the most dangerous places for women – lesbians in particular – to live freely and safely, where a culture of patriarchy and a lack of socio-economic opportunity are ubiquitous throughout its communities. While the Internet has given a wider platform to insights into the issues plaguing lesbians, very little information exists regarding the elements used in the construction of these online reports. This is due not only to the lack of language required to contextualise lesbian issues, but also to persistent institutional and societal homophobia. This article describes the technical frames used in online news reporting of four case studies of ‘queercide’. Using a thematic coding sheet, data were collected from 70 online articles purposively selected based on a priori population characteristics. The study identified technical elements in the coded online articles, such as the length of the online reports, the credible sources used, and ‘code-driven’ and ‘user-driven’ elements. From the conclusions, some clear trends emerged, enabling the construction of a Venn-type diagram that presents insights into how the murder of lesbians (referred to as ‘queercide’ in the article) is being reported on by online news media compared with contemporary theoretical discussions of how these cases should be reported.

Keywords: journalism, lesbian murder, queercide, technical frames, reporting, online

Procedia PDF Downloads 48
29001 The Effect of Mean Pressure on the Performance of a Low-Grade Heat-Driven Thermoacoustic Cooler

Authors: Irna Farikhah

Abstract:

Converting low-grade waste heat into useful energy, such as sound energy that can then be used to generate acoustic power in a thermoacoustic engine, has become an attractive topic for researchers. The power generated in a thermoacoustic engine can be used to drive a thermoacoustic cooler when both are installed in a tube. This cooler system can be called a heat-driven thermoacoustic cooler. In this study, a low heating temperature of the engine is discussed. In addition, achieving a high efficiency of the whole cooler is also essential. To design a thermoacoustic cooler with high efficiency while using low-grade waste heat for the engine, the effect of the mean pressure is investigated. By increasing the mean pressure, the heating temperature required to generate acoustic power can be decreased from 557 °C to 300 °C. Moreover, the efficiencies of the engine and cooler regenerators attain 67% and 47% of their upper limit values, respectively, and 49% of the acoustic work generated by the engine regenerator is utilized in the cooler regenerator. As a result, the efficiency of the whole cooler becomes 15% of the upper limit value.
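One way to read the reported figures, though the abstract does not state it explicitly, is that the whole-cooler value is approximately the product of the three intermediate ones; under that assumption, a quick check reproduces the stated 15%:

```python
# Reported component figures (fractions of their respective upper-limit values).
eta_engine_regen = 0.67   # engine regenerator efficiency
eta_cooler_regen = 0.47   # cooler regenerator efficiency
work_utilised    = 0.49   # share of the engine's acoustic work used by the cooler

# Assumed multiplicative reading of the chain engine -> acoustic work -> cooler.
eta_whole_cooler = eta_engine_regen * work_utilised * eta_cooler_regen
print(round(eta_whole_cooler, 3))   # ~0.154, i.e. about 15% of the upper limit
```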

Keywords: cooler, mean pressure, performance, thermoacoustic

Procedia PDF Downloads 230
29000 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles

Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis

Abstract:

Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing across the entire government to deliver integrated (big) data-related services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services, such as data storage and hosting, to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We present a graphical view of the actors, their roles, and their relationships in the government (big) data ecosystem. We also discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we drew ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data.

Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review

Procedia PDF Downloads 128
28999 Beyond Diagnosis: Innovative Instructional Methods for Children with Multiple Disabilities

Authors: Patricia Kopetz

Abstract:

Too often, our youngest children with disabilities receive diagnostic labels and accompanying treatment plans based upon perceptions that the children are of limited aptitude and/or ambition. However, children of varied ability levels who are diagnosed with ‘multiple disabilities’ can participate and excel in school-based instruction that aligns with their desires, interests, and fortitude – criteria not foretold by scores on standardized assessments. The paper represents theoretical work in innovative instruction in special education and includes research materials, some developed by the author herself. The majority of students with disabilities are now served in general education settings in the United States, embracing inclusive practices in our schools. ‘There is now a stronger call for special education to step up and improve efficiency, implement evidence-based practices, and provide greater accountability on key performance indicators that support successful academic and post-school outcomes for students with disabilities.’ For example, in the United States, the Office of Special Education Programs (OSEP) is focusing on results-driven indicators to improve outcomes for students with disabilities. School personnel appreciate the implications of research-driven approaches for students diagnosed with multiple disabilities and aim to align their practices with this focus. The paper presented will provide updates on current theoretical principles and perspectives, and explore advancements in the latest evidence-based and results-driven instructional practices that can motivate children with multiple disabilities to advance their skills and engage in learning activities that are nonconventional, innovative, and proven successful.

Keywords: childhood special education, educational technology, innovative instruction, multiple disabilities

Procedia PDF Downloads 221
28998 Desalination Performance of a Passive Solar-Driven Membrane Distiller: Effect of Middle Layer Material and Thickness

Authors: Glebert C. Dadol, Pamela Mae L. Ucab, Camila Flor Y. Lobarbio, Noel Peter B. Tan

Abstract:

Water scarcity is a global problem, and membrane-based desalination technologies are one of the promising solutions to this problem. In this study, a passive solar-driven membrane distiller was fabricated and tested for its desalination performance. The distiller was composed of a TiNOX plate solar absorber, cellulose-based upper and lower hydrophilic layers, a hydrophobic middle layer, and aluminum heatsinks. The effect of the middle layer material and thickness on the desalination performance was investigated in terms of distillate productivity and salinity. The materials used for the middle layer were a screen mesh (2 mm, 4 mm, 6 mm thickness) to generate an air gap, a PTFE membrane (0.3 mm thickness), and a combination of the screen mesh and the PTFE membrane (2.3 mm total thickness). Salt water (35 g/L NaCl) was desalinated using the distiller in a rooftop setting at the University of San Carlos, Cebu City, Philippines. The highest distillate productivity of 1.08 L/m2-h was achieved using a 2-mm screen mesh (air gap), but it also resulted in a high distillate salinity of 25.20 g/L. Increasing the thickness of the air gap lowered the distillate salinity but also decreased the distillate productivity. The lowest salinity of 1.07 g/L was achieved using a 6-mm air gap, but the productivity was reduced to 0.08 L/m2-h. The use of the hydrophobic PTFE membrane increased the productivity (0.44 L/m2-h) compared to a 6-mm air gap but produced a distillate with high salinity (16.68 g/L). When using a combination of the screen mesh and the PTFE membrane, the productivity was 0.13 L/m2-h with a distillate salinity of 1.61 g/L. The distiller with a thick air gap as the middle layer can deliver a distillate with low salinity and is preferred over a thin hydrophobic PTFE membrane. The use of a combination of the air gap and PTFE membrane slightly increased the productivity with comparable distillate salinity. Modifications and optimizations of the distiller can be made to further improve its performance.

Keywords: desalination, membrane distillation, passive solar-driven membrane distiller, solar distillation

Procedia PDF Downloads 84
28997 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and different probabilities. The collection, collation, and collaboration of existing data, in order to incorporate them in analysis and design for a given prospect evaluation, would be a reliable, practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data to automatically learn and improve from them without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) has been proposed in this model. Subsequently, the stages of the model workflow methodology are described. In order to train the data and deploy the LoK models, an ML platform has been implemented. IBM Watson Studio, as a leading data science tool and data-driven cloud integration ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
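The paper's exact three-dimensional matrix is not defined in the abstract; the sketch below only illustrates one way a level-of-knowledge axis could extend the classical likelihood × consequence rating, scaling the score up when the supporting knowledge is poor. The scales and the adjustment rule are illustrative assumptions, not the authors' definition.

```python
# Ordinal scales (1 = lowest). LoK = level of knowledge supporting the estimate.
def risk_rating(likelihood, consequence, lok, max_scale=5):
    """Classical 2D risk-matrix score, scaled up when the supporting knowledge is poor,
    so poorly understood hazards are not under-prioritised."""
    base = likelihood * consequence                     # conventional risk matrix score
    knowledge_penalty = (max_scale + 1 - lok) / max_scale
    return base * (1.0 + knowledge_penalty)

# Example: a roof-fall hazard with moderate likelihood, high consequence, low knowledge.
print(risk_rating(likelihood=3, consequence=4, lok=2))  # 3*4 * (1 + 0.8) = 21.6
```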

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 242
28996 Numerical Study of Mixed Convection Coupled to Radiation in a Square Cavity with a Lid-Driven

Authors: Belmiloud Mohamed Amine, Sad Chemloul Nord-Eddine

Abstract:

In this study, we numerically investigated heat transfer by mixed convection coupled to radiation in a square cavity whose upper horizontal wall is movable. The purpose of this study is to examine the influence of the emissivity and of varying the Richardson number on the average Nusselt number. The vertical walls of the cavity are differentially heated: the left wall is maintained at a uniform temperature higher than that of the right wall, and the two horizontal walls are adiabatic. The finite volume method is used for solving the dimensionless governing equations. The emissivity values used in this study range between 0 and 1, and the Richardson number ranges from 0.1 to 10. The Rayleigh number is fixed at Ra = 10000, and the Prandtl number is maintained constant at Pr = 0.71. Streamlines, isothermal lines, and the average Nusselt number are presented as functions of the surface emissivity. The results of this study show that the Richardson number and the emissivity affect the average Nusselt number.
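For lid-driven mixed convection the Richardson number is typically defined as Ri = Gr/Re² with Gr = Ra/Pr, so sweeping Ri at fixed Ra and Pr amounts to changing the lid Reynolds number. The quick illustration below uses that standard definition, which the abstract itself does not spell out.

```python
Ra, Pr = 1.0e4, 0.71
Gr = Ra / Pr                       # Grashof number implied by the fixed Ra and Pr
for Ri in (0.1, 1.0, 10.0):
    Re = (Gr / Ri) ** 0.5          # lid Reynolds number needed to reach this Richardson number
    print(f"Ri = {Ri:>4}:  Re ~ {Re:.1f}")
```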

Keywords: mixed convection, square cavity, wall emissivity, lid-driven, numerical study

Procedia PDF Downloads 299
28995 Dual Metal Organic Framework Derived N-Doped Fe3C Nanocages Decorated with Ultrathin ZnIn2S4 Nanosheets for Efficient Photocatalytic Hydrogen Generation

Authors: D. Amaranatha Reddy

Abstract:

Highly efficient and stable co-catalyst materials are of great importance for boosting the separation and transport of photogenerated charge carriers and for accelerating the catalytic reactions at the reactive sites of semiconductor photocatalysts. It is therefore of decisive importance to fabricate low-cost, noble-metal-free co-catalysts with high catalytic reactivity, but this remains very challenging. Considering this challenge, dual metal-organic framework derived N-doped Fe3C nanocages have been rationally designed and decorated with ultrathin ZnIn2S4 nanosheets for efficient photocatalytic hydrogen generation. The fabrication strategy precisely integrates the co-catalyst nanocages with the ultrathin two-dimensional (2D) semiconductor nanosheets by providing tightly interconnected nano-junctions, which helps to suppress the charge carrier recombination rate. Furthermore, the constructed highly porous hybrid structures expose ample active sites for catalytic reduction reactions and harvest visible light more effectively by light scattering. As a result, the fabricated nanostructures exhibit a superior solar-driven hydrogen evolution rate (9600 µmol/g/h) with an apparent quantum efficiency of 3.6%, which is higher than that of Pt noble metal co-catalyst systems and earlier reported ZnIn2S4-based nanohybrids. We believe that the present work promotes the application of sulfide-based nanostructures in solar-driven hydrogen production.

Keywords: photocatalysis, water splitting, hydrogen fuel production, solar-driven hydrogen

Procedia PDF Downloads 105
28994 Numerical Analysis of a Pilot Solar Chimney Power Plant

Authors: Ehsan Gholamalizadeh, Jae Dong Chung

Abstract:

A solar chimney power plant is a feasible solar thermal system which produces electricity from the Sun. The objective of this study is to investigate buoyancy-driven flow and heat transfer through a built pilot solar chimney system called the 'Kerman Project'. The system has a chimney with a height of 60 m and a diameter of 3 m, the average radius of its solar collector is about 20 m, and its average collector height is about 2 m. A three-dimensional simulation was conducted to analyze the system using computational fluid dynamics (CFD). In this model, the radiative transfer equation was solved using the discrete ordinates (DO) radiation model, taking into account non-gray radiation behavior. In order to model solar irradiation from the sun’s rays, a solar ray tracing algorithm was coupled to the computation via a source term in the energy equation. The model was validated by comparison with the experimental data of the Manzanares prototype and with the performance of the built pilot system. Then, based on the numerical simulations, the velocity and temperature distributions through the system, the temperature profile of the ground surface, and the system performance were presented. The analysis accurately captures the flow and heat transfer characteristics of the pilot system and predicts its performance.

Keywords: buoyancy-driven flow, computational fluid dynamics, heat transfer, renewable energy, solar chimney power plant

Procedia PDF Downloads 227