Search results for: integrated continuous improvement framework
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13233

12393 Transdisciplinary Pedagogy: An Arts-Integrated Approach to Promote Authentic Science, Technology, Engineering, Arts, and Mathematics Education in Initial Teacher Education

Authors: Anne Marie Morrin

Abstract:

This paper focuses on the design, delivery and assessment of a transdisciplinary STEAM (Science, Technology, Engineering, Arts, and Mathematics) education initiative in a college of education in Ireland. The project explores a transdisciplinary approach to supporting STEAM education in which the concepts, methodologies and assessments employed derive from visual art sessions within initial teacher education. The research demonstrates that the STEAM education approach is effective when visual art concepts and methods are placed at the core of the teaching and learning experience. Within this study, emphasis is placed on authentic collaboration and transdisciplinary pedagogical approaches across the STEAM subjects. The partners included a combination of teaching expertise in STEM and visual arts education, artists, in-service and pre-service teachers, and children. The inclusion of all these stakeholders moves towards a more authentic approach in which transdisciplinary practice is at the core of the teaching and learning. Qualitative data were collected using a combination of questionnaires (focused and open-ended questions) and focus groups. In addition, data were collected through video diaries in which students reflected on their visual journals and transdisciplinary practice, giving rich insight into participants' experiences and opinions of their learning. It was found that an effective programme of STEAM education integration was informed by co-teaching (continuous professional development), which involved a commitment to adaptable and flexible approaches to teaching, learning, and assessment, as well as continuous reflection-in-action by all participants. The transdisciplinary model of STEAM education was devised to reconceptualise how individual subject areas can develop essential skills and tackle critical issues (such as self-care and climate change) through data visualisation and technology. The success of the project can be attributed to a collaboration that was inclusive and flexible, and to the willingness of the various stakeholders to be involved in the design and implementation of the project from conception to completion. The case study approach taken is particularistic (focusing on the STEAM-ED project), descriptive (providing in-depth descriptions from varied and multiple perspectives), and heuristic (interpreting the participants' experiences and the meaning they attributed to them).

Keywords: collaboration, transdisciplinary, STEAM, visual arts education

Procedia PDF Downloads 46
12392 Implications of Creating a 3D Vignette as a Reflective Practice for Continuous Professional Development of Foreign Language Teachers

Authors: Samiah H. Ghounaim

Abstract:

The topic of this paper is significant because of the increasing need for intercultural training for foreign language teachers, given the continuous challenges they face in their diverse classrooms. First, the structure of the intercultural training programme is briefly described, and the structure of a 3D vignette and its intended purposes are elaborated. In this first stage, the programme was designed and implemented over a period of three months with a group of local and expatriate foreign language teachers/practitioners at a university in the Middle East. After that, a set of primary data collected during the first stage of this research on the design and co-construction process of a 3D vignette is reviewed and analysed in depth. Each practitioner worked a personal incident into a 3D vignette, with each dimension of the vignette viewing the same incident from a different perspective. Finally, the results and implications of having participants construct their personal incidents into 3D vignettes as a reflective practice are discussed in detail, along with possible extensions of the research. This process proved to be an effective reflective practice in which participants were stimulated to view their incidents in a different light. Co-constructing one's own critical incidents (whether a positive experience or not) into a structured 3D vignette encouraged participants to decentralise themselves from the incidents, creating a personal reflective space in which they had the opportunity to see different potential outcomes for each incident and to prepare for the reflective discussion of their vignette with their peers. This provides implications for future developments in reflective writing practices and possibilities for educators' continuous professional development (CPD).

Keywords: 3D vignettes, intercultural competence training, reflective practice, teacher training

Procedia PDF Downloads 102
12391 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model capable of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model for evaluating vegetative resistance from hydraulic and vegetative parameters was incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing its capability for accurate hydrologic studies.
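The driver of crop development in the original EPIC plant growth model is heat-unit accumulation. The following is a minimal illustrative sketch of that idea, not the WRM or EPIC source code; the parameter names, base temperature, and potential heat units (PHU) value are assumptions for illustration.

```python
# EPIC-style heat-unit accumulation: crop development is paced by mean daily
# temperature above a crop-specific base temperature. Values are illustrative.

def daily_heat_units(t_max, t_min, t_base):
    """Heat units for one day: mean temperature above the base, floored at zero."""
    return max(0.0, (t_max + t_min) / 2.0 - t_base)

def heat_unit_index(daily_temps, t_base, phu):
    """Fraction of the potential heat units (PHU) accumulated so far;
    1.0 marks physiological maturity."""
    accumulated = sum(daily_heat_units(t_hi, t_lo, t_base)
                      for t_hi, t_lo in daily_temps)
    return min(1.0, accumulated / phu)

# Example: three warm days for a crop with base temperature 8 degC and PHU 1500
week = [(28.0, 16.0), (30.0, 18.0), (27.0, 15.0)]
hui = heat_unit_index(week, t_base=8.0, phu=1500.0)
```

In a full subroutine, the heat unit index would in turn drive leaf area and biomass development, which is what would supply the seasonal roughness and resistance parameters to the WRM model.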

Keywords: crop yield, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 403
12390 Automated Ultrasound Carotid Artery Image Segmentation Using Curvelet Threshold Decomposition

Authors: Latha Subbiah, Dhanalakshmi Samiappan

Abstract:

In this paper, we propose denoising Common Carotid Artery (CCA) B-mode ultrasound images by a curvelet thresholding decomposition approach, together with automatic segmentation of the intima-media thickness and the adventitia boundary. Through decomposition, the local geometry of the image and the direction of its gradients are well preserved. The components are combined into a single vector-valued function, which removes noise patches. A double threshold is applied to inherently remove speckle noise from the image. The denoised image is segmented by active contours without specifying seed points. Combined with level set theory, these provide sub-regions with continuous boundaries. The deformable contours match the shapes and motion of objects in the images. A curve or surface under constraints is evolved from the image so that it is pulled towards the necessary features of the image. Region-based and boundary-based information are integrated to obtain the contour. The method treats the multiplicative speckle noise in both objective and subjective quality measurements and thus leads to better-segmented results. The proposed denoising method gives better performance metrics than other state-of-the-art denoising algorithms.
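The double-threshold rule can be sketched on a generic list of transform coefficients. This is a simplified assumption of the step described above, not the paper's implementation: a full version would operate on curvelet subband coefficients, and the threshold values here are arbitrary.

```python
# Double thresholding of transform coefficients: coefficients below the low
# threshold are treated as speckle noise and suppressed, coefficients above
# the high threshold are kept as strong features, and the ambiguous band in
# between is soft-shrunk. Thresholds below are illustrative assumptions.

def double_threshold(coeffs, t_low, t_high):
    out = []
    for c in coeffs:
        mag = abs(c)
        if mag < t_low:
            out.append(0.0)                    # noise: suppress entirely
        elif mag >= t_high:
            out.append(float(c))               # strong feature: keep as-is
        else:
            sign = 1.0 if c >= 0 else -1.0
            out.append(sign * (mag - t_low))   # ambiguous band: soft-shrink
    return out

denoised = double_threshold([0.2, -0.6, 3.0, -1.4], t_low=0.5, t_high=2.0)
```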

Keywords: curvelet, decomposition, level set, ultrasound

Procedia PDF Downloads 337
12389 Study of the Phenomenon of Collapse and Buckling the Car Body Frame

Authors: Didik Sugiyanto

Abstract:

A condition that often occurs in a vehicle frame is a collision with another object; an example of the resulting damage is to the frame or chassis, so the frame design must be able to absorb impact energy. The characteristics of a material are influenced by its stiffness, which needs to be considered when choosing material properties. To obtain material properties matched to the experimental conditions, tensile and compression tests were carried out. This study focused on the chassis at angles of 15°, 30°, and 45°, based on field studies showing that vehicles, primarily freight cars, have frame angles between 15° and 45°. The research methods included tool design, frame design, procurement of materials and experimental equipment, tool making, manufacture of the test frame, and the testing process; the experiment measured the compressive force the frame can sustain. From these tests, the maximum force at an angle of 15° was 569.76 kg at a distance of 16 mm; at 30°, 370.3 kg at a distance of 15 mm; and at 45°, 391.71 kg at a distance of 28 mm. After reaching the maximum force, the frame collapses, followed by a decrease in force over the subsequent distance. It can be concluded that the greatest strain energy occurs at an angle of 15°, so the frame at 15° provides the best level of security.

Keywords: buckling, collapse, body frame, vehicle

Procedia PDF Downloads 574
12388 A General Framework for Knowledge Discovery from Echocardiographic and Natural Images

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the feature vector database that has been efficiently constructed, one can perform various data mining tasks such as clustering and classification with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.
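One of the data-mining tasks the framework supports over the feature vector database is clustering. The following is a minimal k-means sketch under stated assumptions: the real framework uses parallel algorithms and image-derived features, whereas the toy 2-D vectors and first-k-points initialization here are illustrative choices.

```python
# Minimal k-means over feature vectors: alternate between assigning each
# vector to its nearest center and moving each center to the mean of its
# assigned vectors. Initialization and data are illustrative assumptions.
import math

def kmeans(points, k, iters=10):
    centers = [points[i] for i in range(k)]   # first k points as initial centers
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:                      # assignment step
            j = min(range(k), key=lambda i: math.dist(p, centers[i]))
            groups[j].append(p)
        for i, g in enumerate(groups):        # update step
            if g:
                centers[i] = tuple(sum(c) / len(g) for c in zip(*g))
    return centers, groups

# Two well-separated toy "feature vectors" per cluster
pts = [(0.1, 0.2), (5.1, 5.0), (0.0, 0.1), (4.9, 5.2)]
centers, groups = kmeans(pts, k=2)
```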

Keywords: active contour, Bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 435
12387 A Theoretical Framework for Design Theories in Mobile Learning: A Higher Education Perspective

Authors: Paduri Veerabhadram, Antoinette Lombard

Abstract:

In this paper, a framework for hypothesizing about mobile learning to complement theories of formal and informal learning is presented. Activity theory forms the main theoretical lens through which the elements involved in formal and informal learning for mobile learning are explored, specifically in relation to a context-aware mobile learning application. The authors believe that the complexity of the relationships involved can best be analysed using activity theory. As a social, cultural and activity-based theory, activity theory can be used as a mobile learning framework in an academic environment and to develop an optimal artifact through investigation of a system's inherent contradictions. As such, it serves as a powerful modelling tool for exploring and understanding the design of a mobile learning environment in the study's setting. The Academic Tool Kit Framework (ATKF) was also employed to design a constructivist learning environment, effective in assisting universities to help lecturers implement learning through mobile devices. Results indicate a positive perspective among students on the use of mobile devices for formal and informal learning, based on the context-aware learning environment developed through the use of activity theory and the ATKF.

Keywords: collaborative learning, cooperative learning, context-aware learning environment, mobile learning, pedagogy

Procedia PDF Downloads 555
12386 Factors of Influence in Software Process Improvement: An ISO/IEC 29110 for Very-Small Entities

Authors: N. Wongsai, R. Wetprasit, V. Siddoo

Abstract:

The recently introduced ISO/IEC 29110 standard, Lifecycle Profiles for Very Small Entities (VSEs), has been adopted and practiced in many small and medium software companies, including in Thailand's software industry. Many Thai companies complete their software process improvement (SPI) initiative and have been certified; a number of participants, however, fail to succeed. This study was concerned with the factors that influence the accomplishment of the standard's implementation across various VSE characteristics. To achieve this goal, critical factors were explored and extracted from prior studies, and the obtained factors were then validated by experts on the standard. Data analysis of comments and recommendations was performed using a qualitative content analysis method. This paper presents an initial set of factors that positively or negatively influence ISO/IEC 29110 implementation, with the aim of helping SPI practitioners manage an appropriate adoption approach in order to achieve implementation.

Keywords: barriers, critical success factors, ISO/IEC 29110, software process improvement (SPI), very small entity (VSE)

Procedia PDF Downloads 313
12385 Aliasing Free and Additive Error in Spectra for Alpha Stable Signals

Authors: R. Sabre

Abstract:

This work focuses on the continuous-time symmetric alpha-stable process, frequently used to model signals with indefinitely growing variance and often observed with an unknown additive error. The objective of this paper is to estimate this error from discrete observations of the signal. To that end, we propose a method based on smoothing the observations via the Jackson polynomial kernel, taking into account the width of the interval where the spectral density is non-zero. This technique avoids the "aliasing phenomenon" encountered when estimation is made from discrete observations of a continuous-time process. We have studied the convergence rate of the estimator and have shown that the rate improves when the spectral density is zero at the origin. Thus, we construct an estimator of the additive error that can be subtracted to approach the original signal without error.
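The smoothing step can be illustrated schematically. The sketch below uses simple triangular (Fejér-type) weights as a stand-in assumption; the paper's actual estimator uses the Jackson polynomial kernel, whose specific weights differ, and the boundary handling here is likewise an illustrative choice.

```python
# Schematic kernel smoothing of discrete observations with a normalized
# polynomial-type weight sequence. The Fejér-style triangular weights are a
# simplified stand-in for the Jackson polynomial kernel used in the paper.

def kernel_weights(m):
    """Triangular weights on lags -m..m, normalized to sum to 1."""
    raw = [m + 1 - abs(j) for j in range(-m, m + 1)]
    total = sum(raw)
    return [w / total for w in raw]

def smooth(obs, m):
    """Kernel-smoothed observations, re-normalizing near the boundaries."""
    w = kernel_weights(m)
    out = []
    n = len(obs)
    for i in range(n):
        num = den = 0.0
        for j in range(-m, m + 1):
            if 0 <= i + j < n:
                num += w[j + m] * obs[i + j]
                den += w[j + m]
        out.append(num / den)
    return out

smoothed = smooth([1.0, 1.0, 1.0, 1.0, 1.0], m=2)
```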

Keywords: spectral density, stable processes, aliasing, nonparametric

Procedia PDF Downloads 126
12384 [Keynote Talk]: From Clinical Practice to Academic Setup, 'Quality Circles' for Quality Outputs in Both

Authors: Vandita Mishra

Abstract:

From the management of patients, reception, records, and assistants in clinical practice, to the management of ongoing research, clinical cases and department profile in an academic setup, the healthcare provider has to deal with it all. Success lies in the smooth running of the show in both situations, with apt solutions to the problems encountered and smooth management of the crises faced. This paper therefore amalgamates dental science with health administration by introducing a concept for practice management and problem-solving called "Quality Circles". This concept uses problem-solving tools contributed by experts from different fields. QC tools can be applied in both clinical and academic settings in dentistry for better productivity and for scientifically approaching the process of continuous improvement in both categories. When approached through QC, our organization showed better patient outcomes and greater patient satisfaction. Introduced in 1962 by Kaoru Ishikawa, this tool has been extensively applied in several fields outside dentistry and healthcare. Through examples from clinical cases and virtual scenarios, the tools of Quality Circles are elaborated and discussed.

Keywords: academics, dentistry, healthcare, quality

Procedia PDF Downloads 98
12383 Research on Design Methods for Riverside Spaces of Deep-cut Rivers in Mountainous Cities: A Case Study of Qingshuixi River in Chongqing City

Authors: Luojie Tang

Abstract:

Riverside space is an important public space and ecological corridor in urban areas, but rivers in mountainous cities are often overlooked due to their deep valleys and poor accessibility. This article takes the Qingshuixi River in Chongqing as an example, and through long-term field inspections, measurements, interviews, and online surveys, summarizes the problems of its riverside space: poor accessibility, limited space for renovation, lack of waterfront facilities, excessive artificial intervention, low average runoff, severe river water pollution, and difficulty in integrated watershed management. Based on the current situation and drawing on relevant experience, the article summarizes design methods for the riverside space of deep-cut rivers in mountainous urban areas. Regarding spatial design techniques, the article emphasizes the importance of integrating waterfront spaces into the urban public space system and of vertical linkages. Furthermore, it suggests different design methods and improvement strategies for already developed areas and for new development areas: a planning and design strategy of "protection" and "empowerment" for new development areas, and an updating and transformation strategy of "improvement" and "revitalization" for already developed areas. In terms of ecological restoration, the article suggests three focus points: increasing the runoff of urban rivers, raising the landscape water level during dry seasons, and restoring vegetation and wetlands in the riverbank buffer zone while protecting the overall pattern of the watershed. Additionally, the article presents specific design details of the Qingshuixi River to illustrate the proposed design and restoration techniques.

Keywords: deep-cut river, design method, mountainous city, Qingshuixi river in Chongqing, waterfront space design

Procedia PDF Downloads 102
12382 An Experimental Study of Low Concentration CO₂ Capture from Regenerative Thermal Oxidation Tail Gas in Rotating Packed Bed

Authors: Dang HuynhMinhTam, Kuang-Cong Lu, Yi-Hung Chen, Zhung-Yu Lin, Cheng-Siang Cheng

Abstract:

Carbon capture, utilization, and storage (CCUS) technology has become a predominant technique for mitigating carbon dioxide and achieving net-zero emissions goals. This research aims to continuously capture the low-concentration CO₂ in the tail gas of a regenerative thermal oxidizer (RTO) in the high-technology industry. A rotating packed bed (RPB) reactor is investigated for CO₂ capture efficiency using a mixture of NaOH/Na₂CO₃ solutions to simulate the real absorbent solution. On a lab scale, semi-batch experiments with continuous gas flow and a circulating absorbent solution are conducted to find the optimal parameters, which are then examined in continuous operation. In the semi-batch tests, the carbon capture efficiency and pH variation are studied under the conditions of a low CO₂ concentration (about 1.13 vol%), a NaOH concentration of 1 wt% or 2 wt% mixed with 14 wt% Na₂CO₃, rotating speeds of 600, 900, and 1200 rpm, gas-liquid ratios of 100, 200, and 400, and an absorbent solution temperature of 40 °C. The CO₂ capture efficiency increases significantly with higher rotating speed and smaller gas-liquid ratio, while the difference between NaOH concentrations of 1 wt% and 2 wt% is relatively small. The maximum capture efficiency is close to 80% under the conditions of a NaOH concentration of 1 wt%, a G/L ratio of 100, and a rotating speed of 1200 rpm within the first 5 minutes. Furthermore, continuous operation under similar conditions also demonstrates a steady carbon capture efficiency of around 80%.
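Capture efficiency for a continuous absorber is conventionally computed from the inlet and outlet CO₂ fractions. The sketch below shows that definition; the outlet value used in the example is an assumption chosen to reproduce the roughly 80% figure, not a measurement from the paper.

```python
# CO2 capture efficiency from inlet and outlet volume fractions, the usual
# definition for a continuous absorber: fraction of incoming CO2 removed.
# The outlet concentration below is an illustrative assumption.

def capture_efficiency(c_in, c_out):
    """(c_in - c_out) / c_in, for positive inlet concentration."""
    if c_in <= 0:
        raise ValueError("inlet concentration must be positive")
    return (c_in - c_out) / c_in

# Example: 1.13 vol% in, 0.23 vol% out -> roughly 80% capture
eff = capture_efficiency(0.0113, 0.0023)
```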

Keywords: carbon dioxide capture, regenerative thermal oxidizer, rotating packed bed, sodium hydroxide

Procedia PDF Downloads 54
12381 A Conceptual Framework for Integrating Musical Instrument Digital Interface Composition in the Music Classroom

Authors: Aditi Kashi

Abstract:

While educational technologies have taken great strides, especially in Musical Instrument Digital Interface (MIDI) composition, teachers across the world are still learning to incorporate such technology into their curricula. While using MIDI in the classroom has become more common, limited class time and a strong focus on performance have made composition a lesser priority. The balance between music theory, performance time, and composition learning is delicate and difficult to maintain for many music educators, which makes including MIDI in the classroom challenging. To address this issue, this paper outlines a general conceptual framework, centered around a key element of music theory, for integrating MIDI composition into the music classroom, not only to introduce students to digital composition but also to enhance their understanding of music theory and its applicability.

Keywords: educational framework, education technology, MIDI, music education

Procedia PDF Downloads 84
12380 The Need for a Consistent Regulatory Framework for CRISPR Gene-Editing in the European Union

Authors: Andrew Thayer, Courtney Rondeau, Paraskevi Papadopoulou

Abstract:

The Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR) gene-editing technologies have generated considerable discussion about the applications and ethics of their use. However, no consistent guidelines for using CRISPR technologies have been developed, nor has common legislation related to gene editing been passed, especially as it connects to genetically modified organisms (GMOs), in the European Union. The recent announcement that the first babies with CRISPR-edited genes were born, along with new studies exploring CRISPR's applications in treating thalassemia, sickle-cell anemia, cancer, and certain forms of blindness, has demonstrated that the technology is developing faster than the policies needed to control it. A reasonable and coherent regulatory framework for the use of CRISPR in human somatic and germline cells is therefore necessary to ensure the ethical use of the technology in the coming years. The European Union is a unique region of interconnected countries without a standard set of regulations or legislation for CRISPR gene editing. We posit that the EU would serve as a suitable model for comparing the legislation of its member countries in order to understand the practicality and effectiveness of adopting majority-approved practices. Additionally, we present a proposed set of guidelines that could serve as a basis for developing a consistent regulatory framework for EU countries to implement, as well as a good example for other countries to follow. Finally, an additional, multidimensional framework of smart solutions is proposed, with which all stakeholders are engaged to become better-informed citizens.

Keywords: CRISPR, ethics, regulatory framework, European legislation

Procedia PDF Downloads 131
12379 The Use of Ontology Framework for Automation Digital Forensics Investigation

Authors: Ahmad Luthfi

Abstract:

One of the main goals of a computer forensic analyst is to determine the cause and effect surrounding the acquisition of digital evidence in order to obtain relevant information on the case being handled. To achieve fast and accurate results, this paper discusses an approach known as an ontology framework. This model uses a structured hierarchy of layers that creates connectivity between variants and investigative search activities so that computer forensic analysis can be carried out automatically. Two main layers are used, namely analysis tools and the operating system. Using the concept of ontology, the second layer is automatically designed to help the investigator perform the acquisition of digital evidence. The automation methodology of this research utilizes forward chaining, whereby the system searches through investigative steps that are automatically structured in accordance with the rules of the ontology.
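Forward chaining itself can be sketched in a few lines: rules fire whenever all of their premises are already known facts, adding their conclusions until nothing new can be derived. The facts and rules below are illustrative assumptions, not the paper's ontology.

```python
# Minimal forward-chaining engine: repeatedly fire any rule whose premises
# are all known facts, adding its conclusion, until a fixed point is reached.
# Fact and rule names are illustrative forensic-workflow assumptions.

def forward_chain(facts, rules):
    """rules: list of (premises, conclusion) pairs, premises given as a set."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= known and conclusion not in known:
                known.add(conclusion)   # fire the rule
                changed = True
    return known

rules = [
    ({"disk_image_acquired"}, "filesystem_parsed"),
    ({"filesystem_parsed", "os_identified"}, "artifacts_extracted"),
]
derived = forward_chain({"disk_image_acquired", "os_identified"}, rules)
```

In the ontology framework described above, the rule base would be generated from the ontology's layer structure rather than written by hand.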

Keywords: ontology, framework, automation, forensics

Procedia PDF Downloads 335
12378 Post-Earthquake Damage Detection Using System Identification with a Pair of Seismic Recordings

Authors: Lotfi O. Gargab, Ruichong R. Zhang

Abstract:

A wave-based framework is presented for modeling seismic motion in multistory buildings and using the measured response for system identification, which can be utilized to extract important information regarding structural integrity. With one pair of building responses at two locations, a generalized model response is formulated based on wave propagation features and expressed as frequency and time response functions denoted, respectively, GFRF and GIRF. In particular, the GIRF is fundamental in tracking arrival times of impulsive wave motion initiated at the response level, which depends on local model properties. Matching model and measured structural responses can help in identifying model parameters and inferring building properties. To show the effectiveness of this approach, the Millikan Library in Pasadena, California is identified with recordings of the Yorba Linda earthquake of September 3, 2002.
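The arrival-time idea behind the GIRF can be illustrated with a toy example: given responses recorded at two floors, the lag that maximizes their cross-correlation estimates the wave travel time between the sensors. The synthetic signals below are assumptions for illustration; the paper works with real Millikan Library recordings and a full impulse response function, not this simplified lag search.

```python
# Estimate the travel time of an impulsive wave between two sensors as the
# lag maximizing the cross-correlation of their recordings. The signals are
# synthetic: y is x delayed by 3 samples (wave arriving later upstairs).

def best_lag(x, y, max_lag):
    """Lag (in samples) at which y best matches x shifted forward."""
    def xcorr(lag):
        return sum(x[i] * y[i + lag] for i in range(len(x) - lag))
    return max(range(max_lag + 1), key=xcorr)

x = [0.0] * 20
x[5], x[6] = 1.0, 0.5          # impulse recorded at the lower sensor
y = [0.0] * 20
y[8], y[9] = 1.0, 0.5          # same impulse, 3 samples later, upper sensor
lag = best_lag(x, y, max_lag=10)
```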

Keywords: system identification, continuous-discrete mass modeling, damage detection, post-earthquake

Procedia PDF Downloads 367
12377 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework

Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi

Abstract:

There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task; therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework; hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to key-frames. The OCR output and detected slide text line types are used for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
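The keyword extraction step maps naturally onto the Hadoop programming model: a mapper emits (keyword, 1) pairs from OCR'd slide text, and a reducer sums the counts per keyword. The sketch below runs the two phases in-process for illustration; in the real pipeline they would run as separate tasks on the cluster (e.g., via Hadoop Streaming), and the stopword list is an assumption.

```python
# Hadoop-streaming-style keyword counting over OCR'd slide text: the mapper
# tokenizes and emits (keyword, 1), the reducer sums per keyword. Run here
# in a single process purely for illustration.
from collections import defaultdict

STOPWORDS = {"the", "a", "of", "and", "in"}   # illustrative stopword list

def mapper(ocr_line):
    for word in ocr_line.lower().split():
        token = word.strip(".,:;()")
        if token and token not in STOPWORDS:
            yield token, 1

def reducer(pairs):
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

lines = ["Introduction to Hadoop", "the Hadoop framework"]
pairs = [kv for line in lines for kv in mapper(line)]
counts = reducer(pairs)
```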

Keywords: video lectures, big video data, video retrieval, hadoop

Procedia PDF Downloads 527
12376 Transfigurative Changes of Governmental Responsibility

Authors: Ákos Cserny

Abstract:

The area of operation of the executive power can unequivocally increase either with the appearance of new areas to be influenced and their integration into the power, or at the expense of the scope of other organs with public authority. The extension of the executive can only be accepted within the framework of the rule of law if, in parallel with this process, constitutional guarantees keep the exercise of power within a constitutional framework. Failure to do so may result in a deficit of democracy and democratic sense, and may cause an overwhelming dominance of the executive power. Therefore, the aim of this paper is to present executive power and responsibility in the context of different dimensions.

Keywords: confidence, constitution, executive power, liability, parliamentarism

Procedia PDF Downloads 397
12375 Obesity and Cancer: Current Scientific Evidence and Policy Implications

Authors: Martin Wiseman, Rachel Thompson, Panagiota Mitrou, Kate Allen

Abstract:

Since 1997, World Cancer Research Fund (WCRF) International and the American Institute for Cancer Research (AICR) have been at the forefront of synthesising and interpreting the accumulated scientific literature on the link between diet, nutrition, physical activity and cancer, and of deriving evidence-based Cancer Prevention Recommendations. The 2007 WCRF/AICR Second Expert Report was a landmark in the analysis of evidence linking diet, body weight and physical activity to cancer and led to the establishment of the Continuous Update Project (CUP). In 2018, as part of the CUP, WCRF/AICR will publish a new synthesis of the current evidence and update the Cancer Prevention Recommendations. This will ensure that everyone - from policymakers and health professionals to members of the public - has access to the most up-to-date information on how to reduce the risk of developing cancer. Overweight and obesity play a significant role in cancer risk, and rates of both are increasing in many parts of the world. This session will give an overview of new evidence relating obesity to cancer since the 2007 Report: for example, the number of cancers for which obesity is judged to be a contributory cause has increased from seven to eleven. The session will also shed light on the well-established mechanisms underpinning the links between obesity and cancer, and will provide an overview of the diet- and physical-activity-related factors that promote positive energy imbalance, leading to overweight and obesity. Finally, the session will highlight how policy can be used to address overweight and obesity at a population level, using WCRF International's NOURISHING Framework. NOURISHING formalises a comprehensive package of policies to promote healthy diets and reduce obesity and non-communicable diseases; it is a tool for policymakers to identify where action is needed and to assess whether an approach is sufficiently comprehensive. The framework brings together ten policy areas across three domains: food environment, food system, and behaviour change communication. It is accompanied by a regularly updated database providing an extensive overview of implemented government policy actions from around the world. In conclusion, the session will provide an overview of obesity and cancer, highlighting the links seen in the epidemiology and exploring the mechanisms underpinning these, as well as the influences that help determine overweight and obesity, and will illustrate policy approaches that can be taken to reduce overweight and obesity worldwide.

Keywords: overweight, obesity, nutrition, cancer, mechanisms, policy

Procedia PDF Downloads 154
12374 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions

Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen

Abstract:

Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) was previously developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and lognormal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case with real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or lognormal distributions.
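The special-case structure the abstract relies on can be checked directly from the generalized gamma (Stacy) density: setting the second shape parameter to 1 recovers the gamma density, and setting the two shape parameters equal recovers the Weibull density. The (scale a, shape d, shape p) parameterization below is the common Stacy convention and is an assumption about the paper's notation.

```python
# Generalized gamma (Stacy) density f(x; a, d, p) = (p / a^d) x^(d-1)
# exp(-(x/a)^p) / Gamma(d/p), with its gamma special case for comparison.
import math

def gg_pdf(x, a, d, p):
    """Generalized gamma density for x > 0 (scale a, shapes d and p)."""
    return (p / a**d) * x**(d - 1) * math.exp(-(x / a)**p) / math.gamma(d / p)

def gamma_pdf(x, shape, scale):
    """Ordinary gamma density, for checking the p = 1 special case."""
    return x**(shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale**shape)

# p = 1 collapses GG to gamma; d = p collapses GG to Weibull
x, a, d, p = 1.7, 2.0, 3.0, 1.4
```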

Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma

Procedia PDF Downloads 170
12373 A Copula-Based Approach for the Assessment of Severity of Illness and Probability of Mortality: An Exploratory Study Applied to Intensive Care Patients

Authors: Ainura Tursunalieva, Irene Hudson

Abstract:

Continuous improvement of both the quality and safety of health care is an important goal in Australia and internationally. The intensive care unit (ICU) receives patients with a wide variety and severity of illnesses. Accurately identifying patients at risk of developing complications or dying is crucial to increasing healthcare efficiency. Thus, it is essential for clinicians and researchers to have a robust framework capable of evaluating the risk profile of a patient, and ICU scoring systems provide such a framework. The Acute Physiology and Chronic Health Evaluation III (APACHE III) and the Simplified Acute Physiology Score II (SAPS II) are ICU scoring systems frequently used for assessing the severity of acute illness. These scoring systems collect multiple risk factors for each patient, including physiological measurements, and then render the assessments of the individual risk factors into a single numerical value; a higher score indicates a more severe patient condition. Furthermore, the Mortality Probability Model II (MPM II) uses logistic regression on independent risk factors to predict a patient’s probability of mortality. An important and overlooked limitation of SAPS II and MPM II is that, to date, they do not include interaction terms between a patient’s vital signs. This is a prominent oversight, as there is likely an interplay among vital signs: the co-existence of certain conditions may pose a greater health risk than when those conditions exist independently. One barrier to including such interaction terms in predictive models is dimensionality, as variable selection becomes difficult. We propose an innovative scoring system which takes into account the dependence structure among a patient’s vital signs, such as systolic and diastolic blood pressures, heart rate, pulse interval, and peripheral oxygen saturation. Copulas will capture the dependence among both normally distributed and skewed variables, as some of the vital sign distributions are skewed.
The estimated dependence parameter will then be incorporated into the traditional scoring systems to adjust the points allocated for the individual vital sign measurements. The same dependence parameter will also be used to create an alternative copula-based model for predicting a patient’s probability of mortality. The new copula-based approach will accommodate not only a patient’s trajectories of vital signs but also the joint dependence probabilities among the vital signs. We hypothesise that this approach will produce more stable assessments and lead to more time efficient and accurate predictions. We will use two data sets: (1) 250 ICU patients admitted once to the Chui Regional Hospital (Kyrgyzstan) and (2) 37 ICU patients’ agitation-sedation profiles collected by the Hunter Medical Research Institute (Australia). Both the traditional scoring approach and our copula-based approach will be evaluated using the Brier score to indicate overall model performance, the concordance (or c) statistic to indicate the discriminative ability (or area under the receiver operating characteristic (ROC) curve), and goodness-of-fit statistics for calibration. We will also report discrimination and calibration values and establish visualization of the copulas and high dimensional regions of risk interrelating two or three vital signs in so-called higher dimensional ROCs.
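The core idea, estimating a copula dependence parameter that is unaffected by skewed margins, can be sketched with rank-based pseudo-observations. The example below is not the authors' method or data; it uses synthetic vital-sign-like variables (one roughly normal, one skewed) and estimates a Gaussian-copula parameter from normal scores of the ranks.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 400
# Correlated latent normals mapped to two margins mimicking vital
# signs: a roughly normal one (systolic BP) and a skewed one.
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
sbp = 120 + 15 * z[:, 0]
hr = 60 + stats.expon.ppf(stats.norm.cdf(z[:, 1]), scale=20)  # skewed

def gaussian_copula_rho(x, y):
    """Estimate the Gaussian-copula dependence parameter from ranks,
    so skewed margins do not distort the estimate."""
    u = stats.rankdata(x) / (len(x) + 1)   # pseudo-observations in (0, 1)
    v = stats.rankdata(y) / (len(y) + 1)
    return np.corrcoef(stats.norm.ppf(u), stats.norm.ppf(v))[0, 1]

rho = gaussian_copula_rho(sbp, hr)
print(f"copula dependence parameter: {rho:.2f}")
```

Because the estimate is computed on ranks, the skewed heart-rate margin does not bias it, which is precisely the property the abstract relies on when combining skewed and normally distributed vital signs.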

Keywords: copula, intensive care unit scoring system, ROC curves, vital sign dependence

Procedia PDF Downloads 148
12372 Integrated Navigation System Using Simplified Kalman Filter Algorithm

Authors: Othman Maklouf, Abdunnaser Tresh

Abstract:

GPS and inertial navigation systems (INS) have complementary qualities that make them ideal for sensor fusion. The limitations of GPS include occasional high noise content, outages when satellite signals are blocked, interference, and low bandwidth; its strengths include long-term stability and the capacity to function as a stand-alone navigation system. In contrast, an INS is not subject to interference or outages and has high bandwidth and good short-term noise characteristics, but it suffers from long-term drift errors and requires external information for initialization. A combined system of GPS and INS subsystems can exhibit the robustness, higher bandwidth, and better noise characteristics of the inertial system together with the long-term stability of GPS. The most common estimation algorithm used in integrated INS/GPS is the Kalman filter (KF), which is able to take advantage of these characteristics to provide a common integrated navigation implementation with performance superior to that of either subsystem alone. This paper presents a simplified KF algorithm for a land vehicle navigation application. In this integration scheme, the GPS-derived positions and velocities are used as the update measurements for the INS-derived position, velocity, and attitude (PVA). The KF error state vector in this case includes the navigation parameters as well as the accelerometer and gyroscope error states.
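The loosely coupled scheme described above, where INS propagates the state and GPS supplies the measurement updates, can be sketched in one dimension. This is a minimal illustration, not the paper's filter: the state holds position, velocity, and an accelerometer bias (a stand-in for the full accelerometer/gyroscope error states), and all noise values are assumed.

```python
import numpy as np

dt = 0.1
# State x = [position, velocity, accelerometer bias].
F = np.array([[1, dt, -0.5 * dt**2],
              [0, 1, -dt],
              [0, 0, 1]])           # bias corrupts the INS acceleration
B = np.array([0.5 * dt**2, dt, 0])  # input: measured (biased) acceleration
H = np.array([[1, 0, 0],
              [0, 1, 0]])           # GPS observes position and velocity
Q = np.diag([1e-4, 1e-3, 1e-6])     # process noise (assumed values)
R = np.diag([4.0, 0.25])            # GPS position/velocity noise variances

rng = np.random.default_rng(1)
x = np.zeros(3)
P = np.eye(3)
true_pos = true_vel = 0.0
bias = 0.3                           # true accelerometer bias (m/s^2)
for _ in range(200):
    accel = 0.5                      # true vehicle acceleration
    true_pos += true_vel * dt + 0.5 * accel * dt**2
    true_vel += accel * dt
    ins_accel = accel + bias         # biased IMU reading
    # Predict: INS mechanization propagates the state.
    x = F @ x + B * ins_accel
    P = F @ P @ F.T + Q
    # Update: GPS position/velocity fix corrects the INS drift.
    z = np.array([true_pos, true_vel]) + rng.normal(0, [2.0, 0.5])
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(3) - K @ H) @ P

print(f"estimated accelerometer bias: {x[2]:.2f}")
```

The update step pulls the INS solution back toward the GPS fix while the bias state absorbs the long-term drift, which is the complementary behaviour the abstract describes.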

Keywords: GPS, INS, Kalman filter, inertial navigation system

Procedia PDF Downloads 466
12371 Improved Blood Glucose-Insulin Monitoring with Dual-Layer Predictive Control Design

Authors: Vahid Nademi

Abstract:

With wearable medical devices equipped with a continuous glucose monitor (CGM) and an insulin pump now widely used, advanced control methods are still needed to get the full benefit of these devices. Unlike costly clinical trials, implementing effective insulin-glucose control strategies can make a significant contribution to patients suffering from chronic diseases such as diabetes. This study examines a two-layer insulin-glucose regulator based on a model predictive control (MPC) scheme, in which the patient’s predicted glucose profile is kept in compliance with the insulin level injected automatically through the insulin pump. This is achieved by an iterative optimization algorithm, the integrated perturbation analysis and sequential quadratic programming (IPA-SQP) solver, which handles uncertainties due to unexpected variations in glucose-insulin values and in the body’s characteristics. The feasibility of the proposed control approach is also evaluated by means of numerical simulations of two case scenarios using measured data. The results verify the superior and reliable performance of the proposed control scheme with no negative impact on patient safety.
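The receding-horizon logic behind MPC, predict the glucose profile over a horizon, optimize the insulin sequence, apply only the first dose, and repeat, can be sketched with a deliberately simple one-state model. This is not the paper's IPA-SQP formulation; the model and every coefficient below are assumed for illustration, and a generic SciPy solver stands in for the specialized one.

```python
import numpy as np
from scipy.optimize import minimize

# Toy linear glucose model: glucose g decays, drifts upward with
# carbohydrate intake d, and is lowered by insulin u (assumed values).
a, b, d = 0.95, 2.0, 8.0         # decay, insulin sensitivity, drift
target, horizon, lam = 100.0, 10, 0.05

def predict(g0, u_seq):
    """Roll the model forward over the horizon for a dose sequence."""
    g, traj = g0, []
    for u in u_seq:
        g = a * g - b * u + d
        traj.append(g)
    return np.array(traj)

def mpc_step(g0):
    """Solve the finite-horizon problem; return only the first dose."""
    cost = lambda u: (np.sum((predict(g0, u) - target) ** 2)
                      + lam * np.sum(u ** 2))
    res = minimize(cost, x0=np.ones(horizon),
                   bounds=[(0.0, 10.0)] * horizon)  # pump rate limits
    return res.x[0]

g = 180.0                         # hyperglycemic starting point
for _ in range(40):               # receding-horizon loop
    u = mpc_step(g)
    g = a * g - b * u + d         # "plant" step with the applied dose
print(f"glucose after 40 steps: {g:.1f} mg/dL")
```

The pump-rate bounds show where the constraint handling of a real MPC layer would enter; the regularization weight `lam` trades tracking accuracy against total insulin delivered.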

Keywords: blood glucose monitoring, insulin pump, predictive control, optimization

Procedia PDF Downloads 133
12370 Data Collection in Protected Agriculture for Subsequent Big Data Analysis: Methodological Evaluation in Venezuela

Authors: Maria Antonieta Erna Castillo Holly

Abstract:

During the last decade, data analysis, strategic decision making, and the use of artificial intelligence (AI) tools in Latin American agriculture have been a challenge. In some countries, the availability, quality, and reliability of historical data, in addition to the current methodology for recording data in the field, make it difficult to use information systems, to carry out complete data analyses, and to support sound strategic decisions. This is essential in Agriculture 4.0, where the increase in global demand for fresh agricultural products of tropical origin throughout all seasons of the year requires a change in the production model and greater agility in responding to consumer market demands for quality, quantity, traceability, and sustainability, all of which require extensive data. Having quality information available and updated in real time on what, how much, how, when, where, and at what cost production takes place, and on compliance with production quality standards, represents the greatest challenge for sustainable and profitable agriculture in the region. The objective of this work is to present a methodological proposal for the collection of georeferenced data from the protected agriculture sector, specifically in production units (UP) with tall structures (greenhouses), initially for Venezuela, taking the state of Mérida as the geographical framework and horticultural products as the target crops. The document presents some background information and explains the methodology and tools used in the three phases of the work: diagnosis, data collection, and analysis. As a result, an evaluation of the process is carried out, relevant data and dashboards are displayed, and the first satellite maps integrated with layers of information in a geographic information system are presented.
Finally, some proposals for improvement and tentatively recommended applications are added to the process, with the aim of providing better qualified and traceable georeferenced data for subsequent analysis and for more agile and accurate strategic decision making. A central point of this study is the lack of quality data treatment in Latin America, and especially in the Caribbean basin, with one of the most important issues being how to manage the lack of complete official data. The methodology has been tested with horticultural products, but it can be extended to other tropical crops.

Keywords: greenhouses, protected agriculture, data analysis, geographic information systems, Venezuela

Procedia PDF Downloads 126
12369 Application of Generalized Taguchi and Design of Experiment Methodology for Rebar Production at an Integrated Steel Plant

Authors: S. B. V. S. P. Sastry, V. V. S. Kesava Rao

Abstract:

In this paper, the impact of the Taguchi method and design of experiment philosophy is studied to project the relationship between various factors and the output yield strength of rebar. In the bar mill of an integrated steel plant, there are two production lines, referred to as line 1 and line 2. The metallic properties, e.g., the yield strength of the finished product, vary for a particular grade when the same material is rolled simultaneously in both lines. A study has been carried out to set the process parameters at their optimal levels so as to obtain equal values of yield strength simultaneously in both lines.
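A Taguchi analysis of this kind typically evaluates factor levels through signal-to-noise (S/N) ratios computed over an orthogonal array. The sketch below uses hypothetical data, not the plant's trials: an L4(2^3) array over three unnamed process factors, with yield strength as a larger-the-better response.

```python
import numpy as np

# L4 orthogonal array: 4 runs x 3 two-level factors (levels 0/1).
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
# Two replicate yield-strength measurements (MPa) per run
# (hypothetical values for illustration).
strength = np.array([[512, 518],
                     [541, 537],
                     [529, 525],
                     [551, 547]])

def sn_larger_better(y):
    """Taguchi S/N ratio for a larger-the-better response."""
    return -10 * np.log10(np.mean(1.0 / y.astype(float) ** 2, axis=1))

sn = sn_larger_better(strength)
# Mean S/N per factor level identifies the preferred setting, and the
# level-to-level delta ranks the factors by influence.
for factor in range(L4.shape[1]):
    effects = [sn[L4[:, factor] == lvl].mean() for lvl in (0, 1)]
    best = int(np.argmax(effects))
    delta = abs(effects[1] - effects[0])
    print(f"factor {factor}: best level {best}, delta {delta:.2f} dB")
```

In a two-line mill, running the same array on both lines and comparing the per-factor deltas is one way to locate the settings responsible for the yield-strength mismatch the abstract describes.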

Keywords: bar mill, design of experiment, taguchi, yield strength

Procedia PDF Downloads 238
12368 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model 1: Description

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component; as such, it assumes constant values for the vegetation and hydraulic parameters throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model capable of simulating all crops using unique parameter values for each crop. The simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model was employed for evaluating vegetative resistance through the hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing the hydrologic model’s capability for accurate hydrologic studies.
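The EPIC-style growth step referenced above is driven by heat-unit accumulation and by interception of photosynthetically active radiation (PAR). The sketch below is a heavily simplified stand-in, not the WRM subroutine itself: the LAI curve, base temperature, and radiation-use efficiency are all assumed values chosen for illustration.

```python
import numpy as np

t_base, phu = 8.0, 1500.0       # base temperature (C), potential heat units
k_ext, rue = 0.65, 3.0          # light extinction coeff., radiation-use eff.

def daily_growth(hu, lai, biomass, tmean, solar_rad):
    """One daily EPIC-style growth step (assumed simplified forms)."""
    hu += max(tmean - t_base, 0.0)        # heat-unit accumulation
    hui = min(hu / phu, 1.0)              # heat-unit index in [0, 1]
    lai = 20.0 * hui * (1.0 - hui)        # assumed parabolic LAI curve
    par = 0.5 * solar_rad                 # PAR as half of solar radiation
    ipar = par * (1.0 - np.exp(-k_ext * lai))  # Beer's-law interception
    biomass += rue * ipar                 # Monteith-type biomass increment
    return hu, lai, biomass

hu = lai = biomass = 0.0
for day in range(120):                    # one growing season
    hu, lai, biomass = daily_growth(hu, lai, biomass,
                                    tmean=22.0, solar_rad=20.0)
print(f"season-end biomass: {biomass:.0f} (arbitrary units)")
```

The seasonal rise and fall of LAI produced by such a step is what would let the WRM model vary its vegetative roughness coefficient through the year instead of holding it constant.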

Keywords: runoff, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 371
12367 Validation of Visibility Data from Road Weather Information Systems by Comparing Three Data Resources: Case Study in Ohio

Authors: Fan Ye

Abstract:

Adverse weather conditions, particularly those with low visibility, are critical to driving tasks. However, the direct relationship between visibility distance and traffic flow or roadway safety is uncertain due to the limited availability of visibility data. The recent growth in deployment of Road Weather Information Systems (RWIS) makes segment-specific visibility information available, which can be integrated with other Intelligent Transportation Systems, such as automated warning systems and variable speed limits, to improve mobility and safety. Before applying RWIS visibility measurements in traffic studies and operations, it is critical to validate the data. Therefore, an attempt was made in this paper to examine the validity and viability of RWIS visibility data by comparing visibility measurements among RWIS, airport weather stations, and the weather information recorded by police in crash reports, based on Ohio data. The results indicated that RWIS visibility measurements were significantly different from airport visibility data in Ohio, but no conclusion regarding the reliability of RWIS visibility could be drawn given the absence of verified ground truth in the comparisons. It is suggested that more objective methods are needed to validate RWIS visibility measurements, such as continuous in-field measurements of various weather events using calibrated visibility sensors.
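A comparison of paired visibility readings from two sources can be made with a nonparametric paired test. The sketch below uses entirely synthetic values, not the Ohio data, and a Wilcoxon signed-rank test is only one plausible choice; the abstract does not state which test was used.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic paired observations for the same timestamps: airport
# visibility (miles) and an RWIS reading with an assumed systematic
# low bias plus noise.
airport = rng.uniform(0.5, 10.0, size=60)
rwis = airport * rng.normal(0.8, 0.15, size=60)

# Paired nonparametric test of whether the two sources differ.
stat, p = stats.wilcoxon(rwis, airport)
print(f"Wilcoxon signed-rank p-value: {p:.4f}")
```

A small p-value here would indicate a systematic difference between the sources, as the abstract reports for RWIS versus airport stations, but, as the abstract also notes, it cannot say which source is closer to the truth without verified ground-truth visibility.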

Keywords: RWIS, visibility distance, low visibility, adverse weather

Procedia PDF Downloads 243
12366 Facilitators and Barriers of Family Resilience in Cancer Patients Based on the Theoretical Domains Framework: An Integrative Review

Authors: Jiang Yuqi

Abstract:

Aims: The aim is to analyze the facilitators and barriers of family resilience in cancer patients based on the Theoretical Domains Framework, to provide a basis for interventions in the family resilience of cancer patients, and to identify the progress and lessons of existing intervention projects. Methods: NVivo software was used to code the influencing factors using the 14 theoretical domains as primary nodes; secondary nodes were then refined using thematic analysis, specific influencing factors were aggregated, and inter-rater reliability was analyzed. Data sources: PubMed, Embase, CINAHL, Web of Science, Cochrane Library, MEDLINE, CNKI, and Wanfang (search dates: from database inception to November 2023). Results: A total of 35 papers were included, with 142 coding points across 14 theoretical domains and 38 secondary nodes. The three most relevant theoretical domains were social influences (norms), environment and resources, and emotions (mood). The factors with the greatest impact were family support, mood, confidence and beliefs, external support, quality of life, economic circumstances, family adaptation, coping styles with illness, and management. Conclusion: The factors influencing family resilience in cancer patients cover most of the domains in the Theoretical Domains Framework and are cross-cutting, multi-sourced, and complex. Further in-depth exploration of the key factors influencing family resilience is necessary to provide a basis for intervention research.

Keywords: cancer, survivors, family resilience, theoretical domains framework, literature review

Procedia PDF Downloads 43
12365 Assessing Social Vulnerability and Policy Adaption Application Responses Based on Landslide Risk Map

Authors: Z. A. Ahmad, R. C. Omar, I. Z. Baharuddin, R. Roslan

Abstract:

Assessments of social vulnerability, carried out holistically, can provide an important guide to the planning process and to decisions on resource allocation at various levels, and can help to raise public awareness of geo-hazard risks. Such assessments can help answer basic questions about human vulnerability in geo-hazard-prone or disaster areas, where hazards cause health damage, economic loss, and loss of natural heritage, and about the vulnerability impact of extreme natural hazard events. To address these issues, an integrated framework for assessing increasing human vulnerability to environmental changes caused by geo-hazards will be introduced, using indicators from a landslide risk map linked to an agent-based modeling platform. The indicators represent the underlying factors which influence a community’s ability to deal with and recover from the damage associated with geo-hazards. The scope of this paper is limited to landslides.

Keywords: social, vulnerability, geo-hazard, methodology, indicators

Procedia PDF Downloads 281
12364 Nano Fat Injection for Scar Treatment and Skin Rejuvenation

Authors: Sokol Isaraj, Lorela Bendo

Abstract:

Scars resulting from surgery, injury, or burns have a physical and psychological impact on the affected patient. Although a number of treatments are available, nano fat grafting is an effective treatment for scars. Nano fat is a liquid suspension rich in stem cells obtained by mechanical emulsification. Nano fat grafting was performed in 10 cases to correct rhytides, surgical scars, and post-burn scars between January 2022 and April 2022. Fat was aspirated from the lower abdomen or trochanteric region. After an emulsification and filtration protocol, the resulting nano fat liquid was injected intradermally and subdermally. All patients filled out a questionnaire at three months post-treatment, consisting of questions on the grade of skin improvement and on whether they would recommend the procedure. The clinical results were apparent between 2 and 3 weeks after treatment. All patients confirmed an improvement in skin texture and quality, with the most significant improvement seen in pigmentation and pliability. No complications were reported. Nano fat appears to be a safe and effective option for scar treatment and skin rejuvenation.

Keywords: fat grafting, fat transfer, micro fat, nano fat

Procedia PDF Downloads 80