Search results for: regularization parameter search
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3876

3066 Simulation of GAG-Analogue Biomimetics for Intervertebral Disc Repair

Authors: Dafna Knani, Sarit S. Sivan

Abstract:

Aggrecan, one of the main components of the intervertebral disc (IVD), belongs to the family of proteoglycans (PGs) that are composed of glycosaminoglycan (GAG) chains covalently attached to a core protein. Its primary function is to maintain tissue hydration and hence disc height under the high loads imposed by muscle activity and body weight. Significant PG loss is one of the first indications of disc degeneration. A possible solution to recover disc functions is by injecting a synthetic hydrogel into the joint cavity, hence mimicking the role of PGs. One class of hydrogels proposed is GAG analogues, based on sulfate-containing polymers, which are responsible for hydration in disc tissue. In the present work, we used molecular dynamics (MD) to study the effect of the hydrogel crosslinking (type and degree) on the swelling behavior of the suggested GAG-analogue biomimetics by calculation of cohesive energy density (CED), solubility parameter, enthalpy of mixing (ΔEmix) and the interactions between the molecules in the pure form and as a mixture with water. The simulation results showed that hydrophobicity plays an important role in the swelling of the hydrogel, as indicated by the linear correlation observed between solubility parameter values of the copolymers and crosslinker weight ratio (w/w); this correlation was found useful in predicting the amount of PEGDA needed for the desirable hydration behavior of (CS)₄-peptide. Enthalpy of mixing calculations showed that all the GAG analogues, (CS)₄ and (CS)₄-peptide, are water-soluble; radial distribution function analysis revealed that they form interactions with water molecules, which is important for the hydration process. To conclude, our simulation results, beyond supporting the experimental data, can be used as a useful predictive tool in the future development of biomaterials, such as those for disc replacement.
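
The enthalpy-of-mixing estimate mentioned above can be related to the simulated cohesive energy densities through the Scatchard-Hildebrand relation; the short sketch below illustrates that arithmetic. The relation itself and the placeholder CED values are assumptions made here for illustration, not outputs of the study's MD simulations.

```python
# Minimal sketch: enthalpy (energy) of mixing per unit volume from cohesive
# energy densities / solubility parameters (Scatchard-Hildebrand relation).
# The CED values below are illustrative placeholders, not study data.

def solubility_parameter(cohesive_energy_density):
    """delta = sqrt(CED); CED in J/cm^3 gives delta in (J/cm^3)^0.5 = MPa^0.5."""
    return cohesive_energy_density ** 0.5

def delta_e_mix(delta_1, delta_2, phi_1):
    """DeltaE_mix per unit volume = phi1 * phi2 * (delta1 - delta2)^2."""
    phi_2 = 1.0 - phi_1
    return phi_1 * phi_2 * (delta_1 - delta_2) ** 2

if __name__ == "__main__":
    delta_water = solubility_parameter(2294.0)  # ~47.9 MPa^0.5, literature-range value
    delta_gel = solubility_parameter(625.0)     # hypothetical hydrogel CED -> 25 MPa^0.5
    print(f"dE_mix = {delta_e_mix(delta_water, delta_gel, phi_1=0.8):.1f} J/cm^3")
```

A smaller solubility-parameter mismatch gives a smaller, more favourable mixing energy, which is the trend the copolymer/crosslinker correlation described above exploits.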

Keywords: molecular dynamics, proteoglycans, enthalpy of mixing, swelling

Procedia PDF Downloads 56
3065 Efficacy of Celecoxib Adjunct Treatment on Bipolar Disorder: Systematic Review and Meta-Analysis

Authors: Daniela V. Bavaresco, Tamy Colonetti, Antonio Jose Grande, Francesc Colom, Joao Quevedo, Samira S. Valvassori, Maria Ines da Rosa

Abstract:

Objective: We performed a systematic review and meta-analysis to evaluate the potential effect of cyclo-oxygenase (Cox)-2 inhibitor Celecoxib adjunct treatment in Bipolar Disorder (BD) through randomized controlled trials. Method: A search of the electronic databases was performed on MEDLINE, EMBASE, Scopus, Cochrane Central Register of Controlled Trials (CENTRAL), Biomed Central, Web of Science, IBECS, LILACS, PsycINFO (American Psychological Association), Congress Abstracts, and Grey literature (Google Scholar and the British Library) for studies published from January 1990 to February 2018. A search strategy was developed using the terms 'Bipolar disorder' or 'Bipolar mania' or 'Bipolar depression' or 'Bipolar mixed' or 'Bipolar euthymic' and 'Celecoxib' or 'Cyclooxygenase-2 inhibitors' or 'Cox-2 inhibitors' as text words and Medical Subject Headings (i.e., MeSH and EMTREE). The therapeutic effects of adjunctive treatment with Celecoxib were analyzed, and it was possible to carry out a meta-analysis of three studies included in the systematic review. The meta-analysis was performed on the final scores of the Young Mania Rating Scale (YMRS) at the end of the randomized controlled trials (RCT). Results: Three primary studies were included in the systematic review, with a total of 121 patients. The meta-analysis showed a significant effect on the YMRS scores of patients with BD who used Celecoxib adjuvant treatment in comparison to placebo. The weighted mean difference was 5.54 (95% CI = 3.26-7.82; p < 0.001; I² = 0%). Conclusion: The systematic review suggests that adjuvant treatment with Celecoxib improves the response to the main treatments in patients with BD when compared with adjuvant placebo treatment.
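
The pooled weighted mean difference reported above comes from inverse-variance weighting of the individual trial results; the sketch below shows that computation in a generic form. The three (mean difference, standard error) pairs are hypothetical and only stand in for the actual trial data.

```python
# Minimal sketch of a fixed-effect, inverse-variance meta-analysis of mean
# differences (weighted mean difference with a 95% CI). The study-level
# (md, se) pairs below are hypothetical, not the included trials' data.
import math

def fixed_effect_pool(effects):
    """effects: list of (mean_difference, standard_error) per study."""
    weights = [1.0 / se ** 2 for _, se in effects]
    pooled = sum(w * md for (md, _), w in zip(effects, weights)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

studies = [(6.0, 2.1), (4.8, 1.9), (5.9, 2.4)]  # hypothetical YMRS mean differences
wmd, (lo, hi) = fixed_effect_pool(studies)
print(f"WMD = {wmd:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```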

Keywords: bipolar disorder, Cox-2 inhibitors, Celecoxib, systematic review, meta-analysis

Procedia PDF Downloads 474
3064 Evaluation of River Meander Geometry Using Uniform Excess Energy Theory and Effects of Climate Change on River Meandering

Authors: Youssef I. Hafez

Abstract:

Since ancient times, rivers have been a fostering and favored place for people and civilizations to live along their banks. However, due to floods and droughts, especially severe conditions caused by global warming and climate change, river channels are evolving and moving in the lateral direction, changing their plan form either through straightening of curved reaches (meander cut-off) or through increasing meander curvature. The lateral shift or shrinkage of a river channel severely affects the river banks and the flood plain, with tremendous impact on the surrounding environment. Therefore, understanding the formation and the continual processes of river channel meandering is of paramount importance. So far, in spite of the huge number of publications about river meandering, there has not been a satisfactory theory or approach that provides a clear explanation of the formation of river meanders and the mechanics of their associated geometries. In particular, two parameters are often needed to describe meander geometry. The first one is a scale parameter such as the meander arc length. The second is a shape parameter such as the maximum angle a meander path makes with the channel mean down-path direction. These two parameters, if known, determine the meander path and geometry, for example when they are incorporated in the well-known sine-generated curve. In this study, a uniform excess energy theory is used to illustrate the origin and mechanics of formation of river meandering. This theory advocates that the longitudinal imbalance between the valley and channel slopes (with the former being greater than the latter) leads to the formation of a curved meander channel in order to reduce the excess energy through its expenditure as transverse energy loss. Two relations are developed based on this theory; one for the determination of the river channel radius of curvature at the bend apex (shape parameter) and the other for the determination of river channel sinuosity. The sinuosity equation performed very well when tested against existing available field data. In addition, existing model data were used to develop a relation between the meander arc length and the Darcy-Weisbach friction factor. Then, the meander wavelength was determined from the equations of the arc length and the sinuosity. The developed equation compared well with available field data. Effects of the transverse bed slope and grain size on river channel sinuosity are addressed. In addition, the concept of maximum channel sinuosity is introduced in order to explain the changes of river channel plan form due to changes in flow discharges and sediment loads induced by global warming and climate change.
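
As a concrete illustration of the two meander parameters mentioned above, the sketch below traces the sine-generated curve for an assumed maximum deflection angle and arc length and derives the corresponding sinuosity numerically; the parameter values are illustrative, not results from the study.

```python
# Minimal sketch of the sine-generated curve: the local flow direction varies
# as theta(s) = omega * sin(2*pi*s / L), with L the meander arc length (scale
# parameter) and omega the maximum deflection angle (shape parameter).
# Parameter values below are illustrative only.
import math

def sine_generated_path(omega_deg, arc_length, n=2000):
    """Integrate the direction series to get (x, y) over one meander wavelength."""
    omega = math.radians(omega_deg)
    ds = arc_length / n
    x = y = 0.0
    pts = [(0.0, 0.0)]
    for i in range(n):
        s = (i + 0.5) * ds
        theta = omega * math.sin(2.0 * math.pi * s / arc_length)
        x += ds * math.cos(theta)
        y += ds * math.sin(theta)
        pts.append((x, y))
    return pts

path = sine_generated_path(omega_deg=70.0, arc_length=500.0)
wavelength = path[-1][0]                 # straight-line (down-valley) distance covered
print(f"sinuosity ~ {500.0 / wavelength:.2f}")
```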

Keywords: river channel meandering, sinuosity, radius of curvature, meander arc length, uniform excess energy theory, transverse energy loss, transverse bed slope, flow discharges, sediment loads, grain size, climate change, global warming

Procedia PDF Downloads 210
3063 From Scalpel to Leadership: The Landscape for Female Neurosurgeons in the UK

Authors: Anda-Veronica Gherman, Dimitrios Varthalitis

Abstract:

Neurosurgery, like many surgical specialties, undoubtedly exhibits a significant gender gap, particularly in leadership positions. While increasing women's representation in neurosurgery is important, it is crucial to increase their presence in leadership positions. Across the globe and in Europe, there is a concerning trend of only 4% of all neurosurgical departments being chaired by women. This study aims to explore the situation regarding gender disparities in leadership in the United Kingdom and to identify possible contributing factors, as well as to discuss future strategies to bridge this gap. Methods: A literature review was conducted utilising PubMed as the main database, with search keywords including ‘female neurosurgeon’, ‘women neurosurgeon’, ‘gender disparity’, ‘leadership’ and ‘UK’. Additionally, a manual search of all neurosurgical departments in the UK was performed to identify the current female department leads and training directors. Results: The literature search identified a paucity of literature specifically addressing leadership among female neurosurgeons in the UK, with very few published papers on this topic. Despite more than half of medical students in the UK being female, only a small proportion pursue a surgical career, with neurosurgery being one of the least represented specialties. Only 27% of trainee neurosurgeons are female, and numbers are even lower at consultant level, where women represent just 8%. Findings from published studies indicated that only 6.6% of leadership positions in neurosurgery in the UK are occupied by women. Furthermore, our manual searches across UK neurosurgical departments revealed that around 5% of department lead positions are currently held by women. While this figure is slightly higher than the European average of 4%, it remains lower than the figure of 10% in other North-West European countries. The situation is slightly more positive when looking at training directors, with 15% being female. Discussion: The findings of this study highlight a significant gender disparity in leadership positions within neurosurgery in the UK, which may have important implications, perpetuating the lack of diversity in the decision-making process, limiting the career advancement opportunities of women and depriving the neurosurgical field of the voices, opinions and talents of women. With women representing half of the population, there is an undeniable need for more female leaders at the policy-making level. There are many barriers that can contribute to these numbers, including bias, stereotypes, lack of mentorship and work-life balance. A few solutions to overcome these barriers are training programmes addressing bias and impostor syndrome, leadership workshops tailored to women's needs, better workplace policies, an increase in formal mentorship, and greater visibility of women in neurosurgery leadership positions through media, speaking opportunities, conferences and awards. Lastly, more research efforts should focus on the leadership and mentorship of women in neurosurgery, with an increased number of published papers discussing these issues.

Keywords: female neurosurgeons, female leadership, female mentorship, gender disparities

Procedia PDF Downloads 11
3062 Make Populism Great Again: Identity Crisis in Western World with a Narrative Analysis of Donald Trump's Presidential Campaign Announcement Speech

Authors: Soumi Banerjee

Abstract:

In this research paper we will go deep into understanding Benedict Anderson’s definition of the nation as an imagined community and we will analyze why and how national identities were created through long and complex processes, and how there can exist strong emotional bonds between people within an imagined community, given the fact that these people have never known each other personally, but will still feel some form of imagined unity. Such identity construction, on the part of an individual or within societies, is always in some sense in a state of flux, as imagined communities are ever changing; this provides the ontological foundation of this paper. This sort of identity crisis among individuals living in the Western world, who are in search of psychological comfort and security, illustrates a possible need for spatially dislocated, ontologically insecure and vulnerable individuals to have a secure identity. To create such an identity there has to be something to build upon, which could be achieved through what may be termed ‘homesteading’. This could, in short, and in my interpretation of Kinnvall and Nesbitt’s concept, be described as a search for security that involves a search for ‘home’, where home acts as a secure place around which one can build an identity. The next half of the paper will then look into how populism and identity have played an increasingly important role in the political elections in the so-called western democracies of the world, using the U.S. as an example. Notions of ‘us and them’, the people and the elites, will be looked into and analyzed through a social constructivist theoretical lens. Here we will analyze how such narratives about identity and the nation state affect people, their personality development and identity in different ways by studying U.S. President Donald Trump’s speeches and analyze if and how he used different identity-creating narratives for gaining political and popular support. The reason to choose narrative analysis as a method in this research paper is to use the narratives as a device to understand how the perceived notions of 'us and them' can initiate a huge identity crisis within a community or a nation-state. This is a relevant subject as results and developments such as rising populist right-wing movements are being felt in a number of European states, with the so-called Brexit vote in the U.K. and the election of Donald Trump as president being two of the prime examples. This paper will then attempt to argue that these mechanisms are strengthened and gain significance in situations when humans in an economically, socially or ontologically vulnerable position, imagined or otherwise, perceive themselves, in a general and broad sense, to be under pressure, and a sense of insecurity is rising. These insecurities and this sense of being under threat have been on the rise in many of the Western states that are otherwise usually perceived to be some of the safest, democratically stable and prosperous states in the world, which makes it of interest to study what has changed and to help provide some part of the explanation as to how creating a ‘them’ in the discourse of national identity can cause a massive security crisis.

Keywords: identity crisis, migration, ontological (in)security, nation-states

Procedia PDF Downloads 238
3061 Aerodynamic Design and Optimization of Vertical Take-Off and Landing Type Unmanned Aerial Vehicles

Authors: Enes Gunaltili, Burak Dam

Abstract:

The history of the airplane started with the Wright brothers' aircraft and has improved day by day. With the help of these advancements, large aircraft are being replaced by small, unmanned air vehicles, so in this study we design this type of air vehicle. First of all, aircraft today are mainly divided into two categories: rotary-wing and fixed-wing aircraft. Fixed-wing aircraft are generally used for transport, cargo, military and other applications. Rotary-wing aircraft are used in the same areas, but each type has some advantages over the other. A rotary-wing aircraft can take off vertically from the ground and can operate in restricted areas. On the other hand, rotary-wing aircraft can generally fly shorter ranges than fixed-wing aircraft. One kind of aircraft combines the specifications of these two types. It is named the VTOL (vertical take-off and landing) type aircraft. VTOLs are able to take off and land vertically and fly horizontally. VTOL aircraft can generally fly a longer range than rotary-wing aircraft but a shorter range than fixed-wing aircraft, giving a beneficial range between the two. There are many other advantages of VTOL aircraft over rotary-wing and fixed-wing aircraft. Because of that, VTOLs have begun to be used mainly in military, cargo, search, rescue and mapping areas. Within this framework, this study answers the question of how we can design a VTOL as a small unmanned aircraft system for search and rescue applications, benefiting from the advantages of fixed-wing and rotary-wing aircraft while eliminating their disadvantages. To answer that question and design the VTOL aircraft, multidisciplinary design optimization (MDO), theoretical terminology, formulations, simulations and modelling systems based on CFD (Computational Fluid Dynamics) are used together as the design methodology to determine design parameters and steps. As a conclusion, based on tests and simulations following the design steps, suggestions on how the VTOL aircraft is designed, along with advantages, disadvantages and observations for the design parameters, are listed; the VTOL is then designed and presented with its design parameters, advantages and usage areas.

Keywords: airplane, rotary, fixed, VTOL, CFD

Procedia PDF Downloads 269
3060 Influence of Preparation, Characterisation and Application of Carbon Nano Tube

Authors: Dhaivat S. Soni, Snehal Thakor, Afroz Bhatti

Abstract:

The aim was to prepare CNTs in bulk quantity by the easiest possible method, with high purity and small diameter. The prepared CNTs were first characterized in terms of their structural parameters to confirm the formation of CNTs and their purity. The surface morphology of the CNTs was then studied using various instruments, and finally the applications of the prepared CNTs in various fields were examined. Carbon nanotubes (CNTs) were synthesized in large scale by pyrolyzing activated carbon in sealed autoclaves.

Keywords: nanostructures, nanotubes, carbon, pyrolysis

Procedia PDF Downloads 381
3059 Development of a Multi-Locus DNA Metabarcoding Method for Endangered Animal Species Identification

Authors: Meimei Shi

Abstract:

Objectives: The identification of endangered species, especially simultaneous detection of multiple species in complex samples, plays a critical role in alleged wildlife crime incidents and prevents illegal trade. This study aimed to develop a multi-locus DNA metabarcoding method for endangered animal species identification. Methods: Several pairs of universal primers were designed according to conserved mitochondrial gene regions. Experimental mixtures were artificially prepared by mixing well-defined species, including endangered species, e.g., forest musk, bear, tiger, pangolin, and sika deer. The artificial samples were prepared with 1-16 well-characterized species at 1% to 100% DNA concentrations. After multiplex-PCR amplification and parameter modification, the amplified products were analyzed by capillary electrophoresis and used for NGS library preparation. The DNA metabarcoding was carried out based on Illumina MiSeq amplicon sequencing. The data were processed with quality trimming, reads filtering, and OTU clustering; representative sequences were blasted using BLASTn. Results: According to the parameter modification and multiplex-PCR amplification results, five primer sets targeting COI, Cytb, 12S, and 16S, respectively, were selected as the NGS library amplification primer panel. High-throughput sequencing data analysis showed that the established multi-locus DNA metabarcoding method was sensitive and could accurately identify all species in the artificial mixtures, including the endangered animal species Moschus berezovskii, Ursus thibetanus, Panthera tigris, Manis pentadactyla, and Cervus nippon at 1% (DNA concentration). In conclusion, the established species identification method provides technical support for customs and forensic scientists to prevent the illegal trade of endangered animals and their products.
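
As an illustration of the quality trimming and reads filtering step in the pipeline above, the following plain-Python sketch filters a FASTQ file by mean Phred quality; the file names and quality cutoff are hypothetical, and production pipelines would normally rely on dedicated tools.

```python
# Minimal sketch of a FASTQ mean-quality read filter (Phred+33 encoding).
# File names and the quality cutoff are hypothetical placeholders.

def mean_phred(quality_line, offset=33):
    return sum(ord(c) - offset for c in quality_line) / len(quality_line)

def filter_fastq(path_in, path_out, min_mean_q=25):
    kept = total = 0
    with open(path_in) as fin, open(path_out, "w") as fout:
        while True:
            record = [fin.readline() for _ in range(4)]  # header, sequence, '+', qualities
            if not record[0]:
                break
            total += 1
            if mean_phred(record[3].rstrip("\n")) >= min_mean_q:
                fout.writelines(record)
                kept += 1
    return kept, total

# kept, total = filter_fastq("reads.fastq", "reads.filtered.fastq")
```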

Keywords: DNA metabarcoding, endangered animal species, mitochondria nucleic acid, multi-locus

Procedia PDF Downloads 120
3058 Agile Software Effort Estimation Using Regression Techniques

Authors: Mikiyas Adugna

Abstract:

Effort estimation is among the activities carried out in software development processes. An accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development. Researchers are still conducting studies on agile effort estimation to enhance prediction accuracy. For these reasons, we investigated and proposed a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has the following major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed on the dataset, with 80% used for training and 20% for testing. Following the train-test split, the two algorithms (Elastic Net and LASSO regression) are trained in two phases. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (used to tune and select optimum parameters) and 5-fold cross-validation are applied to obtain the final trained model. Finally, the final trained model is evaluated on the testing set. The experimental work is applied to an agile story point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared models. Of the two proposed algorithms, LASSO regression achieved better predictive performance, with PRED (8%) and PRED (25%) results of 100.0, an MMRE of 0.0491, an MMER of 0.0551, an MdMRE of 0.0593, an MdMER of 0.063, and an MSE of 0.0007. The results imply that the trained LASSO regression model is the most acceptable and offers higher estimation performance than that reported in the literature.
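
A minimal sketch of the described workflow (normalization, 80/20 split, default-parameter training, then grid search with 5-fold cross-validation) is given below using scikit-learn; the placeholder data, parameter grids and random seed are assumptions, not the study's dataset or tuned values.

```python
# Minimal sketch: normalize, split 80/20, fit with defaults, then tune with
# grid search + 5-fold CV. The synthetic story-point data and grids are
# illustrative placeholders only.
import numpy as np
from sklearn.model_selection import train_test_split, GridSearchCV
from sklearn.preprocessing import MinMaxScaler
from sklearn.linear_model import Lasso, ElasticNet
from sklearn.metrics import mean_squared_error

X, y = np.random.rand(210, 6), np.random.rand(210) * 20   # placeholder story-point data
X = MinMaxScaler().fit_transform(X)                        # normalize the whole dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# Phase 1: default parameters
for name, model in [("LASSO", Lasso()), ("ElasticNet", ElasticNet())]:
    model.fit(X_tr, y_tr)
    print(name, "default MSE:", mean_squared_error(y_te, model.predict(X_te)))

# Phase 2: grid search with 5-fold cross-validation
grids = {
    "LASSO": (Lasso(max_iter=10000), {"alpha": [0.001, 0.01, 0.1, 1.0]}),
    "ElasticNet": (ElasticNet(max_iter=10000),
                   {"alpha": [0.001, 0.01, 0.1, 1.0], "l1_ratio": [0.2, 0.5, 0.8]}),
}
for name, (model, grid) in grids.items():
    search = GridSearchCV(model, grid, cv=5, scoring="neg_mean_squared_error")
    search.fit(X_tr, y_tr)
    print(name, "tuned MSE:", mean_squared_error(y_te, search.predict(X_te)))
```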

Keywords: agile software development, effort estimation, elastic net regression, LASSO

Procedia PDF Downloads 43
3057 Transforming Educational Leadership With Innovative Administrative Strategies

Authors: Kofi Nkonkonya Mpuangnan, Samantha Govender, Hlengiwe Romualda Mhlongo

Abstract:

Educational leaders are skilled architects crafting a vibrant environment where growth, creativity, and adaptability can flourish within schools. Their journey is one of transformation, urging them to explore administrative strategies that align seamlessly with evolving educational models and cater to the specific needs of students, educators, and stakeholders. Through this committed effort to innovate, they seek to enhance the effectiveness and influence of educational systems, paving the way for a more inclusive and forward-thinking educational environment. In this context, the authors explored the concept of transforming educational leadership with administrative strategies in line with the following research objectives: to identify the strategies that can be adopted by transformational leaders to promote effective administrative practices in an educational setting, and to explore the roles of educational leaders in promoting collaboration in education. To answer these questions, a systematic literature review underpinned by the transformational leadership model was adopted. Concepts were therefore integrated from a variety of outlets, including academic journals, conference proceedings, and reports found within the SCOPUS, WoS, and IBSS databases. The search was aided by specific themes such as innovative administrative practices, the roles of educational leaders, and interdisciplinary approaches to administrative practices. The search process adhered to a five-step framework, including the inclusion and exclusion of studies. It was found that transformational leadership, agile methodologies, employee wellbeing, and seminars and workshops can foster a culture of innovation and creativity among teachers and staff and transform administrative practices in educational settings. It was recommended that professional development programs be organized periodically for educational leaders in educational institutions to help them revitalize their knowledge and skills in educational administration.

Keywords: educational leadership, innovative strategies, administrative practices, professional development, stakeholder engagement, student outcomes

Procedia PDF Downloads 67
3056 Evaluation of Surface Roughness Condition Using App Roadroid

Authors: Diego de Almeida Pereira

Abstract:

The roughness index of a road is considered the most important parameter describing the quality of the pavement, as it is closely related to the comfort and safety of road users. This condition can be established by means of functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that emerged from the international evaluation of pavements coordinated by the World Bank and whose limit value, for purposes of road acceptance in Brazil, is currently 2.7 m/km. This work makes use of the e.IRI parameter, obtained with the Roadroid app for smartphones running the Android operating system. This application was chosen because of its practicality of user interaction, its own cloud data storage, and the support given to universities all around the world. Data were collected for six months, once each month. The study began in March 2018, during the rainy season, which worsens the condition of the roads, and which also offered the opportunity to follow the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed, BR-020, BR-040, BR-060 and BR-070, which connect the Federal District (the area where Brasilia is located) and its surroundings; they were chosen for their economic and tourist importance, two of them under federal administration and the other two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, the present research offers a contrastive discussion between the comfort and safety conditions of roads under private concession, on which users pay a fee to the concessionaires so they can travel on a road that meets the minimum requirements for usage, and the quality of service offered on roads under Federal Government jurisdiction. Finally, data collected by the National Department of Transport Infrastructure (DNIT) by means of a laser profilometer are contrasted with data obtained by Roadroid, checking the applicability, practicality and cost-effectiveness of the app, considering its limitations.

Keywords: roadroid, international roughness index, Brazilian roads, pavement

Procedia PDF Downloads 72
3055 Motif Search-Aided Screening of the Pseudomonas syringae pv. Maculicola Genome for Genes Encoding Tertiary Alcohol Ester Hydrolases

Authors: M. L. Mangena, N. Mokoena, K. Rashamuse, M. G. Tlou

Abstract:

Tertiary alcohol ester (TAE) hydrolases are a group of esterases (EC 3.1.1.-) that catalyze the kinetic resolution of TAEs and, as a result, they are sought after for the production of optically pure tertiary alcohols (TAs), which are useful as building blocks for a number of biologically active compounds. What sets these enzymes apart is the presence of a GGG(A)X-motif in the active site, which appears to be the main reason behind their activity towards the sterically demanding TAEs. The genome of Pseudomonas syringae pv. maculicola (Psm) comprises a multitude of genes that encode esterases. We therefore hypothesize that some of these genes encode TAE hydrolases. In this study, Psm was screened for TAE hydrolase activity using the linalyl acetate (LA) plate assay, and a positive reaction was observed. As a result, the genome of Psm was screened for esterases with a GGG(A)X-motif using the motif search tool, and two potential TAE hydrolase genes (PsmEST1 and 2, 1100 and 1000 bp, respectively) were identified. PsmEST1 was amplified by PCR and the gene sequenced for confirmation. Analysis of the sequence data with the SignalP 4.1 server revealed that the protein comprises a signal peptide (22 amino acid residues) at the N-terminus. Primers specific for the gene encoding the mature protein (without the signal peptide) were designed such that they contain NdeI and XhoI restriction sites for directional cloning of the PCR products into pET28a. The gene was expressed in E. coli JM109 (DE3) and the clones were screened for TAE hydrolase activity using the LA plate assay. A positive clone was selected, overexpressed, and the protein purified using nickel affinity chromatography. The activity of the esterase towards LA was confirmed using thin layer chromatography.
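
The motif screen described above can be pictured as a simple pattern scan over predicted protein sequences; a sketch follows, in which the regular expression for the GGG(A)X motif and the demo sequence are illustrative assumptions rather than the actual tool or genome data used.

```python
# Minimal sketch of a GGG(A)X motif scan over protein sequences using a
# regular expression. The motif pattern and the demo sequence are illustrative.
import re

MOTIF = re.compile(r"GG[GA].")   # G-G-(G or A)-X, with X any residue

def scan_for_motif(proteins):
    """proteins: dict of {identifier: amino-acid sequence}; returns 1-based hit positions."""
    hits = {}
    for name, seq in proteins.items():
        matches = [(m.start() + 1, m.group()) for m in MOTIF.finditer(seq)]
        if matches:
            hits[name] = matches
    return hits

demo = {"PsmEST1_fragment": "MKLLAVTGGGAMSWQHFGDSLTA"}   # hypothetical sequence
print(scan_for_motif(demo))                               # {'PsmEST1_fragment': [(8, 'GGGA')]}
```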

Keywords: hydrolases, tertiary alcohol esters, tertiary alcohols, screening, Pseudomonas syringae pv., maculicola genome, esterase activity, linalyl acetate

Procedia PDF Downloads 342
3054 A Demonstration of How to Employ and Interpret Binary IRT Models Using the New IRT Procedure in SAS 9.4

Authors: Ryan A. Black, Stacey A. McCaffrey

Abstract:

Over the past few decades, great strides have been made towards improving the science in the measurement of psychological constructs. Item Response Theory (IRT) has been the foundation upon which statistical models have been derived to increase both precision and accuracy in psychological measurement. These models are now being used widely to develop and refine tests intended to measure an individual's level of academic achievement, aptitude, and intelligence. Recently, the field of clinical psychology has adopted IRT models to measure psychopathological phenomena such as depression, anxiety, and addiction. Because advances in IRT measurement models are being made so rapidly across various fields, it has become quite challenging for psychologists and other behavioral scientists to keep abreast of the most recent developments, much less learn how to employ them and decide which models are the most appropriate to use in their line of work. In the same vein, IRT measurement models vary greatly in complexity in several interrelated ways, including but not limited to the number of item-specific parameters estimated in a given model, the function which links the expected response and the predictor, response option formats, as well as dimensionality. As a result, inferior methods (a.k.a. Classical Test Theory methods) continue to be employed in efforts to measure psychological constructs, despite evidence showing that IRT methods yield more precise and accurate measurement. To increase the use of IRT methods, this study endeavors to provide a comprehensive overview of binary IRT models; that is, measurement models employed on test data consisting of binary response options (e.g., correct/incorrect, true/false, agree/disagree). Specifically, this study will cover the most basic binary IRT model, known as the 1-parameter logistic (1-PL) model, dating back over 50 years, up to the more recent and complex 4-parameter logistic (4-PL) model. Binary IRT models will be defined mathematically and the interpretation of each parameter will be provided. Next, all four binary IRT models will be employed on two sets of data: 1. Simulated data of N=500,000 subjects who responded to four dichotomous items and 2. A pilot analysis of real-world data collected from a sample of approximately 770 subjects who responded to four self-report dichotomous items pertaining to emotional consequences of alcohol use. Real-world data were based on responses collected on items administered to subjects as part of a scale-development study (NIDA Grant No. R44 DA023322). IRT analyses conducted on both the simulated data and the real-world pilot data will provide a clear demonstration of how to construct, evaluate, and compare binary IRT measurement models. All analyses will be performed using the new IRT procedure in SAS 9.4. SAS code to generate simulated data and analyses will be available upon request to allow for replication of results.
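
For readers unfamiliar with the model family, the sketch below writes out the 4-PL item response function and shows how fixing parameters recovers the simpler models; it is a generic illustration in Python rather than the SAS 9.4 IRT procedure used in the study, and the parameter values are made up.

```python
# Minimal sketch of the binary IRT response functions, from the 1-PL up to
# the 4-PL model. Parameter values in the example are illustrative only.
import math

def four_pl(theta, a=1.0, b=0.0, c=0.0, d=1.0):
    """P(correct | theta) = c + (d - c) / (1 + exp(-a * (theta - b))).

    a: discrimination, b: difficulty, c: lower asymptote (guessing),
    d: upper asymptote (slipping). Fixing c=0, d=1 gives the 2-PL model;
    additionally fixing a=1 gives the 1-PL (Rasch) model.
    """
    return c + (d - c) / (1.0 + math.exp(-a * (theta - b)))

for theta in (-2.0, 0.0, 2.0):
    print(theta, round(four_pl(theta, a=1.5, b=0.5, c=0.2, d=0.95), 3))
```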

Keywords: instrument development, item response theory, latent trait theory, psychometrics

Procedia PDF Downloads 336
3053 On a Negative Relation between Bacterial Taxis and Turing Pattern Formation

Authors: A. Elragig, S. Townley, H. Dreiwi

Abstract:

In this paper, we introduce a bacteria-leukocyte model with bacterial chemotaxis. We assume that bacteria develop a tactic defense mechanism as a response to leukocyte phagocytosis. We explore the effect of this tactic motion on the Turing space in two parameter spaces. A fine tuning of bacterial chemotaxis shows a significant effect on the development of a non-uniform steady state.

Keywords: chemotaxis-diffusion driven instability, bacterial chemotaxis, mathematical biology, ecology

Procedia PDF Downloads 351
3052 Cross-Cultural Analysis of the Impact of Project Atmosphere on Project Success and Failure

Authors: Omer Livvarcin, Mary Kay Park, Michael Miles

Abstract:

The current literature includes a few studies that mention the impact of relations between teams, the business environment, and experiences from previous projects. There is, however, limited research that treats the phenomenon of project atmosphere (PA) as a whole. This is especially true of research identifying parameters and sub-parameters, which allow project management (PM) teams to build a project culture that ultimately imbues project success. This study’s findings identify a number of key project atmosphere parameters and sub-parameters that affect project management success. One key parameter identified in the study is a cluster related to cultural concurrence, including artifacts such as policies and mores, values, perceptions, and assumptions. A second cluster centers on motivational concurrence, including such elements as project goals and team-member expectations, moods, morale, motivation, and organizational support. A third parameter cluster relates to experiential concurrence, with a focus on project and organizational memory, previous internal PM experience, and external environmental PM history and experience). A final cluster of parameters is comprised of those falling in the area of relational concurrence, including inter/intragroup relationships, role conflicts, and trust. International and intercultural project management data was collected and analyzed from the following countries: Canada, China, Nigeria, South Korea and Turkey. The cross-cultural nature of the data set suggests increased confidence that the findings will be generalizable across cultures and thus applicable for future international project management success. The intent of the identification of project atmosphere as a critical project management element is that a clear understanding of the dynamics of its sub-parameters upon projects may significantly improve the odds of success of future international and intercultural projects.

Keywords: project management, project atmosphere, cultural concurrence, motivational concurrence, relational concurrence

Procedia PDF Downloads 303
3051 Role of Spatial Variability in the Service Life Prediction of Reinforced Concrete Bridges Affected by Corrosion

Authors: Omran M. Kenshel, Alan J. O'Connor

Abstract:

Estimating the service life of Reinforced Concrete (RC) bridge structures located in corrosive marine environments is of great importance to their owners/engineers. Traditionally, bridge owners/engineers have relied more on subjective engineering judgment, e.g. visual inspection, in their estimation approach. However, because financial resources are often limited, rational calculation methods of estimation are needed to aid in making reliable and more accurate predictions of the service life of RC structures, in order to direct funds to the bridges found to be the most critical. Criticality of the structure can be considered either from the Structural Capacity (i.e. Ultimate Limit State) or from the Serviceability viewpoint, whichever is adopted. This paper considers the service life of the structure only from the Structural Capacity viewpoint. Considering the great variability associated with the parameters involved in the estimation process, the probabilistic approach is most suited. The probabilistic modelling adopted here used the Monte Carlo simulation technique to estimate the Reliability (i.e. Probability of Failure) of the structure under consideration. In this paper, the authors used their own experimental data for the Correlation Length (CL) of the most important deterioration parameters. The CL is a parameter of the Correlation Function (CF) by which the spatial fluctuation of a certain deterioration parameter is described. The CL data used here were produced by analyzing 45 chloride profiles obtained from a 30-year-old RC bridge located in a marine environment. The service life of the structure was predicted in terms of the load carrying capacity of an RC bridge beam girder. The analysis showed that the influence of spatial variability (SV) is only evident if the reliability of the structure is governed by the Flexure failure rather than by the Shear failure.
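
A minimal sketch of the Monte Carlo reliability estimate described above follows: sample the uncertain capacity and load effect, count limit-state violations, and report the probability of failure. The distributions and their parameters are hypothetical stand-ins, not the chloride-profile or girder data used in the study.

```python
# Minimal sketch of a Monte Carlo estimate of the probability of failure for
# a corrosion-affected girder. Distributions and parameters are hypothetical.
import random

def probability_of_failure(n=200_000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        resistance = rng.lognormvariate(mu=6.0, sigma=0.10)   # e.g. flexural capacity (kNm)
        corrosion_loss = rng.normalvariate(0.15, 0.05)        # fraction of capacity lost
        load_effect = rng.normalvariate(280.0, 40.0)          # applied moment (kNm)
        if resistance * max(0.0, 1.0 - corrosion_loss) < load_effect:
            failures += 1
    return failures / n

print(f"Pf ~ {probability_of_failure():.4f}")
```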

Keywords: Chloride-induced corrosion, Monte-Carlo simulation, reinforced concrete, spatial variability

Procedia PDF Downloads 462
3050 Parameter Estimation of Additive Genetic and Unique Environment (AE) Model on Diabetes Mellitus Type 2 Using Bayesian Method

Authors: Andi Darmawan, Dewi Retno Sari Saputro, Purnami Widyaningsih

Abstract:

Diabetes mellitus (DM) is a chronic disease in humans that occurs when the pancreas cannot produce enough insulin or the body uses insulin ineffectively, which causes an increased level of glucose in the blood, called hyperglycemia. In Indonesia, DM is a serious health problem because it can cause blindness, kidney disease, diabetic feet (gangrene), and stroke. DM can also be divided according to its main causes into type 1, type 2, and gestational diabetes. Diabetes type 1, previously known as insulin-dependent diabetes, is due to a lack of production of the insulin hormone. Diabetes type 2, previously known as non-insulin-dependent diabetes, is due to ineffective use of insulin, while gestational diabetes is hyperglycemia found during pregnancy. The type most commonly found in patients is DM type 2. The main factors of this disease are genetic (A) and life style (E). A disease with these two factors can be described with the additive genetic and unique environment (AE) model. This article discusses parameter estimation of the AE model using the Bayesian method and the simulation of inherited characters from parent to offspring. In the AE model, there are a response variable, predictor variables, and parameters representing the population under study. The population can be measured through a random sample. The response and predictor variables can be determined from the sample, while the parameters are unknown, so the parameters must be estimated based on the sample. Estimates of the AE model parameters were obtained from a joint posterior distribution. The simulation was conducted to obtain the values of the genetic variance and the life style variance. The results of the simulation are 0.3600 for the genetic variance and 0.0899 for the life style variance. Therefore, the variance of the genetic factor in DM type 2 is greater than that of life style.
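
From the AE decomposition, the relative contribution of each factor follows directly from the two reported variance components; the short sketch below carries out that arithmetic with the values quoted above.

```python
# Minimal sketch: relative shares of the AE variance components,
# using h2 = var_A / (var_A + var_E) from the AE decomposition.
var_A = 0.3600   # additive genetic variance (simulation result)
var_E = 0.0899   # unique environment / life-style variance (simulation result)

total = var_A + var_E
heritability = var_A / total
print(f"genetic share   = {heritability:.3f}")    # ~0.800
print(f"lifestyle share = {var_E / total:.3f}")   # ~0.200
```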

Keywords: AE model, Bayesian method, diabetes mellitus type 2, genetic, life style

Procedia PDF Downloads 264
3049 Analyzing the Effects of Bio-fibers on the Stiffness and Strength of Adhesively Bonded Thermoplastic Bio-fiber Reinforced Composites by a Mixed Experimental-Numerical Approach

Authors: Sofie Verstraete, Stijn Debruyne, Frederik Desplentere

Abstract:

Considering environmental issues, interest in applying sustainable materials in industry is increasing. Specifically for composites, there is an emerging need for suitable materials and bonding techniques. As an alternative to traditional composites, short bio-fiber (cellulose-based flax) reinforced Polylactic Acid (PLA) is gaining popularity. However, these thermoplastic-based composites show issues in adhesive bonding. This research focusses on analyzing the effects of the fibers near the bonding interphase. The research uses injection molded plate structures. A first important parameter concerns the fiber volume fraction, which directly affects the adhesion characteristics of the surface. This parameter is varied between 0 (pure PLA) and 30%. Next to fiber volume fraction, the orientation of fibers near the bonding surface governs the adhesion characteristics of the injection molded parts. This parameter is not directly controlled in this work, but its effects are analyzed. Surface roughness also greatly determines surface wettability, thus adhesion. Therefore, this research work considers three different roughness conditions. Different mechanical treatments yield roughness values up to 0.5 mm. In this preliminary research, only one adhesive type is considered: a two-part epoxy which is cured at 23 °C for 48 hours. In order to assure a dedicated parametric study, simple and reproducible adhesive bonds are manufactured. Both single lap (substrate width 25 mm, thickness 3 mm, overlap length 10 mm) and double lap tests are considered since these are well documented and quite straightforward to conduct. These tests are conducted for the different substrate and surface conditions. Dog bone tensile testing is applied to retrieve the stiffness and strength characteristics of the substrates (with different fiber volume fractions). Numerical modelling (non-linear FEA) relates the effects of the considered parameters on the stiffness and strength of the different joints, obtained through the abovementioned tests. Ongoing work deals with developing dedicated numerical models, incorporating the different considered adhesion parameters. Although this work is the start of an extensive research project on the bonding characteristics of thermoplastic bio-fiber reinforced composites, some interesting results are already apparent. Firstly, a clear correlation between the surface roughness and the wettability of the substrates is observed. Given the adhesive type (and viscosity), it is noticed that the increase in surface energy is proportional to the surface roughness, to some extent. This becomes more pronounced when the fiber volume fraction increases. Secondly, the ultimate bond strength (single lap) also increases with increasing fiber volume fraction. On a macroscopic level, this confirms the positive effect of fibers near the adhesive bond line.

Keywords: adhesive bonding, bio-fiber reinforced composite, flax fibers, lap joint

Procedia PDF Downloads 112
3048 Clinical Efficacy of Nivolumab and Ipilimumab Combination Therapy for the Treatment of Advanced Melanoma: A Systematic Review and Meta-Analysis of Clinical Trials

Authors: Zhipeng Yan, Janice Wing-Tung Kwong, Ching-Lung Lai

Abstract:

Background: Advanced melanoma accounts for the majority of skin cancer deaths due to its poor prognosis. Nivolumab and ipilimumab are monoclonal antibodies targeting programmed cell death protein 1 (PD-1) and cytotoxic T-lymphocyte antigen 4 (CTLA-4), respectively. Nivolumab and ipilimumab combination therapy has been proven to be effective for advanced melanoma. This systematic review and meta-analysis evaluates its clinical efficacy and adverse events. Method: A systematic search was performed on databases (PubMed, Embase, Medline, Cochrane) on 21 June 2020. Search keywords were nivolumab, ipilimumab, melanoma, and randomised controlled trials. Clinical trials fulfilling the inclusion criteria were selected to evaluate the efficacy of combination therapy in terms of prolongation of progression-free survival (PFS), overall survival (OS), and objective response rate (ORR). The odds ratios and distributions of grade 3 or above adverse events were documented. Subgroup analysis was performed based on PD-L1 expression status and BRAF mutation status. Results: Compared with nivolumab monotherapy, the hazard ratios of PFS and OS and the odds ratio of ORR for combination therapy were 0.64 (95% CI, 0.48-0.85; p=0.002), 0.84 (95% CI, 0.74-0.95; p=0.007) and 1.76 (95% CI, 1.51-2.06; p < 0.001), respectively. Compared with ipilimumab monotherapy, the hazard ratios of PFS and OS and the odds ratio of ORR were 0.46 (95% CI, 0.37-0.57; p < 0.001), 0.54 (95% CI, 0.48-0.61; p < 0.001) and 6.18 (95% CI, 5.19-7.36; p < 0.001), respectively. For combination therapy, the odds ratios of grade 3 or above adverse events were 4.71 (95% CI, 3.57-6.22; p < 0.001) compared with nivolumab monotherapy and 3.44 (95% CI, 2.49-4.74; p < 0.001) compared with ipilimumab monotherapy. High PD-L1 expression level and BRAF mutation were associated with better clinical outcomes in patients receiving combination therapy. Conclusion: Combination therapy is effective for the treatment of advanced melanoma. Adverse events were common but manageable. Better clinical outcomes were observed in patients with high PD-L1 expression levels and positive BRAF mutation status.

Keywords: nivolumab, ipilimumab, advanced melanoma, systematic review, meta-analysis

Procedia PDF Downloads 128
3047 Potential of Aerodynamic Feature on Monitoring Multilayer Rough Surfaces

Authors: Ibtissem Hosni, Lilia Bennaceur Farah, Saber Mohamed Naceur

Abstract:

In order to assess water availability in the soil, it is crucial to have information about the distributed soil moisture content; this parameter helps to understand the effect of humidity on the exchange between soil, plant cover and atmosphere, in addition to fully understanding the surface processes and the hydrological cycle. On the other hand, aerodynamic roughness length is a surface parameter that scales the vertical profile of the horizontal component of the wind speed and characterizes the surface's ability to absorb the momentum of the airflow. In numerous applications of surface hydrology and meteorology, aerodynamic roughness length is an important parameter for estimating momentum, heat and mass exchange between the soil surface and the atmosphere. In this regard, it is important to consider the impact of atmospheric factors in general, and natural erosion in particular, on the process of soil evolution and on the characterization and prediction of its physical parameters. The study of wind-induced movements over a vegetated soil surface, whether spaced plants or plant cover, is motivated by significant research efforts in agronomy and biology. The major known problem in this area concerns crop damage by wind, which is a booming field of research. Obviously, most soil surface models require information about the aerodynamic roughness length and its temporal and spatial variability. We have used a bi-dimensional multi-scale (2D MLS) roughness description in which the surface is considered as a superposition of a finite number of one-dimensional Gaussian processes, each one having a spatial scale, using the wavelet transform and the Mallat algorithm to describe natural surface roughness. We have introduced a multi-layer description of soil surface humidity to take into account a volume component in the problem of the backscattered radar signal. As humidity increases, the dielectric constant of the soil-water mixture increases, and this change is detected by microwave sensors. Nevertheless, many existing models in the field of radar imagery cannot be applied directly to areas covered with vegetation due to the vegetation backscattering. Thus, the radar response corresponds to the combined signature of the vegetation layer and the layer of the soil surface. Therefore, the key issue in the numerical estimation of soil moisture is to separate the two contributions and calculate the scattering behaviors of the two layers by defining the scattering of the vegetation and of the soil below. This paper presents a synergistic methodology for estimating roughness and soil moisture from C-band radar measurements. The methodology represents a microwave/optical model which has been used to calculate the scattering behavior of the aerodynamic vegetation-covered area by defining the scattering of the vegetation and the soil below.
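
In the spirit of the 2D multi-scale (Mallat-type) roughness description above, the sketch below decomposes a synthetic rough surface into wavelet scales and reports the detail energy per scale; the synthetic surface and wavelet choice are assumptions for illustration, not the surface model used in the study.

```python
# Minimal sketch of a multi-scale wavelet decomposition of a rough surface
# (Mallat-style multiresolution analysis). Surface data and wavelet choice
# are illustrative placeholders.
import numpy as np
import pywt

rng = np.random.default_rng(0)
surface = rng.normal(0.0, 1.0, size=(256, 256))          # synthetic rough-surface heights

coeffs = pywt.wavedec2(surface, wavelet="db2", level=4)  # 4-level Mallat pyramid
approx, details = coeffs[0], coeffs[1:]                  # coarse approximation + detail bands

for level, (cH, cV, cD) in enumerate(details, start=1):
    energy = float(np.sum(cH**2 + cV**2 + cD**2))        # roughness energy at this scale
    print(f"scale {level}: detail energy = {energy:.1f}")
```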

Keywords: aerodynamic, bi-dimensional, vegetation, synergistic

Procedia PDF Downloads 257
3046 Use of a Business Intelligence Software for Interactive Visualization of Data on the Swiss Elite Sports System

Authors: Corinne Zurmuehle, Andreas Christoph Weber

Abstract:

In 2019, the Swiss Federal Institute of Sport Magglingen (SFISM) conducted a mixed-methods study on the Swiss elite sports system, which yielded a large quantity of research data. In a quantitative online survey, 1151 elite sports athletes, 542 coaches, and 102 Performance Directors of national sports federations (NF) submitted their perceptions of the national support measures of the Swiss elite sports system. These data provide an essential database for the further development of the Swiss elite sports system. The results were published in a report dividing them into 40 Olympic summer and 14 winter sports (Olympic classification). The authors of this paper assume that, in practice, this division is too unspecific to assess where further measures would be needed. The aim of this paper is to find appropriate parameters for data visualization in order to identify disparities in sports promotion that allow an assessment of where further interventions by Swiss Olympic (the NF umbrella organization) are required. Method: First, the variable 'salary earned from sport' was defined as a variable to measure the impact of elite sports promotion. This variable was chosen as a measure because it represents an important indicator of the professionalization of elite athletes and therefore reflects the national-level sports promotion measures applied by Swiss Olympic. Afterwards, the variable salary was tested with regard to its correlation with the Olympic classification [a] by calculating the Eta coefficient. To estimate the appropriate parameters for data visualization, the correlation between salary and four further parameters was analyzed by calculating the Eta coefficient: [a] sport; [b] prioritization (from 1 to 5) of the sports by Swiss Olympic; [c] gender; [d] employment level in sports. Results & Discussion: The analyses reveal a very small correlation between salary and Olympic classification (η² = .011, p = .005). Gender demonstrates an even smaller correlation (η² = .006, p = .014). The parameter prioritization correlates with a small effect (η² = .017, p = .001), as does employment level (η² = .028, p < .001). The highest correlation was identified for the parameter sport, with a moderate effect (η² = .075, p = .047). The analyses show that the disparities in sports promotion cannot be determined by one particular parameter but are presumably explained by a combination of several parameters. We argue that the possibility of combining parameters for data visualization should be enabled when the analysis is provided to Swiss Olympic for further strategic decision-making. However, the inclusion of multiple parameters massively multiplies the number of graphs and is therefore not suitable for practical use. Therefore, we suggest applying interactive dashboards for data visualization using Business Intelligence Software. Practical & Theoretical Contribution: This contribution provides the first attempt to use Business Intelligence Software for strategic decision-making in national-level sports regarding the prioritization of national resources for sports and athletes. It allows specific parameters with a significant effect to be set as filters. By using filters, parameters can be combined, compared against each other, and set individually for each strategic decision.
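
The Eta coefficient (correlation ratio) used above can be computed as the between-group share of the total variance of salary; the sketch below shows this with made-up salary groups, not the survey data.

```python
# Minimal sketch of the correlation ratio (eta squared) relating a continuous
# outcome (salary) to a categorical parameter (e.g. sport). Data are made up.
def eta_squared(groups):
    """groups: list of lists of salaries, one list per category."""
    all_values = [v for g in groups for v in g]
    grand_mean = sum(all_values) / len(all_values)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_total = sum((v - grand_mean) ** 2 for v in all_values)
    return ss_between / ss_total

salaries_by_sport = [
    [20_000, 35_000, 28_000],   # hypothetical sport A
    [55_000, 60_000, 48_000],   # hypothetical sport B
    [12_000, 18_000, 15_000],   # hypothetical sport C
]
print(f"eta^2 = {eta_squared(salaries_by_sport):.3f}")
```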

Keywords: data visualization, business intelligence, Swiss elite sports system, strategic decision-making

Procedia PDF Downloads 77
3045 Terahertz Glucose Sensors Based on Photonic Crystal Pillar Array

Authors: S. S. Sree Sanker, K. N. Madhusoodanan

Abstract:

Optical biosensors are a dominant alternative to traditional analytical methods because of their small size, simple design and high sensitivity. Photonic sensing is one of the recently advancing technologies for biosensors. It measures the change in refractive index induced by the difference in molecular interactions due to the change in concentration of the analyte. Glucose is an aldose monosaccharide, which is a metabolic source in many organisms. Terahertz waves occupy the region between infrared and microwaves in the electromagnetic spectrum. Terahertz waves are expected to be applied to various types of sensors for detecting harmful substances in blood, cancer cells in skin and micro bacteria in vegetables. We have designed glucose sensors using silicon-based 1D and 2D photonic crystal pillar arrays in the terahertz frequency range. The 1D photonic crystal has rectangular pillars with a height of 100 µm, a length of 1600 µm and a width of 50 µm. The array period of the crystal is 500 µm. The 2D photonic crystal has a 5×5 cylindrical pillar array with an array period of 75 µm. The height and diameter of the pillars are 160 µm and 100 µm, respectively. The two samples considered in this work are blood and a glucose solution, labelled sample 1 and sample 2, respectively. The proposed sensor detects the concentration of glucose in the samples from 0 to 100 mg/dL. For this, the crystal was irradiated with 0.3 to 3 THz waves. By analyzing the obtained S-parameters, the refractive index of the crystal corresponding to a particular concentration of glucose was measured using the parameter retrieval method. The refractive indices of the two crystals decreased gradually with the increase in the concentration of glucose in the sample. For the 1D photonic crystal, a gradual decrease in refractive index was observed at 1 THz. The 2D photonic crystal showed this behavior at 2 THz. The proposed sensor was simulated using CST Microwave Studio. This will enable us to develop a model which can be used to characterize a glucose sensor. The present study is expected to contribute to blood glucose monitoring.

Keywords: CST microwave studio, glucose sensor, photonic crystal, terahertz waves

Procedia PDF Downloads 263
3044 Investigation of Ductile Failure Mechanisms in SA508 Grade 3 Steel via X-Ray Computed Tomography and Fractography Analysis

Authors: Suleyman Karabal, Timothy L. Burnett, Egemen Avcu, Andrew H. Sherry, Philip J. Withers

Abstract:

SA508 Grade 3 steel is widely used in the construction of nuclear pressure vessels, where its fracture toughness plays a critical role in ensuring operational safety and reliability. Understanding the ductile failure mechanisms in this steel grade is crucial for designing robust pressure vessels that can withstand severe nuclear environment conditions. In the present study, round bar specimens of SA508 Grade 3 steel with four distinct notch geometries were subjected to tensile loading while capturing continuous 2D images at 5-second intervals in order to monitor any alterations in their geometries to construct true stress-strain curves of the specimens. 3D reconstructions of X-ray computed tomography (CT) images at high-resolution (a spatial resolution of 0.82 μm) allowed for a comprehensive assessment of the influences of second-phase particles (i.e., manganese sulfide inclusions and cementite particles) on ductile failure initiation as a function of applied plastic strain. Additionally, based on 2D and 3D images, plasticity modeling was executed, and the results were compared to experimental data. A specific ‘two-parameter criterion’ was established and calibrated based on the correlation between stress triaxiality and equivalent plastic strain at failure initiation. The proposed criterion demonstrated substantial agreement with the experimental results, thus enhancing our knowledge of ductile fracture behavior in this steel grade. The implementation of X-ray CT and fractography analysis provided new insights into the diverse roles played by different populations of second-phase particles in fracture initiation under varying stress triaxiality conditions.
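
The abstract does not state the functional form of the 'two-parameter criterion'; one common way to express such a failure locus is an exponential decay of failure strain with stress triaxiality, sketched below with two free constants to be calibrated against tests. Both the form and the constants are assumptions for illustration, not the authors' calibrated criterion.

```python
# Hedged sketch of a two-parameter ductile failure locus relating equivalent
# plastic strain at failure to stress triaxiality. The exponential form and
# the coefficients D1, D2 are illustrative assumptions, not the calibrated
# criterion from the paper.
import math

def failure_strain(triaxiality, d1, d2):
    """eps_f(eta) = d1 * exp(-d2 * eta): ductility drops as triaxiality rises."""
    return d1 * math.exp(-d2 * triaxiality)

D1, D2 = 1.2, 1.5   # hypothetical calibration constants
for eta in (0.33, 0.6, 1.0, 1.5):   # smooth bar up to sharply notched geometries
    print(f"eta = {eta:.2f} -> eps_f = {failure_strain(eta, D1, D2):.3f}")
```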

Keywords: ductile fracture, two-parameter criterion, x-ray computed tomography, stress triaxiality

Procedia PDF Downloads 73
3043 Choosing an Optimal Epsilon for Differentially Private Arrhythmia Analysis

Authors: Arin Ghazarian, Cyril Rakovski

Abstract:

Differential privacy has become the leading technique to protect the privacy of individuals in a database while allowing useful analysis to be done and the results to be shared. It puts a guarantee on the amount of privacy loss in the worst-case scenario. Differential privacy is not a toggle between full privacy and zero privacy. It controls the tradeoff between the accuracy of the results and the privacy loss using a single key parameter called epsilon (ε).
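
As an illustration of how epsilon governs the accuracy-privacy trade-off, the sketch below applies the standard Laplace mechanism to a count query; the query, sensitivity and epsilon values are illustrative and not tied to the arrhythmia data discussed in the paper.

```python
# Minimal sketch of the Laplace mechanism: noise scale = sensitivity / epsilon,
# so a smaller epsilon means more noise and stronger privacy. Values are
# illustrative placeholders.
import math
import random

def laplace_mechanism(true_value, sensitivity, epsilon, rng=random):
    """Release true_value + Laplace(0, sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    u = rng.random() - 0.5                       # uniform in [-0.5, 0.5)
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

true_count = 112            # e.g. number of records with a given arrhythmia (hypothetical)
for eps in (0.1, 1.0, 5.0):
    print(eps, round(laplace_mechanism(true_count, sensitivity=1, epsilon=eps), 2))
```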

Keywords: arrhythmia, cardiology, differential privacy, ECG, epsilon, medical data, privacy preserving analytics, statistical databases

Procedia PDF Downloads 134
3042 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the roadmap for future development in FreqAI.
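
As a rough sketch of the outlier-removal idea described above (not FreqAI's actual API; the feature matrices, target, libraries and thresholds here are placeholders), one could define the training parameter space in normalized feature coordinates and discard prediction points that fall too far outside it:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import NearestNeighbors
from lightgbm import LGBMRegressor

# Hedged sketch: fit a regressor on the sliding training window, characterize
# the training parameter space by nearest-neighbor distances in normalized
# feature coordinates, and mask predictions whose features lie outside it.
# X_train / y_train would be built from indicator libraries (e.g. TA-Lib,
# pandas-ta); all names and thresholds are illustrative assumptions.

def fit_window(X_train, y_train, n_neighbors=10):
    scaler = StandardScaler().fit(X_train)
    Xs = scaler.transform(X_train)
    nn = NearestNeighbors(n_neighbors=n_neighbors).fit(Xs)
    # typical neighbor distance inside the training window sets the "inlier" scale
    d_train, _ = nn.kneighbors(Xs)
    threshold = np.quantile(d_train.mean(axis=1), 0.99)
    model = LGBMRegressor(n_estimators=200).fit(X_train, y_train)
    return model, scaler, nn, threshold

def predict_with_outlier_mask(model, scaler, nn, threshold, X_new):
    d_new, _ = nn.kneighbors(scaler.transform(X_new))
    is_outlier = d_new.mean(axis=1) > threshold       # too far from training data
    preds = model.predict(X_new)
    return np.where(is_outlier, np.nan, preds)        # discard out-of-space points
```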

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 75
3041 Implementation of an Indoor Air Quality Index in Hong Kong

Authors: Kwok W. Mui, Ling T. Wong, Tsz W. Tsang

Abstract:

Many Hong Kong people nowadays spend most of their time working indoors. Since poor Indoor Air Quality (IAQ) potentially leads to discomfort, ill health, low productivity and even absenteeism in workplaces, establishing statutory IAQ control to safeguard the well-being of residents is urgently needed. Although policies, strategies, and guidelines for workplace IAQ diagnosis have been developed elsewhere and followed by remedial works, some of those workplaces or buildings were already at a relatively late stage of their IAQ problems when the investigation or remedial work started. Screening for IAQ problems should therefore be initiated, as it provides the minimum baseline IAQ information requisite to the resolution of the problems. It is not practical to sample all air pollutants that exist. Nevertheless, for statutory control, reliable, rapid screening is essential, following a compromise strategy that balances costs against detection of key pollutants. This study investigates the feasibility of using an IAQ index as a parameter of IAQ control in Hong Kong. The index is a screening parameter to identify workplaces with unsatisfactory IAQ and will highlight where a fully effective IAQ monitoring and assessment is needed for an intensive diagnosis. A number of representative common indoor pollutants have already been identified from extensive IAQ assessments. The selection of pollutants serves as a surrogate for IAQ control, which consists of dilution, mitigation, and emission control. The IAQ Index and assessment consider high fractional quantities of these common measurement parameters. With the existing comprehensive regional IAQ database and the research team's IAQ Index as the pre-assessment probability, and the unsatisfactory IAQ prevalence from this study as the post-assessment probability, thresholds for maintaining the current measures or performing a further IAQ test or IAQ remedial measures will be proposed. With justified resources, the proposed IAQ Index and assessment protocol could be a useful tool for setting up a practical public IAQ surveillance programme and policy in Hong Kong.
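
The screening idea can be illustrated with a minimal sketch, assuming the index is built from fractional quantities of measured pollutants relative to reference limits; the exact pollutant set, limits, and aggregation rule used by the authors are not given in the abstract, so the values below are hypothetical.

```python
# Sketch of a screening-type IAQ index (an assumption about its form, not the
# authors' exact definition): each pollutant reading is expressed as a fraction
# of its reference limit, and the index summarizes those fractional quantities.
# A high aggregate, or any single fraction above 1, flags the workplace for a
# full diagnostic IAQ assessment.

LIMITS = {          # hypothetical 8-hour reference limits
    "CO2_ppm": 1000.0,
    "CO_ppm": 8.7,
    "PM10_ug_m3": 180.0,
    "TVOC_ppb": 600.0,
}

def iaq_index(readings: dict) -> dict:
    fractions = {k: readings[k] / LIMITS[k] for k in LIMITS if k in readings}
    aggregate = sum(fractions.values()) / len(fractions)
    return {
        "fractions": fractions,
        "aggregate": aggregate,
        "needs_full_assessment": aggregate > 1.0 or max(fractions.values()) > 1.0,
    }

print(iaq_index({"CO2_ppm": 1250, "CO_ppm": 2.1, "PM10_ug_m3": 95, "TVOC_ppb": 480}))
```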

Keywords: assessment, index, indoor air quality, surveillance programme

Procedia PDF Downloads 256
3040 A Comparative Study of the Proposed Models for the Components of the National Health Information System

Authors: M. Ahmadi, Sh. Damanabi, F. Sadoughi

Abstract:

The National Health Information System plays an important role in ensuring timely and reliable access to health information, which is essential for strategic and operational decisions that improve health and the quality and effectiveness of health care. In other words, by using the National Health Information System, the quality of health data, information and knowledge used to support decision-making at all levels and areas of the health sector can be improved. Since full identification of the components of this system seems necessary for better planning and management of the factors influencing its performance, this study explores and compares different perspectives on its components. Methods: This is a descriptive, comparative study. The study material comprises printed and electronic documents describing the components of the national health information system in three parts: input, process, and output. Information was sought using library resources and internet searches, and the data were analysed using comparative tables and qualitative analysis. Results: The findings showed that there are three different perspectives presenting the components of the national health information system: the Lippeveld, Sauerborn, and Bodart model of 2000, the Health Metrics Network (HMN) model of the World Health Organization of 2008, and Gattini's 2009 model. In the input section (resources and structure), all three models outlined above require components of management and leadership; planning and design of programs; and supply of staff, software and hardware facilities, and equipment. In addition, in the process section, all three models point to actions ensuring the quality of the health information system, and in the output section, except for the Lippeveld model, the other two models consider information products and the usage and distribution of information as components of the national health information system. Conclusion: The results showed that all three models discuss the components of health information in the input section only briefly. However, the Lippeveld model has overlooked the components of the national health information system in the process and output sections. Therefore, it seems that the Health Metrics Network model provides a comprehensive presentation of the components of the health information system in all three sections: input, process, and output.

Keywords: National Health Information System, components of the NHIS, Lippeveld Model

Procedia PDF Downloads 406
3039 An Extension of the Generalized Extreme Value Distribution

Authors: Serge Provost, Abdous Saboor

Abstract:

A q-analogue of the generalized extreme value distribution which includes the Gumbel distribution is introduced. The additional parameter q allows for increased modeling flexibility. The resulting distribution can have a finite, semi-infinite or infinite support. It can also produce several types of hazard rate functions. The model parameters are determined by making use of the method of maximum likelihood. It will be shown that it compares favourably to three related distributions in connection with the modeling of a certain hydrological data set.
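
For context, the standard GEV distribution and one possible q-exponential generalization can be written as follows; the specific parameterization adopted by the authors is not given in the abstract, so this form is only a sketch of the idea.

```latex
% Sketch only: the standard GEV cdf and one possible q-exponential generalization.
% The parameterization actually used by the authors is not specified in the abstract.
\[
  F(x;\mu,\sigma,\xi) = \exp\!\left\{-\left[1+\xi\,\frac{x-\mu}{\sigma}\right]^{-1/\xi}\right\},
  \qquad 1+\xi\,\frac{x-\mu}{\sigma}>0,
\]
% with the Gumbel distribution recovered in the limit \xi \to 0. Replacing exp
% by the q-exponential \exp_q(z) = [1+(1-q)z]_+^{1/(1-q)} gives a q-analogue
\[
  F_q(x;\mu,\sigma,\xi) = \exp_q\!\left\{-\left[1+\xi\,\frac{x-\mu}{\sigma}\right]^{-1/\xi}\right\},
\]
% which reduces to the ordinary GEV as q \to 1 and, depending on q, changes the
% tail weight and can yield a finite, semi-infinite or infinite support.
```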

Keywords: extreme value theory, generalized extreme value distribution, goodness-of-fit statistics, Gumbel distribution

Procedia PDF Downloads 332
3038 Data Protection and Regulation Compliance on Handling Physical Child Abuse Scenarios - A Scoping Review

Authors: Ana Mafalda Silva, Rebeca Fontes, Ana Paula Vaz, Carla Carreira, Ana Corte-Real

Abstract:

Decades of research on the topic of interpersonal violence against minors highlight five main conclusions: 1) it causes harmful effects on children's development and health; 2) it is prevalent; 3) it violates children's rights; 4) it can be prevented; and 5) parents are the main aggressors. Child abuse scenarios are identified through clinical observation, administrative data and self-reports. The most used instruments are self-reports; however, there are no valid and reliable self-report instruments for minors, and existing ones consist of a retrospective interpretation of the situation by the victim, already in adulthood, and/or by the parents. Clinical observation and the collection of information, namely from the orofacial region, are essential in the early identification of these situations. The management of medical data, as personal data, must comply with the General Data Protection Regulation (GDPR) in Europe and with the General Law of Data Protection (LGPD) in Brazil. This review aims to answer the question: in a situation of medical assistance to minors, where interpersonal violence due to maltreatment is suspected, is it necessary for the guardians to provide consent for the recording and sharing of personal data, namely medical data? A scoping review was carried out based on searches of the Web of Science and PubMed databases. Four papers and two documents from the grey literature were selected. As found, the process of identifying and reporting child abuse by the health professional, and the necessary early intervention in defense of the minor as a victim of abuse, comply with the guidelines expressed in the GDPR and the LGPD. Thus, notification of maltreatment scenarios by health professionals should be a priority, and there should be no fear or anxiety about legal repercussions standing in the way of collecting and processing the data necessary for the reporting procedure that safeguards and promotes the welfare of children living with abuse.

Keywords: child abuse, disease notifications, ethics, healthcare assistance

Procedia PDF Downloads 82
3037 Normal Weight Obesity among Female Students: BMI as a Non-Sufficient Tool for Obesity Assessment

Authors: Krzysztof Plesiewicz, Izabela Plesiewicz, Krzysztof Chiżyński, Marzenna Zielińska

Abstract:

Background: Obesity is an independent risk factor for cardiovascular diseases. Several anthropometric parameters have been proposed to estimate the level of obesity, but there is still no agreement on which one is the best predictor of cardiometabolic risk. Scientists have described metabolically obese normal-weight individuals, who suffer from the same metabolic abnormalities as obese individuals, and defined this syndrome as normal weight obesity (NWO). Aim of the study: The aim of our study was to determine the occurrence of overweight and obesity in a cohort of young adult women, using standard and complementary methods of obesity assessment, and to identify those who are at risk of obesity. The second aim was to test additional methods of obesity assessment and to show that body mass index used alone is not a sufficient parameter for obesity assessment. Materials and methods: 384 young women, aged 18-32, were enrolled into the study. Standard anthropometric parameters (waist-to-hip ratio (WTH), waist-to-height ratio (WTHR)) and two other methods of body fat percentage measurement (BFPM) were used in the study: electrical bioimpedance analysis (BIA) and the skinfold measurement test with a digital body fat caliper (SFM). Results: In the study group, 5% and 7% of participants had waist-to-hip ratio and waist-to-height ratio values, respectively, associated with visceral obesity. According to BMI, 14% of participants were overweight or obese. Using the additional methods of body fat assessment, 54% and 43% were obese by the BIA and SFM methods, respectively. Among participants with normal BMI or underweight (not overweight, n = 340), there were individuals with BFPM above the upper limit: 49% (n = 164) by BIA and 36% (n = 125) by SFM. Statistical analysis revealed a strong correlation between the BIA and SFM methods. Conclusion: BMI used alone is not a sufficient parameter for obesity assessment. A high percentage of young women with normal BMI values appear to be normal weight obese.
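
A minimal sketch of how the complementary measurements can flag normal weight obesity, assuming illustrative cut-off values (not the study's exact criteria):

```python
# Minimal sketch (assumed cut-off values, not the study's criteria) of how
# standard anthropometric indices and a body-fat measurement can jointly flag
# normal weight obesity (NWO): normal BMI but excess body fat percentage.

def anthropometric_screen(weight_kg, height_m, waist_cm, hip_cm, body_fat_pct,
                          bmi_normal=(18.5, 25.0), fat_limit_pct=30.0,
                          whr_limit=0.85, whtr_limit=0.5):
    bmi = weight_kg / height_m ** 2
    whr = waist_cm / hip_cm                 # waist-to-hip ratio
    whtr = waist_cm / (height_m * 100)      # waist-to-height ratio
    normal_bmi = bmi_normal[0] <= bmi < bmi_normal[1]
    return {
        "BMI": round(bmi, 1),
        "WHR": round(whr, 2),
        "WHtR": round(whtr, 2),
        "visceral_obesity_risk": whr > whr_limit or whtr > whtr_limit,
        "normal_weight_obesity": normal_bmi and body_fat_pct > fat_limit_pct,
    }

print(anthropometric_screen(weight_kg=62, height_m=1.68, waist_cm=74,
                            hip_cm=98, body_fat_pct=33.5))
```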

Keywords: electrical bioimpedance, normal weight obesity, skin-fold measurement test, women

Procedia PDF Downloads 259