Search results for: instrumental variable estimation
1430 Technology in the Calculation of People Health Level: Design of a Computational Tool
Authors: Sara Herrero Jaén, José María Santamaría García, María Lourdes Jiménez Rodríguez, Jorge Luis Gómez González, Adriana Cercas Duque, Alexandra González Aguna
Abstract:
Background: The concept of health has evolved throughout history. The health level is determined by the individual's own perception. It is a dynamic process over time, so variations can be seen from one moment to the next. In this way, knowing the health of the patients you care for facilitates decision making in the treatment of care. Objective: To design a technological tool that calculates people's health level sequentially over time. Material and Methods: Deductive methodology through text analysis, extraction and logical knowledge formalization, and evaluation with an expert group. Study period: September 2015 to the present. Results: A computational tool for use by health personnel has been designed. It has 11 variables. Each variable can be given a value from 1 to 5, with 1 being the minimum value and 5 being the maximum value. By adding the results of the 11 variables, we obtain a magnitude at a certain time: the health level of the person. The health calculator represents a person's health level at a given time, establishing temporal cuts that are useful for determining the evolution of the individual over time. Conclusion: Information and Communication Technologies (ICT) enable training and help in various disciplinary areas. It is important to highlight their relevance in the field of health. Based on the health formalization, care acts can be directed towards some of the propositional elements of the concept above. These care acts will modify the person's health level. The health calculator allows the prioritization and prediction of different health care strategies in hospital units.
Keywords: calculator, care, eHealth, health
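The scoring scheme the abstract describes (11 variables, each rated 1 to 5, summed into a single health level) can be sketched as follows; the function name and the example ratings are illustrative assumptions, not details of the actual tool.

```python
def health_level(ratings):
    # Sum the 11 variable ratings (each 1-5) into one magnitude (range 11-55).
    if len(ratings) != 11:
        raise ValueError("exactly 11 variables expected")
    if any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("each rating must lie between 1 and 5")
    return sum(ratings)

# Temporal cuts: scoring the same person at two moments gives an evolution series.
evolution = [health_level(r) for r in ([3] * 11, [4] * 11)]
```

Repeating the calculation at successive temporal cuts yields the evolution series the abstract mentions.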
Procedia PDF Downloads 265
1429 Uncertainty Assessment in Building Energy Performance
Authors: Fally Titikpina, Abderafi Charki, Antoine Caucheteux, David Bigaud
Abstract:
The building sector is one of the largest energy consumers, accounting for about 40% of final energy consumption in the European Union. Ensuring building energy performance is a scientific, technological and sociological matter. To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared with the measured consumption once the building is operational. When evaluating this performance, many buildings show significant differences between calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved not only in measurement but also those induced by the propagation of dynamic and static input data through the model being used. The evaluation of measurement uncertainty is based both on knowledge about the measurement process and on the input quantities which influence the result of measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics presented in the Guide to the Expression of Uncertainty in Measurement (GUM) as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for estimating the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. An office building has been monitored, and multiple sensors have been mounted at candidate locations to obtain the required data. The monitored zone is composed of six offices and has an overall surface of 102 m². Temperature data, electrical and heating consumption, window opening and occupancy rate are the features for our research work.
Keywords: building energy performance, uncertainty evaluation, GUM, Bayesian approach, Monte Carlo method
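A minimal sketch of the Monte Carlo Simulation (MCS) approach to uncertainty propagation mentioned above: uncertain inputs are sampled and pushed through a simplified consumption model. The model form (steady-state heating, Q = U·A·ΔT·t), the input distributions and the 102 m² area are illustrative assumptions, not the study's actual model.

```python
import random
import statistics

def consumption_kwh(u_value, area_m2, delta_t, hours):
    # Simplified steady-state heating model: Q [kWh] = U * A * dT * t / 1000.
    return u_value * area_m2 * delta_t * hours / 1000.0

random.seed(1)
samples = []
for _ in range(20000):
    u = random.gauss(0.80, 0.05)   # assumed wall U-value [W/(m^2.K)] with uncertainty
    dt = random.gauss(15.0, 1.0)   # assumed indoor-outdoor temperature difference [K]
    samples.append(consumption_kwh(u, 102.0, dt, 1000.0))

mean_q = statistics.mean(samples)    # central estimate of consumption [kWh]
u95 = 2 * statistics.stdev(samples)  # expanded uncertainty (coverage factor k=2)
```

The spread of the sampled outputs directly gives the expanded uncertainty, which is what a GUM- or BST-based analysis would derive analytically.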
Procedia PDF Downloads 460
1428 Revisiting Politics of Religion in Muslim Republics of Former Soviet Union and Rise of Extremism, Global Jihadi Terrorism
Authors: Etibar Guliyev
Abstract:
The breakdown of the Soviet Union in 1991 led to a considerable rise in the religious self-consciousness of the Muslim population of Central Asia. Additionally, huge amounts of money spent by various states further facilitated the spread of religious ideas. According to some sources, Saudi Arabia spent 87 billion dollars to propagate Wahhabism abroad over two decades, whereas the Communist Party of the Soviet Union spent just over 7 billion dollars to spread its ideology worldwide between 1921 and 1991. As a result, an area once remote from international politics has today turned into the third major source of recruitment of fighters for global terrorist organizations. To illustrate the scope of the involvement of Central Asian residents in international terrorist networks, it is enough to mention Colonel Gulmorod Khalimov, the former head of the Tajik special police forces, who served as ISIS war minister between 2016 and 2017. The importance of the topic stems from the fact that the above-mentioned republics, with a territory of 4 million square km and a population of around 80 million people, border Russia, Iran, Afghanistan and China. Moreover, political and military activities motivated by religious feelings in those countries have implications not only for domestic but also for regional and global political relations, and all of them are rooted in the politics of religion, which adds value to the research. This research aims to provide an in-depth analysis of the marked features of state policies to regulate religious activities and approaches this question from the individual, domestic, regional and global levels of analysis. The research will enable us to better understand what implications the state of religious freedom in post-Soviet Muslim republics has for international relations and the rise of global jihadi terrorism.
The paper tries to find a linkage between the mentioned terror attacks and the underground rise of religious extremism in Central Asia. The research is based on multiple research methods, mainly qualitative ones. The process tracing method is also employed to review, in a chronological way, the religious policies implemented from 1918 to 1991 and after the collapse of the Soviet Union. The quantitative method is chiefly used to process various statistics disseminated in academic and official sources. The research mostly draws on constructivist, securitization and social movement theories. The findings of the research suggest that endemic problems peculiar to the authoritarian regimes of Central Asia, such as the crackdown on the expression of religious belief and any kind of opposition, economic decline, instrumental use of religion, corruption and tribalism, further aggravated the recruitment problem. The paper also concludes that the Central Asian states in some cases misused counter-terrorism campaigns as a pretext to further restrict freedom of faith in their respective countries.
Keywords: identity, political Islam, religious extremism, security, terrorism
Procedia PDF Downloads 268
1427 Evidence of Total Mercury Biomagnification in Tropical Estuary Lagoon in East Coast of Peninsula, Malaysia
Authors: Quang Dung Le, Kentaro Tanaka, Viet Dung Luu, Kotaro Shirai
Abstract:
Mercury pollution is of great concern globally due to its toxicity and biomagnification through the food web. Recently, the increasing use of stable isotope analyses in food-web studies has made it possible to elucidate the trophic transfer of pollutants in ecosystems in greater detail. In this study, total mercury (Hg) and stable isotope (δ13C and δ15N) analyses were carried out on basal food sources, invertebrates and fishes in order to determine Hg transfer in Setiu lagoon food webs. Average Hg concentrations showed an increasing trend from low to high trophic levels. The results also indicated that potential Hg exposure from inside the mangrove could be higher than that from the tidal flat of the mangrove creek. Fish Hg concentrations are highly variable, and the many factors driving this variability need further examination. A positive correlation was found between Hg concentrations and δ15N values (the trophic magnification factor was 3.02), suggesting Hg biomagnification through the lagoon food web. Almost all Hg concentrations in fishes and mud crabs did not present a risk for human consumption; however, the Hg concentrations of Caranx ignobilis exceeded the permitted level, which could raise concern about the potential risk for the marine system. Further investigations should be done to elucidate whether trophic relay relates to the high Hg concentrations of some fish species in coastal systems.
Keywords: mercury, transfer, stable isotopes, health risk, mangrove, food web
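A trophic magnification factor (TMF) like the 3.02 reported above is conventionally derived from the slope of log10(Hg) regressed on δ15N, scaled by an assumed per-trophic-level δ15N enrichment (commonly ~3.4‰). The sketch below uses synthetic data with a known slope; the enrichment constant and data are assumptions for illustration, not the study's measurements.

```python
def trophic_magnification_factor(d15n, log10_hg, dn_per_level=3.4):
    # Ordinary least-squares slope of log10(Hg) on delta-15N,
    # then TMF = 10 ** (slope * dn_per_level).
    n = len(d15n)
    mx = sum(d15n) / n
    my = sum(log10_hg) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(d15n, log10_hg))
             / sum((x - mx) ** 2 for x in d15n))
    return 10 ** (slope * dn_per_level)

# Synthetic data with a known slope of 0.1 -> TMF = 10 ** 0.34 (about 2.19)
d15n = [4.0, 6.0, 8.0, 10.0, 12.0]
log_hg = [0.1 * v for v in d15n]
tmf = trophic_magnification_factor(d15n, log_hg)
```

A TMF above 1 indicates biomagnification up the food web, as the abstract concludes for Setiu lagoon.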
Procedia PDF Downloads 309
1426 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study of six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most cases; however, a difference of up to 38% has been noted in flood quantiles at the annual exceedance probability (AEP) of 1 in 100 for one catchment. These findings need to be confirmed with a greater number of stations across other Australian states.
Keywords: floods, FLIKE, probability distributions, flood frequency, outlier
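The core of a Grubbs-Beck-style check for potentially influential low flows is a one-sided outlier statistic on the smallest observation. The sketch below computes only that statistic; the multiple test sweeps this kind of check over the k smallest values, and the critical values (taken from tables, as in FLIKE) are omitted here. The flow series is invented for illustration.

```python
import statistics

def grubbs_low_statistic(flows):
    # One-sided Grubbs statistic for the smallest observation:
    # G = (mean - min) / s.  Large G flags a potentially influential low flow.
    mean = statistics.mean(flows)
    s = statistics.stdev(flows)
    return (mean - min(flows)) / s

flows = [120, 135, 150, 160, 170, 180, 15]  # one suspiciously low annual flow
g = grubbs_low_statistic(flows)
```

In practice G is compared with a tabulated critical value at the chosen significance level before the low flow is censored from the fit.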
Procedia PDF Downloads 452
1425 Disaggregation of Coarser Resolution Radiometer Derived Soil Moisture to Finer Scales
Authors: Gurjeet Singh, Rabindra K. Panda
Abstract:
Soil moisture is a key hydrologic state variable and is intrinsically linked to the Earth's water, climate and carbon cycles. From an ecological point of view, soil moisture is a fundamental natural resource providing the transpirable water for plants. Soil moisture varies both temporally and spatially due to spatio-temporal variation in rainfall, vegetation cover, soil properties and topography. Satellite-derived soil moisture provides spatio-temporally extensive data. However, the spatial resolution of a typical satellite (L-band radiometry) is of the order of tens of kilometers, which is not fine enough for developing efficient agricultural water management schemes at the field scale. In the present study, soil moisture from radiometer data has been disaggregated using a blending approach to achieve higher resolution soil moisture data. The radiometer estimates of soil moisture at a 40 km resolution have been disaggregated to 10 km, 5 km and 1 km resolutions. The disaggregated soil moisture was compared with observed data consisting of continuous sensor-based soil moisture profile measurements at three monitoring sites and extensive spatial near-surface soil moisture measurements, concurrent with satellite monitoring, in the 500 km² study watershed in Eastern India. The estimated soil moisture status at different spatial scales can help in developing efficient agricultural water management schemes to increase crop production and water use efficiency.
Keywords: disaggregation, eastern India, radiometers, soil moisture, water use efficiency
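The idea behind blending-based disaggregation can be sketched as redistributing one coarse-pixel value across finer cells using ancillary weights (e.g. from vegetation, soil or topography layers) while preserving the coarse-pixel mean. The weighting scheme here is an illustrative assumption, not the study's actual blending algorithm.

```python
def disaggregate(coarse_value, fine_weights):
    # Spread one coarse radiometer value across finer cells.  Weights are
    # normalised by their mean so the coarse-pixel average is preserved.
    n = len(fine_weights)
    mean_w = sum(fine_weights) / n
    return [coarse_value * w / mean_w for w in fine_weights]

# One 40 km pixel (volumetric soil moisture 0.25) split into four finer cells
fine = disaggregate(0.25, [0.8, 1.0, 1.2, 1.0])
```

Mass conservation (the fine-cell mean equalling the coarse value) is the property any such scheme must keep so that the disaggregated field remains consistent with the radiometer retrieval.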
Procedia PDF Downloads 276
1424 Numerical Study of Jet Impingement Heat Transfer
Authors: A. M. Tiara, Sudipto Chakraborty, S. K. Pal
Abstract:
Impinging jets and their different configurations are important from the viewpoint of fluid flow characteristics and their influence on heat transfer from metal surfaces, owing to their complex flow behaviour. Such flow characteristics result in highly variable heat transfer from the surface, producing varying cooling rates which affect the mechanical properties, including hardness and strength. The overall objective of the current research is to conduct a fundamental investigation of the heat transfer mechanisms for an impinging coolant jet. Numerical simulation of the cooling process gives a detailed analysis of the different parameters involved, even though employing Computational Fluid Dynamics (CFD) to simulate the real-time process, being a relatively new research area, poses many challenges. The heat transfer mechanism in the current research is actuated by jet cooling. The computational tool used in the ongoing research for simulation of the cooling process is the ANSYS Workbench software. The temperature and heat flux distribution along the steel strip, together with the effect of various flow parameters on the heat transfer rate, can be observed in addition to determination of the jet impingement patterns, which is the major aim of the present analysis. This work includes modelling both jet and air-atomized cooling techniques using CFD methodology and validating the results against those obtained experimentally, including trial and error with different models and comparison of cooling rates from both techniques. Finally, some concluding remarks identify gaps in the available literature that have influenced the path of the current investigation.
Keywords: CFD, heat transfer, impinging jets, numerical simulation
Procedia PDF Downloads 236
1423 Nucleotide Diversity and Bacterial Endosymbionts of the Black Cherry Aphid Myzus cerasi (Fabricus, 1775) (Hemiptera: Aphididae) from Turkey
Authors: Burcu Inal, Irfan Kandemir
Abstract:
Sequences of the mitochondrial cytochrome oxidase I (COI) gene were obtained from twenty-five Turkish and one Greek population of Myzus cerasi (Fabricus) (Hemiptera: Aphididae) collected from Prunus avium and Prunus cerasus. The partial coding region of COI studied is 605 bp for all populations, of which 565 nucleotides were conserved, 40 were variable, 37 were singleton, and 3 sites were parsimony-informative. Four haplotypes were identified based on nucleotide substitutions, and the mean intraspecific divergence was calculated to be 0.3%. Phylogenetic trees were constructed using Maximum Likelihood, Minimum Evolution, Neighbor-Joining, and the Unweighted Pair Group Method with Arithmetic Averages (UPGMA), with Myzus persicae (Sulzer) and Myzus borealis Ossiannilson included as outgroups. The population of M. cerasi from Isparta diverged from the rest of the groups and formed a clade (Haplotype B) with Myzus borealis. The rest of the haplotype diversity comprises Haplotypes A and C, with individuals characterized as Myzus cerasi pruniavium, and Haplotype D, with Myzus cerasi cerasi. M. cerasi diverges into two subspecies, and it must be reevaluated whether this pest is monophagous or oligophagous in terms of host-plant dependence. The obligate endosymbiont Buchnera aphidicola was also found during this research, but no facultative symbionts could be detected. Further studies will be required for complete barcoding and a full account of the diversity of the bacterial endosymbionts present.
Keywords: bacterial endosymbionts, barcoding, black cherry aphid, nucleotide diversity
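The site counts reported above (conserved, singleton, parsimony-informative; variable = singleton + parsimony-informative) follow from a simple per-column classification of the alignment. The sketch below uses a toy four-sequence alignment, not the study's 605 bp COI sequences.

```python
from collections import Counter

def classify_sites(seqs):
    # Classify each alignment column: conserved (one state), parsimony-informative
    # (at least two states each present in >= 2 sequences), otherwise singleton.
    conserved = singleton = informative = 0
    for col in zip(*seqs):
        counts = Counter(col)
        if len(counts) == 1:
            conserved += 1
        elif sum(1 for c in counts.values() if c >= 2) >= 2:
            informative += 1
        else:
            singleton += 1
    return conserved, singleton, informative

# Toy alignment: 2 conserved columns, 1 singleton, 1 parsimony-informative
sites = classify_sites(["ACGT", "ACGT", "ACTA", "ACTT"])
```

Applied to the study's alignment, the same logic would reproduce the 565/37/3 breakdown quoted in the abstract.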
Procedia PDF Downloads 173
1422 A-Score, Distress Prediction Model with Earning Response during the Financial Crisis: Evidence from Emerging Market
Authors: Sumaira Ashraf, Elisabete G.S. Félix, Zélia Serrasqueiro
Abstract:
Traditional financial distress prediction models have performed well in predicting bankrupt and insolvent firms in developed markets. Previous studies particularly focused on the predictability of financial distress, financial failure, and bankruptcy of firms. This paper contributes to the literature by extending the definition of financial distress to include early warning signs related to quotation at face value, dividend/bonus declaration, annual general meetings, and listing fees. The study used five well-known distress prediction models to see whether they have the ability to predict these early warning signs of financial distress. Results showed that the predictive ability of the models varies over time and decreases specifically for the sample with early warning signs of financial distress. Furthermore, the study checked for differences in the predictive ability of the models with respect to the financial crisis. The results conclude that the predictive ability of the traditional financial distress prediction models decreases for firms with early warning signs of financial distress and during times of financial crisis. The study developed a new model comprising the significant variables from the five models and one new variable, earning response. This new model outperforms the old distress prediction models before, during and after the financial crisis. Thus, it can be used by researchers, organizations and all other concerned parties to indicate early warning signs in emerging markets.
Keywords: financial distress, emerging market, prediction models, Z-Score, logit analysis, probit model
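The keywords above mention logit analysis; a distress model of that family scores a firm as P(distress) = 1 / (1 + exp(-(b0 + Σ bi·xi))). The sketch below shows only the scoring step; the coefficients and the two features (an earning-response proxy and a leverage ratio) are illustrative assumptions, not the fitted A-Score model.

```python
import math

def distress_probability(features, coefs, intercept):
    # Logit score: P(distress) = 1 / (1 + exp(-(b0 + sum(bi * xi)))).
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# A negative earning response and high leverage push the probability up
# (hypothetical coefficients, for illustration only).
p = distress_probability([-0.4, 0.7], [-1.5, 2.0], -0.2)
```

A firm is then flagged when the probability crosses a chosen cut-off (often 0.5).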
Procedia PDF Downloads 244
1421 The Effects of Time and Cyclic Loading to the Axial Capacity for Offshore Pile in Shallow Gas
Authors: Christian H. Girsang, M. Razi B. Mansoor, Noorizal N. Huang
Abstract:
An offshore platform was installed in 1977 about 260 km offshore of West Malaysia in a water depth of 73.6 m. Twelve (12) piles were installed, of which four (4) are skirt piles. The piles have a 1.219 m outside diameter and a wall thickness of 31 mm and were driven to 109 m below the seabed. Deterministic analyses of the pile capacity under axial loading were conducted using the current API (American Petroleum Institute) method and four (4) CPT-based methods: the ICP (Imperial College Pile) method, the NGI (Norwegian Geotechnical Institute) method, the UWA (University of Western Australia) method and the Fugro method. A statistical analysis of the model uncertainty associated with each pile capacity method was performed. Two cases were analysed: Pile 1, the pile most affected by shallow gas problems, and the piles other than Pile 1. Using the mean estimate of soil properties, the five (5) methods used for deterministic estimation of axial pile capacity in compression predict an axial capacity from 28 to 42 MN for Pile 1 and 32 to 49 MN for the piles other than Pile 1. These values refer to the static capacity shortly after pile installation. They do not include the effects of cyclic loading during the design storm or of time after installation on the axial pile capacity. On average, the axial pile capacity is expected to have increased by about 40% because of ageing since the installation of the platform in 1977. On the other hand, the cyclic loading effects during the design storm may reduce the axial capacity of the piles by around 25%. The study concluded that all piles have a sufficient safety factor when pile ageing and cyclic loading effects are considered, as all safety factors are above 2.0 for maximum operating and storm loads.
Keywords: axial capacity, cyclic loading, pile ageing, shallow gas
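The two adjustments quoted above (roughly +40% for ageing, roughly -25% for storm cyclic loading) combine multiplicatively on the static capacity before the safety factor is checked. The sketch below applies them to the abstract's lower-bound 28 MN estimate for Pile 1; the storm load value is an illustrative assumption.

```python
def adjusted_capacity(static_capacity_mn, ageing_gain=0.40, cyclic_loss=0.25):
    # Apply the ~+40% ageing gain and ~-25% cyclic-loading reduction in series.
    return static_capacity_mn * (1.0 + ageing_gain) * (1.0 - cyclic_loss)

def safety_factor(capacity_mn, load_mn):
    return capacity_mn / load_mn

cap = adjusted_capacity(28.0)       # lower-bound static estimate for Pile 1 [MN]
sf = safety_factor(cap, 12.0)       # hypothetical design storm load of 12 MN
```

Note that the net effect of the two adjustments is a modest increase (1.40 × 0.75 = 1.05), consistent with the study's conclusion that the safety factors remain adequate.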
Procedia PDF Downloads 345
1420 Statistical Design of Synthetic VP X-bar Control Chart Using Markov Chain Approach
Authors: Ali Akbar Heydari
Abstract:
Control charts are an important tool of statistical quality control. These charts are used to detect and eliminate unwanted special causes of variation that occur during a period of time. The design and operation of control charts require the determination of three design parameters: the sample size (n), the sampling interval (h), and the width coefficient of the control limits (k). The variable parameters (VP) x-bar control chart is an x-bar chart in which all the design parameters vary between two values. These values are a function of the most recent process information. In fact, in the VP x-bar chart, the position of each sample point on the chart establishes the size of the next sample and the time of its sampling. The synthetic x-bar control chart, which integrates the x-bar chart and the conforming run length (CRL) chart, provides a significant improvement in detection power over the basic x-bar chart for all levels of mean shift. In this paper, we introduce the synthetic VP x-bar control chart for monitoring changes in the process mean. To determine the design parameters, we used a statistical design based on the minimum out-of-control average run length (ARL) criterion. The optimal chart parameters of the proposed chart are obtained using the Markov chain approach. A numerical example is also given to show the performance of the proposed chart and compare it with other control charts. The results show that our proposed synthetic VP x-bar control chart performs better than the synthetic x-bar control chart for all shift parameter values. Also, the synthetic VP x-bar control chart performs better than the VP x-bar control chart for moderate or large shift parameter values.
Keywords: control chart, Markov chain approach, statistical design, synthetic, variable parameter
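The ARL criterion mentioned above is easiest to see for the basic fixed-parameter x-bar chart: if p is the probability that a sample mean falls outside the k-sigma limits, then ARL = 1/p. This is the building block that the Markov chain analysis of the synthetic VP chart generalises; the sketch below covers only this basic case, not the synthetic VP chart itself.

```python
import math
from statistics import NormalDist

def arl_xbar(k, shift=0.0, n=1):
    # ARL of a basic x-bar chart with k-sigma limits.  `shift` is the mean
    # shift in process-sigma units; the standard error shrinks with sqrt(n),
    # so the shift measured in standard errors is shift * sqrt(n).
    nd = NormalDist()
    delta = shift * math.sqrt(n)
    p = nd.cdf(-k - delta) + (1.0 - nd.cdf(k - delta))
    return 1.0 / p

arl0 = arl_xbar(3.0)                  # in-control ARL, ~370 for 3-sigma limits
arl1 = arl_xbar(3.0, shift=1.0, n=5)  # out-of-control ARL for a 1-sigma shift
```

A good design keeps the in-control ARL large while making the out-of-control ARL as small as possible, which is exactly the minimisation the paper performs for the synthetic VP chart.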
Procedia PDF Downloads 155
1419 Harnessing Nature's Fury: Hyptis Suaveolens Loaded Bioactive Liposome for Photothermal Therapy of Lung Cancer
Authors: Sajmina Khatun, Monika Pebam, Aravind Kumar Rengan
Abstract:
Photothermal therapy, a subset of nanomedicine, takes advantage of light-absorbing agents to generate localized heat, selectively eradicating cancer cells. This innovative approach minimizes damage to healthy tissues and offers a promising avenue for targeted cancer treatment. Unlike conventional therapies, photothermal therapy harnesses the power of light to combat malignancies precisely and effectively, showcasing its potential to revolutionize cancer treatment paradigms. The combined strengths of nanomedicine and photothermal therapy signify a transformative shift toward more effective, targeted, and tolerable cancer treatments. Natural products are instrumental in formulating diverse bioactive medications owing to the pharmacological properties conferred by phenolic structures, triterpenoids, and similar compounds. Hyptis suaveolens, commonly known as pignut, is an aromatic herb of the Lamiaceae family and a valuable therapeutic plant. Flourishing in swamps and alongside tropical and subtropical roadsides, this noxious weed impedes the development of adjacent plants and ranks among the most globally distributed alien invasive species. The present investigation revealed that a versatile, biodegradable liposome nanosystem (HIL NPs), incorporating bioactive molecules from Hyptis suaveolens, exhibits effective bioavailability to cancer cells, enabling tumor ablation upon near-infrared (NIR) laser exposure. The components within the nanosystem, specifically the bioactive molecules from Hyptis, function as anticancer agents, aiding in the photothermal ablation of highly metastatic lung cancer cells. Despite being a prolific weed impeding neighboring plant growth, Hyptis suaveolens thus offers therapeutic benefits through its bioactive compounds.
The obtained HIL NPs, characterized as a photothermally active liposome nanosystem, demonstrate a pronounced absorption peak in the NIR range and achieve a high photothermal conversion efficiency under NIR laser irradiation. Transmission electron microscopy (TEM) and particle size analysis reveal that the HIL NPs possess a spherical shape with a size of 141 ± 30 nm. Moreover, in vitro assessments of HIL NPs against the lung cancer cell line A549 indicate effective anticancer activity through a combined cytotoxic effect and hyperthermia. Tumor ablation is facilitated by apoptosis induced by the overexpression of γ-H2AX, arresting cancer cell proliferation. Consequently, the multifunctional and biodegradable nanosystem (HIL NPs), incorporating bioactive compounds from Hyptis, offers a valuable perspective for developing an innovative therapeutic strategy originating from a challenging weed. This approach holds promise for potential applications in both bioimaging and combined phyto-photothermal therapy for cancer treatment.
Keywords: bioactive liposome, Hyptis suaveolens, photothermal therapy, lung cancer
Procedia PDF Downloads 95
1418 Determining Variables in Mathematics Performance According to Gender in Mexican Elementary School
Authors: Nora Gavira Duron, Cinthya Moreda Gonzalez-Ortega, Reyna Susana Garcia Ruiz
Abstract:
The objective of this paper is to analyze mathematics performance in the Learning Evaluation National Plan (PLANEA, for its Spanish initials: Plan Nacional para la Evaluación de los Aprendizajes), applied to Mexican students enrolled in the last year of elementary school during the 2017-2018 academic year. The test was conducted nationwide in 3,573 schools, using a sample of 108,083 students, whose average in mathematics, on a scale of 0 to 100, was 45.6 points. 75% of the sample analyzed did not reach the sufficiency level (60 points). It should be noted that only 2% scored 90 or higher. Performance is analyzed considering whether there are differences by gender, marginalization level, public or private school enrollment, parents' academic background, and whether students live with their parents. Likewise, the impact of these variables, among others, on school performance by gender is evaluated using multivariate logistic (logit) regression analysis. The results show there are no significant differences in mathematics performance by gender in elementary school; nevertheless, the impact exerted by mothers who studied at least through high school is of great relevance for students, particularly for girls. Other determining variables are students' resilience, their parents' economic status, and attendance at private schools, reinforced by the mother's education.
Keywords: multivariate regression analysis, academic performance, learning evaluation, mathematics result per gender
Procedia PDF Downloads 149
1417 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancement over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, Indiana University dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
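A minimal sketch of the text-preprocessing step described above: tokenize a radiology report and turn it into a count vector over finding terms, ready to be concatenated with image-derived features before classifier training. The term list and example report are illustrative assumptions, not the study's actual feature scheme or data.

```python
import re
from collections import Counter

# Hypothetical finding vocabulary; the study's feature set is not specified here.
FINDING_TERMS = ["effusion", "cardiomegaly", "opacity", "pneumothorax", "normal"]

def report_features(text):
    # Lowercase, tokenize on letters only, and count occurrences of each term.
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter(tokens)
    return [counts[t] for t in FINDING_TERMS]

feats = report_features("Mild cardiomegaly. No pleural effusion or pneumothorax.")
```

Note this toy scheme ignores negation ("No pleural effusion" still counts the term), which is exactly the kind of subtlety that motivates using LLM-based NLP instead of raw keyword counts.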
Procedia PDF Downloads 47
1416 Isolation and Molecular Identification of Polyethylene Degrading Bacteria From Soil and Degradation Detection by FTIR Analysis
Authors: Morteza Haghi, Cigdem Yilmazbas, Ayse Zeynep Uysal, Melisa Tepedelen, Gozde Turkoz Bakirci
Abstract:
Today, the accumulation of plastic waste is an inescapable consequence of environmental pollution, and the disposal of these wastes has caused a significant problem. Various methods have been utilized; however, biodegradation is the most environmentally friendly and low-cost method. Accordingly, the present study aimed to isolate bacteria capable of biodegrading plastics. To do so, we used a liquid carbon-free basal medium (LCFBM) prepared with deionized water to isolate bacterial species from soil samples taken from the Izmir Menemen region. Isolates forming biofilms on plastic were selected, named (PLB3, PLF1, PLB1B) and subjected to a degradation test. FTIR analysis, 16S rDNA amplification, sequencing, and identification of the isolates were performed. At the end of the process, a mass loss of 16.6% was observed for isolate PLB3 and 25% for isolate PLF1, while no mass loss was detected for isolate PLB1B. Only PLF1 and PLB1B created transparent zones on the plastic surface. Considering the FTIR results, PLB3 changed the plastic structure by 13.6% and PLF1 by 17%, while PLB1B did not change it. According to the 16S rDNA sequence analysis, isolates PLF1, PLB1B, and PLB3 were identified as Streptomyces albogriseolus, Enterobacter cloacae, and Klebsiella pneumoniae, respectively.
Keywords: polyethylene, biodegradation, bacteria, 16S rDNA, FTIR
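The degradation percentages quoted above come from a simple gravimetric comparison of film mass before and after incubation. The sketch below shows that calculation; the absolute masses are illustrative assumptions, since the abstract reports only percentages.

```python
def mass_loss_percent(initial_mg, final_mg):
    # Percentage mass loss of a polyethylene film over the incubation period.
    return 100.0 * (initial_mg - final_mg) / initial_mg

# The abstract's 25% loss for PLF1 corresponds, e.g., to 100 mg -> 75 mg
plf1_loss = mass_loss_percent(100.0, 75.0)
```

FTIR then complements this gravimetric result by quantifying structural changes (e.g. new carbonyl bands) in the remaining polymer.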
Procedia PDF Downloads 204
1415 Estimation of Twist Loss in the Weft Yarn during Air-Jet Weft Insertion
Authors: Muhammad Umair, Yasir Nawab, Khubab Shaker, Muhammad Maqsood, Adeel Zulfiqar, Danish Mahmood Baitab
Abstract:
Fabric is a flexible woven material consisting of a network of natural or artificial fibers, often referred to as thread or yarn. Today, fabrics are produced by weaving, braiding, knitting, tufting and non-woven processes. Weaving is a method of fabric production in which warp and weft yarns are interlaced perpendicular to each other. There is an infinite number of ways to interlace warp and weft yarns, each producing a different fabric structure. The yarns parallel to the machine direction are called warp yarns, and the yarns perpendicular to the machine direction are called weft or filling yarns. Air-jet weaving is a modern method of weft insertion, and the air-jet loom is considered a high-speed loom. The twist loss during air-jet weft insertion affects the yarn strength. The aim of this study was to investigate the change of twist in the weft yarn during air-jet weft insertion. A total of 8 samples were produced using 1/1 plain and 3/1 twill weave designs at two fabric widths with the same loom settings. Two different types of yarn, cotton and a PC blend, were used. The effects of material type, weave design and fabric width on the twist change of the weft yarn were measured and discussed. The twist of the weft yarn after insertion was compared with that of the yarn before insertion, and the twist loss was measured. Wider fabric leads to higher twist loss in the yarn.
Keywords: air jet loom, twist per inch, twist loss, weft yarn
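The comparison described above (twist per inch before insertion versus after) reduces to a relative twist loss. A minimal sketch, with illustrative TPI values rather than the study's measurements:

```python
def twist_loss_percent(tpi_before, tpi_after):
    # Relative loss of twist (in twists per inch) caused by weft insertion.
    return 100.0 * (tpi_before - tpi_after) / tpi_before

# Hypothetical example: yarn at 18.0 TPI on the package, 16.2 TPI in the fabric
loss = twist_loss_percent(18.0, 16.2)
```

Computed per sample, this figure lets the effects of material type, weave design and fabric width be compared directly.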
Procedia PDF Downloads 404
1414 Reconstructability Analysis for Landslide Prediction
Authors: David Percy
Abstract:
Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data; it works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications like landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, continuous data, such as porosity, must be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding type schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields accuracy that is similar but with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of variable states can be examined.
Keywords: reconstructability analysis, machine learning, landslides, raster analysis
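As a much-simplified illustration of the two steps described above, binning continuous layers and reporting the landslide probability per combination of IV states, the sketch below uses invented layer codes; it is not the RA implementation used in the study:

```python
from collections import Counter

def bin_continuous(values, edges):
    """Discretize a continuous layer (e.g. porosity) into bin indices."""
    return [sum(v >= e for e in edges) for v in values]

def state_probabilities(records, dv_index):
    """P(DV = 1) for every observed combination of IV states."""
    totals, hits = Counter(), Counter()
    for rec in records:
        state = tuple(v for i, v in enumerate(rec) if i != dv_index)
        totals[state] += 1
        hits[state] += rec[dv_index]
    return {s: hits[s] / totals[s] for s in totals}

# Toy records: (soil_class, binned_porosity, landslide 0/1)
data = [(0, 1, 1), (0, 1, 1), (0, 0, 0), (1, 1, 0)]
probs = state_probabilities(data, dv_index=2)
```

The resulting dictionary is the "report on every combination of variables" in miniature: each observed IV state tuple maps to its empirical landslide probability.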
Procedia PDF Downloads 69
1413 Factors Affecting eHealth Literacy among Nursing Students in Jordan
Authors: Laila Habiballah, Ahmad Tubaishat
Abstract:
Background: With the development of information and communication technology, the use of the internet as a source of health information is increasing. Nursing students, as future health care providers, should have the skills of locating, evaluating, and using online health information. This will enable them to help their patients and families make informed decisions. Aim: This study has a two-fold aim. The first is to assess eHealth literacy among nursing students in Jordan. The second is to explore the factors that affect eHealth literacy. Methods: This is a descriptive cross-sectional survey conducted in two universities in Jordan, one public and one private. A total of 541 students from both universities completed the eHEALS scale, an instrument designed to measure eHealth literacy. Some additional personal and demographic variables were collected to explore their effect on eHealth literacy. Results: Students have a high perceived level of eHealth literacy (M=3.62, SD=0.58). They are aware of the available online health resources and know how to search, locate, and use these resources. However, they do not have the skills to evaluate these resources and cannot differentiate between high- and low-quality resources. The results also showed that the type of university, type of students' admission, academic level, students' skills in using the internet, and the perceived usefulness and importance of the internet have an effect on eHealth literacy, while age, gender, GPA, and frequency of internet use were not significant factors. Conclusion: This study represents a baseline reference for eHealth literacy in Jordan. Students have some eHealth literacy skills, while other skills need to be improved. Nursing educators and administrators should integrate the skills of eHealth literacy into the curriculum.
Keywords: eHealth, literacy, nursing, students, Jordan
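A sketch of how eHEALS responses are typically scored to produce the mean and standard deviation reported above. The published eHEALS has 8 Likert items scored 1 to 5; the student responses below are invented:

```python
from statistics import mean, stdev

def eheals_scores(responses):
    """Mean of each respondent's Likert items (1 = lowest ... 5 = highest)."""
    return [mean(items) for items in responses]

# Invented answers for three students; each row holds the 8 eHEALS items.
sample = [
    [4, 4, 3, 4, 4, 3, 4, 4],
    [3, 3, 4, 3, 3, 4, 3, 3],
    [5, 4, 4, 5, 4, 4, 5, 4],
]
scores = eheals_scores(sample)
overall_mean, overall_sd = mean(scores), stdev(scores)
```

`overall_mean` and `overall_sd` correspond to the M and SD figures quoted in the Results section (here computed on toy data, of course).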
Procedia PDF Downloads 401
1412 The Efficiency of AFLP and ISSR Markers in Genetic Diversity Estimation and Gene Pool Classification of Iranian Landrace Bread Wheat (Triticum Aestivum L.) Germplasm
Authors: Reza Talebi
Abstract:
Wheat (Triticum aestivum) is one of the most important food staples in Iran. Understanding genetic variability among landrace wheat germplasm is important for breeding. Landraces endemic to Iran are a genetic resource distinct from other wheat germplasm. In this study, 60 Iranian landrace wheat accessions were characterized using AFLP and ISSR markers. Twelve AFLP primer pairs detected 128 polymorphic bands among the sixty genotypes. The mean polymorphism rate based on AFLP data was 31%; however, a wide polymorphism range among primer pairs was observed (22–40%). The polymorphic information content (PIC value), calculated to assess the informativeness of each marker, ranged from 0.28 to 0.4, with a mean of 0.37. According to the AFLP molecular data, cluster analysis grouped the genotypes into five distinct clusters. ISSR markers generated 68 bands (an average of 6 bands per primer), of which 31 were polymorphic (45%) across the 60 wheat genotypes. The PIC value for ISSR markers ranged from 0.14 to 0.48, with an average of 0.33. Based on the data obtained by ISSR-PCR, cluster analysis grouped the genotypes into three distinct clusters. Both AFLP and ISSR markers showed that a relatively constant, high level of genetic diversity has been maintained in Iranian landrace wheat accessions over recent years.
Keywords: wheat, genetic diversity, AFLP, ISSR
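PIC for dominant presence/absence markers such as AFLP and ISSR is commonly computed as one minus the sum of squared allele (band) frequencies; for a biallelic marker this caps PIC at 0.5, consistent with the 0.48 maximum reported for ISSR above. A minimal sketch (this formula choice is a common convention, not necessarily the exact one used in the study):

```python
def pic_biallelic(p):
    """PIC of a presence/absence marker with band frequency p: 2p(1 - p)."""
    q = 1.0 - p
    return 1.0 - p * p - q * q

def pic_multiallelic(freqs):
    """Simplified PIC for multiple alleles: 1 - sum of squared frequencies."""
    return 1.0 - sum(f * f for f in freqs)
```

`pic_biallelic(0.5)` gives the theoretical maximum of 0.5 for a dominant marker, while a monomorphic marker (`p` of 0 or 1) scores 0.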
Procedia PDF Downloads 452
1411 Effect of Quenching Medium on the Hardness of Dual Phase Steel Heat Treated at a High Temperature
Authors: Tebogo Mabotsa, Tamba Jamiru, David Ibrahim
Abstract:
Dual phase (DP) steel consists essentially of fine-grained equiaxed ferrite and a dispersion of martensite. Martensite is the primary precipitate in DP steels; it provides the main resistance to dislocation motion within the material. The objective of this paper is to present a relation between the intercritical annealing holding time and the hardness of a dual phase steel. The initial heat treatment involved heating the specimens to 1000 °C and holding the samples at that temperature for 30 minutes. After the initial heat treatment, the samples were heated to 770 °C and held at that constant temperature for varying amounts of time: 30, 60, and 90 minutes, respectively. After heating and holding the samples in the austenite-ferrite phase field, the samples were quenched in water, brine, and oil for each holding time. The experimental results proved that an equation for predicting the hardness of a dual phase steel as a function of the intercritical holding time is possible. The relation between intercritical annealing holding time and the hardness of a dual phase steel heat treated at high temperatures is parabolic in nature. Theoretically, the model is dependent on the cooling rate because the model differs for each quenching medium; therefore, a universal hardness equation can be derived in which the cooling rate is a variable factor.
Keywords: quenching medium, annealing temperature, dual phase steel, martensite
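Since three holding times (30, 60, 90 min) determine a parabola exactly, the hardness-time relation described above can be recovered per quenching medium by a quadratic fit. A sketch using Lagrange interpolation, with invented hardness values:

```python
def fit_parabola(ts, hs):
    """Exact quadratic H(t) = a*t^2 + b*t + c through three (time, hardness) points."""
    (t1, t2, t3), (h1, h2, h3) = ts, hs
    d1 = (t1 - t2) * (t1 - t3)
    d2 = (t2 - t1) * (t2 - t3)
    d3 = (t3 - t1) * (t3 - t2)
    a = h1 / d1 + h2 / d2 + h3 / d3
    b = -h1 * (t2 + t3) / d1 - h2 * (t1 + t3) / d2 - h3 * (t1 + t2) / d3
    c = h1 * t2 * t3 / d1 + h2 * t1 * t3 / d2 + h3 * t1 * t2 / d3
    return a, b, c

# Hypothetical hardness readings after water quenching at each holding time
a, b, c = fit_parabola((30, 60, 90), (310, 340, 330))
hardness = lambda t: a * t * t + b * t + c
```

Repeating the fit for the brine and oil data would give one `(a, b, c)` triple per quenching medium, which is exactly why the paper notes the model depends on the cooling rate.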
Procedia PDF Downloads 82
1410 Variability of Climatic Elements in Nigeria Over Recent 100 Years
Authors: T. Salami, O. S. Idowu, N. J. Bello
Abstract:
Climatic variability is an essential issue when dealing with climate change. The variability of climate parameters helps to determine how the climatic conditions of a region behave. The most important of these climatic variables, which help to determine the climatic condition of an area, are temperature and precipitation. This research deals with long-term climatic variability in Nigeria. Variables examined in this analysis include near-surface temperature, near-surface minimum temperature, maximum temperature, relative humidity, vapour pressure, precipitation, wet-day frequency, and cloud cover, using data ranging between 1901 and 2010. Analyses were carried out using regression and EOF analysis. Results show that the annual average, minimum, and maximum near-surface temperatures all gradually increased from 1901 to 2010, in both the wet season and the dry season. The linear trends of minimum near-surface temperature are significant for the annual, wet-season, and dry-season means. However, the decrease in the diurnal temperature range over the recent 100 years implies that the minimum near-surface temperature has increased more than the maximum. Both precipitation and wet-day frequency declined over the period of analysis, demonstrating that Nigeria has become drier than before in terms of rainfall. Temperature and precipitation variability became very high during these periods, especially in the northern areas. Areas that had excessive rainfall were confronted with flooding and other related issues, while areas that had less precipitation were confronted with drought. More practical issues will be presented.
Keywords: climate, variability, flooding, excessive rainfall
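The regression part of such an analysis reduces, for each climatic variable, to an ordinary least-squares trend over the years. A minimal sketch with illustrative values (not the study's data):

```python
def linear_trend(xs, ys):
    """Ordinary least-squares slope and intercept (trend per unit x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Illustrative annual mean temperatures (degC); invented, not the study's data
slope, intercept = linear_trend([1901, 1950, 2010], [26.1, 26.4, 26.9])
```

A positive `slope` corresponds to the gradual temperature increase reported; applying the same fit to precipitation series would yield the negative trends described.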
Procedia PDF Downloads 385
1409 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator
Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard
Abstract:
Blade Tip Timing (BTT) is a technology concerned with the estimation of both frequency and amplitude of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms since they generate blade tip displacement data from the simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the necessary data for the assessment of different BTT algorithms. The FE modelling is validated using both a hammer test and two firewire cameras for the mode shapes. A number of autoregressive methods, fitting methods and state-of-the-art inverse methods (i.e. Russhard) are compared. All methods are compared with respect to both synchronous and asynchronous excitations with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amount of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
Keywords: blade tip timing, blisk, finite element, vibration measurement
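A minimal version of the simple spring-mass-damper simulator that the abstract says most of the literature relies on (the FE-based simulator of this work is not reproduced here; rotation speed, natural frequency, and damping below are invented). The key BTT characteristic is that the tip displacement is sampled only at the probe arrival times, once per revolution per probe:

```python
import math

def tip_displacement(t, amp, f_n, zeta):
    """Free response of an underdamped spring-mass-damper blade model."""
    w_n = 2.0 * math.pi * f_n
    w_d = w_n * math.sqrt(1.0 - zeta ** 2)
    return amp * math.exp(-zeta * w_n * t) * math.sin(w_d * t)

def btt_samples(rev_freq, n_revs, probe_frac, amp, f_n, zeta):
    """Tip displacement at probe arrival times: one sample per revolution."""
    return [tip_displacement((k + probe_frac) / rev_freq, amp, f_n, zeta)
            for k in range(n_revs)]

# Invented parameters: 100 Hz rotation, probe at 25% of the casing circumference
samples = btt_samples(100.0, 80, 0.25, amp=1.0, f_n=162.0, zeta=0.01)
```

Because the blade is sampled only once per revolution, the vibration is heavily undersampled; recovering frequency and amplitude from such records is precisely what the compared analysis algorithms must do.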
Procedia PDF Downloads 312
1408 Estimation of the Seismic Response Modification Coefficient in the Superframe Structural System
Authors: Ali Reza Ghanbarnezhad Ghazvini, Seyyed Hamid Reza Mosayyebi
Abstract:
In recent years, an earthquake has occurred approximately every five years in certain regions of Iran. To mitigate the impact of these seismic events, it is crucial to identify and thoroughly assess the vulnerability of buildings and infrastructure, ensuring their safety through principled reinforcement. By adopting new methods of risk assessment, we can effectively reduce the potential risks associated with future earthquakes. In our research, the coefficient of behavior obtained in the fourth chapter is 1.65 for the initial structure and 1.72 for the Superframe structure. This indicates that the Superframe structure can enhance the strength of the main structural members by approximately 10% through the utilization of super beams. Furthermore, based on the comparative analysis between the two structures conducted in this study, we successfully designed a stronger structure with minimal change in the coefficient of behavior. Additionally, this design allowed for greater energy dissipation during seismic events, further enhancing the structure's resilience to earthquakes. By comprehensively examining and reinforcing the vulnerability of buildings and infrastructure, along with implementing advanced risk assessment techniques, we can significantly reduce the casualties and damage caused by earthquakes in Iran. The findings of this study offer valuable insights for civil engineering professionals in the field of structural engineering, aiding them in designing safer and more resilient structures.
Keywords: modal pushover analysis, response modification factor, high-strength concrete, concrete shear walls, high-rise building
Procedia PDF Downloads 147
1407 Modelling Conceptual Quantities Using Support Vector Machines
Authors: Ka C. Lam, Oluwafunmibi S. Idowu
Abstract:
Uncertainty in cost is a major factor affecting performance of construction projects. To our knowledge, several conceptual cost models have been developed with varying degrees of accuracy. Incorporating conceptual quantities into conceptual cost models could improve the accuracy of early predesign cost estimates. Hence, the development of quantity models for estimating conceptual quantities of framed reinforced concrete structures using supervised machine learning is the aim of the current research. Using measured quantities of structural elements and design variables such as live loads and soil bearing pressures, response and predictor variables were defined and used for constructing conceptual quantities models. Twenty-four models were developed for comparison using a combination of non-parametric support vector regression, linear regression, and bootstrap resampling techniques. R programming language was used for data analysis and model implementation. Gross soil bearing pressure and gross floor loading were discovered to have a major influence on the quantities of concrete and reinforcement used for foundations. Building footprint and gross floor loading had a similar influence on beams and slabs. Future research could explore the modelling of other conceptual quantities for walls, finishes, and services using machine learning techniques. Estimation of conceptual quantities would assist construction planners in early resource planning and enable detailed performance evaluation of early cost predictions.
Keywords: bootstrapping, conceptual quantities, modelling, reinforced concrete, support vector regression
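Of the techniques combined above, the bootstrap-resampling step is the easiest to sketch in isolation: resample the data with replacement to obtain an uncertainty interval for an estimate. The study used R with SVR; the stdlib-Python sketch below shows only the bootstrap idea on invented concrete quantities:

```python
import random

def bootstrap_means(sample, n_boot=1000, seed=42):
    """Bootstrap distribution of the sample mean (resampling with replacement)."""
    rng = random.Random(seed)
    n = len(sample)
    return [sum(rng.choices(sample, k=n)) / n for _ in range(n_boot)]

# Hypothetical concrete quantities (m^3) for a set of similar foundations
quantities = [42.0, 47.5, 39.8, 51.2, 44.1, 46.3]
boots = sorted(bootstrap_means(quantities))
ci_low, ci_high = boots[25], boots[-26]  # rough 95% percentile interval
```

The same resampling scheme wrapped around an SVR fit, rather than a plain mean, gives uncertainty bands for predicted conceptual quantities.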
Procedia PDF Downloads 207
1406 Determination of Nutritional Value and Steroidal Saponin of Fenugreek Genotypes
Authors: Anita Singh, Richa Naula, Manoj Raghav
Abstract:
Nutrient-rich and high-yielding varieties of fenugreek can be developed by using genotypes which are naturally high in nutrients. Gene banks harbour a scanty germplasm collection of Trigonella spp. and very little background information about its genetic diversity. The extent of genetic diversity in a specific breeding population depends upon the genotypes included in it. The present investigation aims at the estimation of macronutrients (phosphorus by spectrophotometer and potassium by flame photometer), micronutrients, namely iron, zinc, manganese, and copper, from seeds of fenugreek genotypes using an atomic absorption spectrophotometer, protein by a Rapid N Cube Analyser, and steroidal saponins. Twenty-eight genotypes of fenugreek, along with two standard checks, namely Pant Ragini and Pusa Early Bunching, were collected from different parts of India, and the nutrient contents of each genotype were determined at the G. B. P. U. A. & T. Laboratory, Pantnagar. The highest potassium content was observed in PFG-35 (1207 mg/100g). PFG-37 and PFG-20 were the richest in phosphorus, iron, and manganese content among all the genotypes. The lowest zinc content was found in PFG-26 (1.19 mg/100g), while the maximum zinc content was found in PFG-28 (4.43 mg/100g). The highest content of copper was found in PFG-26 (1.97 mg/100g). PFG-39 had the highest protein content (29.60%). Significant differences were observed in the steroidal saponins among the genotypes. Saponin content ranged from 0.38 g/100g to 1.31 g/100g. Steroidal saponin content was found to be maximum in PFG-36 (1.31 g/100g), followed by PFG-17 (1.28 g/100g). Therefore, the genotypes which are rich in nutrient and oil content can be used for plant biofortification, dietary supplements, and herbal products.
Keywords: genotypes, macronutrients, micronutrient, protein, seeds
Procedia PDF Downloads 256
1405 Development of PM2.5 Forecasting System in Seoul, South Korea Using Chemical Transport Modeling and ConvLSTM-DNN
Authors: Ji-Seok Koo, Hee‑Yong Kwon, Hui-Young Yun, Kyung-Hui Wang, Youn-Seo Koo
Abstract:
This paper presents a forecasting system for PM2.5 levels in Seoul, South Korea, leveraging a combination of chemical transport modeling and ConvLSTM-DNN machine learning technology. Exposure to PM2.5 has known detrimental impacts on public health, making its prediction crucial for establishing preventive measures. Existing forecasting models, like the Community Multiscale Air Quality (CMAQ) and Weather Research and Forecasting (WRF), are hindered by their reliance on uncertain input data, such as anthropogenic emissions and meteorological patterns, as well as certain intrinsic model limitations. The system we've developed specifically addresses these issues by integrating machine learning and using carefully selected input features that account for local and distant sources of PM2.5. In South Korea, the PM2.5 concentration is greatly influenced by both local emissions and long-range transport from China, and our model effectively captures these spatial and temporal dynamics. Our PM2.5 prediction system combines the strengths of advanced hybrid machine learning algorithms, convLSTM and DNN, to improve upon the limitations of the traditional CMAQ model. Data used in the system include forecasted information from CMAQ and WRF models, along with actual PM2.5 concentration and weather variable data from monitoring stations in China and South Korea. The system was implemented specifically for Seoul's PM2.5 forecasting.
Keywords: PM2.5 forecast, machine learning, convLSTM, DNN
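The core idea, correcting an uncertain CMAQ forecast against station observations, can be illustrated with a deliberately simplified stand-in: a least-squares linear bias correction in place of the ConvLSTM-DNN (all PM2.5 values below are invented):

```python
def fit_linear_correction(cmaq, observed):
    """Least-squares fit observed ~ a * cmaq + b, a stand-in for the ML stage."""
    n = len(cmaq)
    mx, my = sum(cmaq) / n, sum(observed) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(cmaq, observed))
    sxx = sum((x - mx) ** 2 for x in cmaq)
    a = sxy / sxx
    return a, my - a * mx

# Invented PM2.5 values (ug/m^3): raw CMAQ forecasts vs. station observations
a, b = fit_linear_correction([20.0, 35.0, 50.0], [25.0, 38.0, 51.0])
corrected = [a * x + b for x in [20.0, 35.0, 50.0]]
```

The actual system replaces this scalar map with a ConvLSTM-DNN that also ingests WRF meteorology and upwind (Chinese) station data, capturing the spatial and temporal structure a linear map cannot.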
Procedia PDF Downloads 56
1404 An Argument for Agile, Lean, and Hybrid Project Management in Museum Conservation Practice: A Qualitative Evaluation of the Morris Collection Conservation Project at the Sainsbury Centre for Visual Arts
Authors: Maria Ledinskaya
Abstract:
This paper is part case study and part literature review. It seeks to introduce Agile, Lean, and Hybrid project management concepts from business, software development, and manufacturing fields to museum conservation by looking at their practical application on a recent conservation project at the Sainsbury Centre for Visual Arts. The author outlines the advantages of leaner and more agile conservation practices in today’s faster, less certain, and more budget-conscious museum climate where traditional project structures are no longer as relevant or effective. The Morris Collection Conservation Project was carried out in 2019-2021 in Norwich, UK, and concerned the remedial conservation of around 150 Abstract Constructivist artworks bequeathed to the Sainsbury Centre by private collectors Michael and Joyce Morris. It was a medium-sized conservation project of moderate complexity, planned and delivered in an environment with multiple known unknowns – unresearched collection, unknown conditions and materials, unconfirmed budget. The project was later impacted by the COVID-19 pandemic, introducing indeterminate lockdowns, budget cuts, staff changes, and the need to accommodate social distancing and remote communications. The author, then a staff conservator at the Sainsbury Centre who acted as project manager on the Morris Project, presents an incremental, iterative, and value-based approach to managing a conservation project in an uncertain environment. The paper examines the project from the point of view of Traditional, Agile, Lean, and Hybrid project management. The author argues that most academic writing on project management in conservation has focussed on a Traditional plan-driven approach – also known as Waterfall project management – which has significant drawbacks in today’s museum environment due to its over-reliance on prediction-based planning and its low tolerance to change. 
In the last 20 years, alternative Agile, Lean and Hybrid approaches to project management have been widely adopted in software development, manufacturing, and other industries, although their recognition in the museum sector has been slow. Using examples from the Morris Project, the author introduces key principles and tools of Agile, Lean, and Hybrid project management and presents a series of arguments on the effectiveness of these alternative methodologies in museum conservation, including the ethical and practical challenges to their implementation. These project management approaches are discussed in the context of consequentialist, relativist, and utilitarian developments in contemporary conservation ethics. Although not intentionally planned as such, the Morris Project had a number of Agile and Lean features which were instrumental to its successful delivery. These key features are identified as distributed decision-making, a co-located cross-disciplinary team, servant leadership, focus on value-added work, flexible planning done in shorter sprint cycles, light documentation, and emphasis on reducing procedural, financial, and logistical waste. Overall, the author’s findings point in favour of a hybrid model, which combines traditional and alternative project processes and tools to suit the specific needs of the project.
Keywords: agile project management, conservation, hybrid project management, lean project management, waterfall project management
Procedia PDF Downloads 71
1403 Latent Factors of Severity in Truck-Involved and Non-Truck-Involved Crashes on Freeways
Authors: Shin-Hyung Cho, Dong-Kyu Kim, Seung-Young Kho
Abstract:
Truck-involved crashes have higher severity than non-truck-involved crashes. There have been many studies on the frequency of crashes and the development of severity models, but those studies analyzed only the relationships between observed variables. To identify why more people are injured or killed when trucks are involved in a crash, we must quantify the complex causal relationships between crash severity and risk factors by adopting latent factors of crashes. The aim of this study was to develop a structural equation model based on truck-involved and non-truck-involved crashes, including five latent variables, i.e., a crash factor, environmental factor, road factor, driver factor, and severity factor. To clarify the unique characteristics of truck-involved crashes compared to non-truck-involved crashes, a confirmatory analysis method was used. To develop the model, we extracted data from 10,083 crashes on Korean freeways from 2008 through 2014. The results showed that the most significant variable affecting the severity of a crash is the crash factor, which can be expressed by the location, cause, and type of the crash. For non-truck-involved crashes, the crash and environment factors increase crash severity; conversely, the road and driver factors tend to reduce it. For truck-involved crashes, the driver factor has a significant effect on crash severity, although its effect is slightly less than that of the crash factor. A multiple group analysis was employed to analyze the differences between the heterogeneous groups of drivers.
Keywords: crash severity, structural equation modeling (SEM), truck-involved crashes, multiple group analysis, crashes on freeways
Procedia PDF Downloads 384
1402 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images, which may be irregular shapes, digits, or characters. Identifying objects and internal objects is quite difficult when the structure of the image contains a bulk of clusters. Estimation results are easily obtained by identifying the sub-regional objects using the SASK algorithm. The focus is mainly on recognizing the number of internal objects in a given image in a shadow-free and error-free manner. Hard clustering and density clustering of the obtained image rough set are used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally, hull detection. By detecting the sub-regional hulls, the machine learning capability in the detection of characters can be increased, and the approach can be extended to hull recognition even in irregular-shaped objects, such as black holes in space exploration, with their intensities. Layered hulls are those having structured layers inside, which is useful in military services and traffic monitoring to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the decision process (e.g., to clear traffic, or to identify the number of opposing persons in a conflict).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
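The SASK algorithm itself is not specified in the abstract, but the hull-detection step it builds on can be illustrated with a standard convex hull routine (Andrew's monotone chain) over invented boundary pixel coordinates; an internal object's pixel falls inside the hull and is excluded from it:

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull, counter-clockwise, no duplicates."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Toy boundary pixels of an outer shape plus one interior (sub-regional) pixel
hull = convex_hull([(0, 0), (4, 0), (4, 4), (0, 4), (2, 2)])
```

Pixels not on the returned hull are candidates for internal objects; counting the hulls of those interior clusters in turn gives the layered (nested) hull structure the abstract describes.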
Procedia PDF Downloads 350
1401 Federalizing the Philippines: What Does It Mean for the Igorot Indigenous Peoples?
Authors: Shierwin Agagen Cabunilas
Abstract:
The unitary form of the Philippine government has built a tradition of bureaucracy that strengthened oligarchic and clientelist politics. Consequently, the Philippines has lagged behind in development. There is much poverty, unemployment, and inadequate social services. In addition, it seems that the rights of national ethnic minority groups like the Igorots to develop their political and economic interests and their linguistic and cultural heritage are neglected. Given these circumstances, a paradigm shift is inevitable. The author advocates a transition from a unitary to a federal system of government. Contrary to the notion that a unitary system facilitates better governance, it actually stifles it. As a unitary government, the Philippines seems (a) to exhibit incompetence in delivering efficient, necessary services to the people and (b) to exclude the minority from political participation and policy making. This shows that the Philippine unitary system is highly centralized and operates from a top-down scheme. However, a federal system encourages decentralization, plurality, and political participation. In the author's view, federalism is beneficial to Philippine society and congenial to the Igorot indigenous peoples insofar as participative decision-making and development goals are concerned. This research employs critical and constructive analyses. The former interprets some complex practices of Philippine politics, while the latter investigates how theories of federalism can be appropriated to deal with political deficits, ethnic diversity, and indigenous peoples' rights to self-determination. The topic is developed accordingly: First, the author briefly examines the unitary structure of the Philippines and its impact on inter-governmental affairs and processes, asserting that bureaucracy and corruption, for example, are counterproductive to a participative political life, to economic development, and to the recognition of national ethnic minorities.
Second, he scrutinizes why federalism might transform this. Here, he assesses various opposing philosophical contentions on the federal system in managing an ethnically diverse society like the Philippines, and argues that decentralization of political power and economic and cultural development are reasons to move away from a unitary government. Third, he suggests that federalism can be instrumental to Igorot self-determination. Self-determination is neither opposed to national development nor to the ideals of democracy – liberty, justice, solidarity. For example, as others have already noted, politics in the vernacular facilitates greater participation among the people. Hence, there is a greater chance of arriving at policies that serve the interest of the people. Some may worry that decentralization disintegrates a nation. According to the author, however, the recognition of minority rights, which includes self-determination, may promote filial devotion to the state. If the Igorot indigenous peoples have access to suitable institutions to determine their political life, economic goals, and social needs, i.e., education, culture, and language, there is a greater chance of moving the country forward in development while fostering national unity. Remarkably, a federal system thus best responds to the Philippines's democratic and development deficits. Federalism can also significantly rectify the practices that oppress and dislocate national ethnic minorities, as it ensures the creation of localized institutions for optimum political, economic, and cultural determination and maximizes representation in the public sphere.
Keywords: federalism, Igorot, indigenous peoples, self-determination
Procedia PDF Downloads 340