Search results for: root uptake models
7533 Co-Integration and Error Correction Mechanism of Supply Response of Sugarcane in Pakistan (1980-2012)
Authors: Himayatullah Khan
Abstract:
This study estimates the supply response function of sugarcane in Pakistan from 1980-81 to 2012-13, using a co-integration approach and an error correction mechanism. The sugarcane production, area and price series were tested for unit roots using the Augmented Dickey-Fuller (ADF) test, and all three series were found to be stationary at their first difference. Using the Augmented Engle-Granger test and the Cointegrating Regression Durbin-Watson (CRDW) test, the study found that “production and price” and “area and price” were co-integrated, suggesting that the two sets of time series have a long-run, or equilibrium, relationship. The results of the error correction models for the two sets of series showed that in the short run there may be disequilibrium. The Engle-Granger residual may be thought of as the equilibrium error, which can be used to tie the short-run behavior of the dependent variable to its long-run value. The Granger-causality test results showed that the log of price Granger-caused both the log of production and the log of area, whereas the log of production and the log of area Granger-caused each other.
Keywords: co-integration, error correction mechanism, Granger-causality, sugarcane, supply response
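The Engle-Granger two-step logic described in this abstract can be sketched with stdlib Python alone. Everything below is illustrative: the "price" series is a simulated random walk and "production" is built to be cointegrated with it, and the second step merely checks that the residual is mean-reverting, whereas a real ADF test would compare the statistic against tabulated critical values.

```python
import random

random.seed(1)

# Simulated series: "price" is a random walk (non-stationary), and
# "production" = 2 * price + stationary noise, so the two are cointegrated.
price, production, p = [], [], 10.0
for _ in range(300):
    p += random.gauss(0, 1)
    price.append(p)
    production.append(2.0 * p + random.gauss(0, 1))

def ols(x, y):
    """Simple OLS of y on x; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

# Step 1: static long-run regression; the residual is the "equilibrium error".
slope, intercept = ols(price, production)
resid = [b - (intercept + slope * a) for a, b in zip(price, production)]

# Step 2: Dickey-Fuller-style regression of residual changes on the lagged
# residual; a clearly negative coefficient signals mean reversion.
d_resid = [resid[t] - resid[t - 1] for t in range(1, len(resid))]
rho, _ = ols(resid[:-1], d_resid)
print(round(slope, 2), rho < -0.5)
```

In practice the residual test must use the Engle-Granger critical values (e.g. the `coint` routine in statsmodels), since the ordinary Dickey-Fuller distribution does not apply to estimated residuals.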
Procedia PDF Downloads 435
7532 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance
Authors: Flora Babongo, Valerie Chavez
Abstract:
Inferring causality from observational data is a fundamental problem, especially in quantitative finance. Most papers so far analyze additive noise models with either linearity, nonlinearity or Gaussian noise. We fill the gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful in inferring causality when the data is heteroskedastic or non-injective.
Keywords: causal inference, DAGs, BAMLSS, financial index
Procedia PDF Downloads 151
7531 RAPDAC: Role Centric Attribute Based Policy Driven Access Control Model
Authors: Jamil Ahmed
Abstract:
Access control models aim to decide whether a user should be granted or denied access to the user's requested activity. Various access control models have been established and proposed; the most prominent include role-based, attribute-based and policy-based access control models, as well as the role-centric attribute-based access control model. In this paper, a novel access control model is presented, called the “Role-centric Attribute-based Policy-Driven Access Control (RAPDAC) model”. RAPDAC incorporates the concept of “policy” into the role-centric attribute-based access control model, precisely combining the evaluation of conditions, attributes, permissions and roles in order to authorize access. This approach allows the access control policy of a real-time application to be captured in a well-defined manner, and allows access decisions at much finer granularity, as illustrated by a case study of a real-time library information system.
Keywords: authorization, access control model, role based access control, attribute based access control
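The combined role-plus-policy check that RAPDAC describes can be illustrated with a hypothetical, minimal sketch: access is granted only when the user's role carries the requested permission AND every attribute condition in the applicable policy holds. All names below are invented for the library example; the paper's actual model is richer than this.

```python
# Invented roles and permissions for a toy library system.
ROLE_PERMISSIONS = {
    "librarian": {"checkout_book", "add_book"},
    "member": {"checkout_book"},
}

POLICIES = {
    # permission -> attribute conditions that must all evaluate to true
    "checkout_book": [
        lambda attrs: attrs.get("account_in_good_standing", False),
        lambda attrs: attrs.get("books_on_loan", 0) < 5,
    ],
}

def authorize(role, permission, attrs):
    """Role check first, then policy-driven attribute evaluation."""
    if permission not in ROLE_PERMISSIONS.get(role, set()):
        return False
    return all(cond(attrs) for cond in POLICIES.get(permission, []))

print(authorize("member", "checkout_book",
                {"account_in_good_standing": True, "books_on_loan": 2}))  # True
print(authorize("member", "add_book", {}))                                # False
```

Separating the role table from the policy table is what gives the finer granularity the abstract mentions: two users with the same role can still receive different decisions based on their attributes.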
Procedia PDF Downloads 159
7530 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes
Authors: Nadarajah I. Ramesh
Abstract:
Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent development on this topic and presents the results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator, together with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket tip time series. In this context, the arrival pattern of rain gauge bucket tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite state irreducible Markov process X(t). 
Since the likelihood function of this process can be obtained by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip-times to rainfall depths prior to fitting the models. One advantage of this approach is that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when they were fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model
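The core of the DSPP construction, a Poisson tip process whose rate is driven by an unobserved Markov chain, can be sketched in stdlib Python. The two states, their tip rates and the switching probabilities below are invented for illustration, not fitted values from the paper.

```python
import math
import random

random.seed(7)

# Illustrative two-state weather chain: tip rate (tips/minute) per state and
# the per-minute probability of leaving each state.
RATES = {"dry": 0.01, "showery": 0.6}
SWITCH = {"dry": 0.02, "showery": 0.10}

def poisson(lam):
    """Poisson sample by inversion; adequate for small rates."""
    cum = term = math.exp(-lam)
    k, u = 0, random.random()
    while u > cum:
        k += 1
        term *= lam / k
        cum += term
    return k

def simulate_tips(minutes):
    """Bucket-tip count N(t): a Poisson count whose rate follows an
    unobserved two-state Markov chain, i.e. a doubly stochastic Poisson
    process in discrete time."""
    state, tips = "dry", 0
    for _ in range(minutes):
        tips += poisson(RATES[state])
        if random.random() < SWITCH[state]:
            state = "showery" if state == "dry" else "dry"
    return tips

total = simulate_tips(24 * 60)   # one day of minute-by-minute tips
print(0 < total < 24 * 60)
```

Fitting reverses this direction: the likelihood of an observed tip series is evaluated by conditioning on the hidden chain, as in hidden-Markov-model forward recursions, and maximised numerically.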
Procedia PDF Downloads 278
7529 Improving Quality of Family Planning Services in Pakistan
Authors: Mohammad Zakir, Saamia Shams
Abstract:
Background: The provision of quality family planning services contributes markedly to increased uptake of modern contraceptive methods and has important implications for reducing fertility rates. The quality of care in family planning has a beneficial impact on the reproductive health of women, yet little empirical evidence exists on the relationship between adequate training of Community Midwives (CMWs) and quality family planning services. Aim: This study aimed to enhance the knowledge and counseling skills of CMWs so as to improve access to quality client-centered family planning services in Pakistan. Methodology: A quasi-experimental longitudinal study using an Initial Quality Assurance Scores-Training-Post Training Quality Assurance Scores design with a non-equivalent control group was adopted. An experimental set of CMWs received a four-day training package covering family planning methods, counseling, communication skills and practical training on IUCD insertion; a comparison set of CMWs did not receive any intervention. A sample of 100 CMWs from Suraj Social Franchise (SSF) private providers was recruited from both urban and rural Pakistan. Results: A significant improvement in family planning knowledge and counseling skills (p < 0.001) was evident in the experimental group, whereas the comparison group showed no significant change (p > 0.05). A non-significant association between pre-test family planning knowledge and counseling skills was observed in both groups (p > 0.05). Conclusion: The findings demonstrate that adequate training is an important determinant of the quality of family planning services received by clients. Provider-level training increases the likelihood of contraceptive uptake and decreases the likelihood of both unintended and unwanted pregnancies. Enhancing the quality of family planning services may significantly help reduce fertility and improve the reproductive health indicators of women in Pakistan.
Keywords: community midwives, family planning services, quality of care, training
Procedia PDF Downloads 340
7528 Gastronomy: The Preferred Digital Business Models and Impacts in Business Economics within Hospitality, Tourism, and Catering Sectors through Online Commerce
Authors: John Oupa Hlatshwayo
Abstract:
Background: There appear to be preferred digital business models, with varying impacts, within the hospitality, tourism and catering sub-sectors explored through online commerce, all of which are ingrained in the business economics domain. Aim: The study aims to establish whether such phenomena (digital business models) exist, and if so to what extent, within the hospitality, tourism and catering industries, respectively. Setting: This is a qualitative study conducted by exploring four institutions globally through case studies. Method: The research used explanatory case studies to answer 'how' and 'why' questions where the researcher has little control over the occurrence of events. It is qualitative research combining deductive and inductive methods; hence, a comprehensive approach to analyzing the qualitative data was attainable through immersion, reading to understand the information. Findings: The results corroborated the notion that digital business models are applicable, by and large, in business economics. Three sectors in which enterprises operate in the business economics sphere were thus narrowed down, i.e., hospitality, tourism and catering; these are also referred to as triangular polygons owing to their atypical nature of being 'stand-alone' yet 'sub-sectors', although there are confounding factors to consider. Conclusion: The significance of digital business models and digital transformation shows an inevitable merger between business and technology within hospitality, tourism and catering. Contribution: This symbiotic relationship between business and technology, the persistent evolution of clients' interfaces with end products, the ever-changing market, and the current adaptation and adjustment of enterprises to the 'new world order' must be embraced constantly by business practitioners, academics, business students, organizations and governments.
Keywords: digital business models, hospitality, tourism, catering, business economics
Procedia PDF Downloads 18
7527 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish
Abstract:
Stack Overflow is a popular community question and answer portal which is used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming languages documentation. While tools have looked to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest is dedicated to examining the performance and quality of typically used modeling methods, and especially in relation to models’ and features’ complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods that are used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models’ performance and quality given the type of features and complexity of models used. Researchers examining classifiers’ performance and quality and features’ complexity may leverage these findings in selecting suitable techniques when developing prediction models.
Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
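As a minimal stand-in for the kind of low-complexity model the abstract compares (far simpler than a neural network), the sketch below trains a logistic regression by gradient descent on synthetic answer features. The features and the acceptance rule generating the labels are invented for illustration and are not drawn from Stack Overflow data.

```python
import math
import random

random.seed(0)

def make_answer():
    """Synthetic answer: higher score and earlier position in the answer
    list make acceptance more likely (an invented generating rule)."""
    score = random.randint(0, 20)
    position = random.randint(1, 10)
    accepted = 1 if score - position + random.gauss(0, 2) > 3 else 0
    return [1.0, float(score), float(position)], accepted   # 1.0 = bias term

data = [make_answer() for _ in range(500)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Plain stochastic gradient descent on the log-loss.
w = [0.0, 0.0, 0.0]
for _ in range(300):
    for x, y in data:
        p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
        for i in range(3):
            w[i] += 0.01 * (y - p) * x[i]

accuracy = sum((sigmoid(sum(wi * xi for wi, xi in zip(w, x))) > 0.5) == bool(y)
               for x, y in data) / len(data)
print(accuracy > 0.7)
```

Comparing such a model against, say, a random forest or a neural network on the same features is exactly the kind of performance-versus-complexity trade-off the study examines.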
Procedia PDF Downloads 132
7526 Solid Lipid Nanoparticles of Levamisole Hydrochloride
Authors: Surendra Agrawal, Pravina Gurjar, Supriya Bhide, Ram Gaud
Abstract:
Levamisole hydrochloride is a prominent anticancer drug in the treatment of colon cancer, but it has shown toxic effects due to poor bioavailability and poor cellular uptake by tumor cells, and it is an unstable drug. Incorporating the molecule in solid lipids may minimize its exposure to the aqueous environment and partly immobilize the drug molecules within the lipid matrix, both of which may protect the encapsulated drug against degradation. The objectives of the study were to enhance bioavailability by sustaining drug release and to reduce the toxicities associated with the therapy. The solubility of the drug was determined in different lipids to select the components of the solid lipid nanoparticles (SLNs). Pseudoternary phase diagrams were created using the aqueous titration method. Formulations were subjected to particle size and stability evaluation to select the final test formulations, which were characterized for average particle size, zeta potential, in-vitro drug release and percentage transmittance to optimize the final formulation. The SLNs of levamisole hydrochloride were prepared by the nanoprecipitation method, with glyceryl behenate (Compritol 888 ATO) as the lipid core and Tween 80 as surfactant and lecithin as co-surfactant in a 1:1 ratio. Entrapment efficiency (EE) was found to be 45.89%, and particle size was in the range of 100-600 nm. The zeta potential of the formulation was -17.0 mV, indicating the stability of the product. The in-vitro release study showed that 66% of the drug was released in 24 hours at pH 7.2, indicating that the formulation can give controlled action in the intestinal environment; at pH 5.0 it showed 64% release, indicating that it can also release the drug in the acidic environment of tumor cells. In conclusion, the results revealed SLNs to be a promising approach to sustain drug release, increasing the bioavailability and cellular uptake of the drug while reducing toxic effects, since controlled delivery allows the dose to be reduced.
Keywords: SLN, nanoparticulate delivery of levamisole, pharmacy, pharmaceutical sciences
Procedia PDF Downloads 431
7525 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources and systems. Part of it comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations present some deficiencies. Part of the problem lies in a lack of interest in extracting knowledge from their data sources, as well as the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. This paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and Big Data, seeking in this way to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for generating patterns and models derived from the structured fact objects.
Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 409
7524 Using Traffic Micro-Simulation to Assess the Benefits of Accelerated Pavement Construction for Reducing Traffic Emissions
Authors: Sudipta Ghorai, Ossama Salem
Abstract:
Pavement maintenance, repair, and rehabilitation (MRR) processes may have considerable environmental impacts due to the traffic disruptions associated with work zones. The simulation models in use to predict work zone emissions have mostly been static emission factor models (SEFD), which calculate emissions based on average operating conditions, e.g., average speed and vehicle type. Although these models produce accurate results for large-scale planning studies, they are not suitable for analyzing driving conditions at the micro level, such as acceleration, deceleration, idling, cruising, and queuing in a work zone. The purpose of this study is to prepare a comprehensive work zone environmental assessment (WEA) framework that calculates the emissions caused by disrupted traffic by integrating traffic microsimulation tools with emission models. This will help highway officials assess the benefits of accelerated construction and opt for the most suitable TMP not only economically but also from an environmental point of view.
Keywords: accelerated construction, pavement MRR, traffic microsimulation, congestion, emissions
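The limitation of average-speed models that motivates microsimulation can be shown with a toy calculation. The emission curve below is an invented convex speed-emission relationship, not a calibrated model: because the curve is convex, emissions evaluated at the average speed (the static approach) understate the second-by-second emissions of a stop-and-go work-zone cycle with the same average speed, by Jensen's inequality.

```python
def emission_rate(speed_kmh):
    """Toy convex emission curve (grams/second); illustrative only."""
    return 0.5 + 0.002 * (speed_kmh - 60.0) ** 2

# A stop-and-go work-zone driving cycle, one speed reading per second.
stop_and_go = [0, 0, 10, 30, 60, 90, 60, 30, 10, 0]
avg_speed = sum(stop_and_go) / len(stop_and_go)

micro = sum(emission_rate(v) for v in stop_and_go)    # microsimulation view
macro = emission_rate(avg_speed) * len(stop_and_go)   # static, average-speed view
print(micro > macro)   # True: the static model misses the transients
```

A microsimulation-based framework feeds per-second trajectories like `stop_and_go` into the emission model instead of a single average, which is exactly what captures the idling and queuing a work zone induces.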
Procedia PDF Downloads 449
7523 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental readings and aggregates the data to a designated destination, called a sink node. The important issues concerning data aggregation are time efficiency and energy consumption, owing to the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute the minimum latency schedule, that is, a schedule with the minimum number of timeslots such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models have been adopted: the graph model and the more realistic physical interference model known as the Signal-to-Interference-plus-Noise Ratio (SINR), combined with different power models (uniform power, and non-uniform power with or without power control) and different antenna models (omni-directional and directional). In this survey article, as the problem has proven to be NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models, using latency as the performance measure.
Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional
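A minimal greedy for the graph-model variant of MLAS can illustrate the two constraints at work on an aggregation tree: a node transmits only after all its children have, and a parent cannot receive from two children in the same timeslot. The routine below is an illustrative sketch, not one of the surveyed approximation algorithms, and it ignores interference beyond the receiving parent.

```python
from collections import defaultdict

def schedule(parent):
    """parent: dict child -> parent; the sink is the node absent as a key.
    Returns a dict node -> transmission timeslot."""
    children = defaultdict(list)
    for c, p in parent.items():
        children[p].append(c)

    slot = {}
    busy = defaultdict(set)   # receiver -> slots already taken at that parent

    def assign(node):
        ready = 0
        for c in children[node]:
            assign(c)
            ready = max(ready, slot[c])
        if node in parent:                    # the sink never transmits
            t = ready + 1                     # after all children are done
            while t in busy[parent[node]]:    # avoid sibling collisions
                t += 1
            slot[node] = t
            busy[parent[node]].add(t)

    sink = (set(parent.values()) - set(parent)).pop()
    assign(sink)
    return slot

# Example tree: b and c share parent a; d hangs off b; a reports to the sink.
s = schedule({"a": "sink", "b": "a", "c": "a", "d": "b"})
print(s)   # → {'d': 1, 'b': 2, 'c': 1, 'a': 3}
```

The latency is the largest slot assigned; the surveyed algorithms aim to provably bound this value, which the greedy above does not attempt.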
Procedia PDF Downloads 229
7522 Performance Evaluation of Using Genetic Programming Based Surrogate Models for Approximating Simulation Complex Geochemical Transport Processes
Authors: Hamed K. Esfahani, Bithin Datta
Abstract:
The transport of reactive chemical contaminant species in groundwater aquifers is a complex and highly non-linear physical and geochemical process, especially in real-life scenarios. Simulating this transport involves solving complex nonlinear equations and generally requires huge computational time for a given aquifer study area. Developing optimal remediation strategies for aquifers may require repeated solution of such complex numerical simulation models. To overcome this computational limitation and improve the feasibility of large numbers of repeated simulations, Genetic Programming based trained surrogate models are developed to approximately simulate such complex transport processes. The transport of acid mine drainage, a hazardous pollutant, is first simulated using a numerical simulation model, HYDROGEOCHEM 5.0, for a contaminated aquifer at a historic mine site. The simulation model solution results for an illustrative contaminated aquifer site are then approximated by training and testing a Genetic Programming (GP) based surrogate model. Performance evaluation of the ensemble GP models as surrogates for reactive species transport in groundwater demonstrates the feasibility of their use and the associated computational advantages. The results show the efficiency and feasibility of using ensemble GP surrogate models as approximate simulators of complex hydrogeologic and geochemical processes in a contaminated groundwater aquifer, incorporating the uncertainties of the historic mine site.
Keywords: geochemical transport simulation, acid mine drainage, surrogate models, ensemble genetic programming, contaminated aquifers, mine sites
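The surrogate idea itself is independent of the approximator: evaluate the expensive simulator once on a design of training points, fit a cheap model, then answer the many repeated queries from the cheap model. The sketch below uses a toy one-dimensional "simulator" and a nearest-neighbour lookup as the stand-in surrogate; the paper trains genetic-programming models on HYDROGEOCHEM output, and nothing here is from that setup.

```python
def expensive_simulator(x):
    """Toy stand-in for a costly simulation run (pretend each call is slow)."""
    return 3.0 * x * x - 2.0 * x + 1.0

# One-off training design: evaluate the simulator on a grid and cache results.
table = [(i / 10.0, expensive_simulator(i / 10.0)) for i in range(-20, 21)]

def surrogate(x):
    """Cheapest possible surrogate: nearest-neighbour lookup in the cache."""
    return min(table, key=lambda pair: abs(pair[0] - x))[1]

# Repeated queries (as a remediation-optimisation loop would issue) hit only
# the cache; the surrogate tracks the simulator within the grid's resolution.
worst = max(abs(surrogate(q / 100.0) - expensive_simulator(q / 100.0))
            for q in range(-200, 201))
print(worst < 1.0)
```

Genetic programming replaces the lookup with an evolved symbolic expression, and an ensemble of such expressions additionally gives a spread that reflects surrogate uncertainty.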
Procedia PDF Downloads 277
7521 Discrete Choice Modeling in Education: Evaluating Early Childhood Educators’ Practices
Authors: Michalis Linardakis, Vasilis Grammatikopoulos, Athanasios Gregoriadis, Kalliopi Trouli
Abstract:
Discrete choice models belong to the family of conjoint analysis and are applied to the preferences of respondents towards a set of scenarios that describe alternative choices. The scenarios are pre-designed to cover all the attributes of the alternatives that may affect the choices. In this study, we examine how preschool educators integrate physical activities into their everyday teaching practices through the use of discrete choice models. One advantage of discrete choice models over more traditional data collection methods (e.g., questionnaires and interviews that use ratings) is that the respondent is asked to select among competitive and realistic alternatives, rather than objectively rate each attribute that the alternatives may have. We present the effort to construct and choose representative attributes that cover all possible choices of the respondents, and the scenarios that arose from them. For the purposes of the study, we used a sample of 50 preschool educators in Greece, each responding to 4 of the 16 scenarios that the orthogonal design produced, with each scenario offering three alternative teaching practices. Seven attributes of the alternatives were used in the scenarios. For the analysis of the data, we used a multinomial logit model with random effects, a multinomial probit model and a generalized mixed logit model. The conclusions drawn from the estimated parameters of the models are discussed.
Keywords: conjoint analysis, discrete choice models, educational data, multivariate statistical analysis
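At its core, the multinomial logit model assigns each alternative in a scenario a utility that is a weighted sum of its attributes and turns the utilities into choice probabilities with a softmax. The attributes and weights below are invented to mirror the teaching-practice setting; they are not the study's seven attributes or estimated parameters.

```python
import math

# Invented attribute weights (a fitted model would estimate these).
WEIGHTS = {"duration_min": -0.05, "outdoors": 0.8, "equipment_needed": -0.4}

def choice_probabilities(alternatives):
    """Softmax over linear-in-attributes utilities: the multinomial logit."""
    utilities = [sum(WEIGHTS[k] * v for k, v in alt.items())
                 for alt in alternatives]
    m = max(utilities)
    exps = [math.exp(u - m) for u in utilities]   # stabilised softmax
    total = sum(exps)
    return [e / total for e in exps]

# One scenario with three alternative teaching practices.
scenario = [
    {"duration_min": 20, "outdoors": 1, "equipment_needed": 0},
    {"duration_min": 40, "outdoors": 1, "equipment_needed": 1},
    {"duration_min": 10, "outdoors": 0, "equipment_needed": 0},
]
probs = choice_probabilities(scenario)
print([round(p, 2) for p in probs])
```

Estimation runs this in reverse: the weights are chosen to maximise the likelihood of the educators' observed selections, with random effects capturing respondent-level heterogeneity.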
Procedia PDF Downloads 465
7520 Forecasting Model for Rainfall in Thailand: Case Study Nakhon Ratchasima Province
Authors: N. Sopipan
Abstract:
In this paper, we study the rainfall time series of weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods to analyse the behaviour of rainfall in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA models and Holt-Winters models based on exponential smoothing were built, and all proved to be adequate. They can therefore provide information to help decision makers establish strategies for the proper planning of agriculture, drainage systems and other water resource applications in Nakhon Ratchasima province. We found the best-performing model for forecasting to be ARIMA(1,0,1)(1,0,1)12.
Keywords: ARIMA models, exponential smoothing, Holt-Winters model
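Additive Holt-Winters smoothing, one of the two model families fitted in the paper, can be sketched in stdlib Python. The smoothing constants and the toy 12-month seasonal "rainfall" series are illustrative, not the station data or fitted parameters.

```python
def holt_winters_additive(y, period, alpha=0.3, beta=0.05, gamma=0.2):
    """Additive Holt-Winters: level + trend + seasonal component,
    each updated by exponential smoothing. Returns (fitted, next forecast)."""
    level = sum(y[:period]) / period          # initial level: first-year mean
    trend = 0.0
    season = [y[i] - level for i in range(period)]
    fitted = []
    for t, obs in enumerate(y):
        s = season[t % period]
        fitted.append(level + trend + s)
        last_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % period] = gamma * (obs - level) + (1 - gamma) * s
    forecast = level + trend + season[len(y) % period]
    return fitted, forecast

# Synthetic "monthly rainfall": a wet-season peak every 12 steps.
rain = [50 + 40 * ((m % 12) in (5, 6, 7)) for m in range(48)]
_, next_month = holt_winters_additive(rain, period=12)
print(round(next_month))   # → 50 (month 48 falls in the dry season)
```

The seasonal ARIMA(1,0,1)(1,0,1)12 the paper prefers encodes the same 12-month periodicity through seasonal AR and MA terms instead of an explicit seasonal index.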
Procedia PDF Downloads 300
7519 Metal Contents in Bird Feathers (Columba livia) from Mt Etna Volcano: Volcanic Plume Contribution and Biological Fractionation
Authors: Edda E. Falcone, Cinzia Federico, Sergio Bellomo, Lorenzo Brusca, Manfredi Longo, Walter D’Alessandro
Abstract:
Although trace metals are essential elements for living beings, they can become toxic at high concentrations. Their potential toxicity is related not only to their total content in the environment but mostly to their bioavailability. Volcanoes are important natural metal emitters, and they can deeply affect the quality of air, water and soils, as well as human health. Trace metals tend to accumulate in the tissues of living organisms, depending on the metal contents in food, air and water and on the exposure time. Birds are considered bioindicators of interest, because their feathers directly reflect the metal uptake from the blood. Birds are exposed to atmospheric pollution through contact with rainfall, dust and aerosol, and they accumulate metals over the whole life cycle. We report the first data combining the rainfall metal content in three areas of Mt Etna variably fumigated by the volcanic plume with the metal contents in the feathers of pigeons collected in the same areas. Rainfall samples were collected from three rain gauges placed at different elevations on the eastern flank of the volcano, the flank most exposed to the airborne plume; the samples were filtered, treated with Suprapur-grade HNO₃ and analyzed for Fe, Cr, Co, Ni, Se, Zn, Cu, Sr, Ba, Cd and As by the ICP-MS technique, and for major ions by ion chromatography. Feathers were collected from single individuals in the same areas where the rain gauges were installed; additionally, some samples were collected in an urban area little affected by the volcanic plume. The samples were rinsed in MilliQ water and acetone, dried at 50°C until constant weight, and digested in a 2:1 mixture of Suprapur-grade HNO₃ (65%) and H₂O₂ (30%), using 25-50 mg of sample, in a bath at near-boiling temperature.
The rainfall samples most contaminated by the plume were collected at a close distance from the summit craters (less than 6 km) and show lower pH values and higher concentrations of all analyzed metals relative to those from the sites at lower elevation. The analyzed samples are enriched both in metals directly emitted by the volcanic plume and transported by acidic gases (SO₂, HCl, HF) and in metals leached from the airborne volcanic ash. The feathers show different patterns at the different sites, related to exposure to natural or anthropogenic pollutants. They show abundance ratios similar to rainfall for lithophile elements (Ba, Sr), whereas they are enriched in Zn and Se, known for their antioxidant properties, probably as an adaptive response to the oxidative stress induced by toxic metal exposure. The pigeons revealed a clear heterogeneity of metal uptake in the different parts of the volcano, as an effect of the volcanic plume's impact. Additionally, some physiological processes can modify the fate of some metals after uptake, and this offers some insights for translational studies.
Keywords: bioindicators, environmental pollution, feathers, trace metals, volcanic plume
Procedia PDF Downloads 143
7518 Interoperability Maturity Models for Consideration When Using School Management Systems in South Africa: A Scoping Review
Authors: Keneilwe Maremi, Marlien Herselman, Adele Botha
Abstract:
The main purpose and focus of this paper is to determine which Interoperability Maturity Models to consider when using School Management Systems (SMS). This matters because it informs and helps schools know which Interoperability Maturity Model is best suited for their SMS. To address this purpose, the paper applies a scoping review covering papers written from 2012 to 2019, and the different types of Interoperability Maturity Models are compared in detail, including the background information, the levels of interoperability, and the areas for consideration in each Maturity Model. The literature was obtained from the IEEE Xplore and Scopus databases, using the Harzing's and Google Scholar search engines. The topic of the paper was used as the search term for the literature, with 'Interoperability Maturity Models' as a keyword. The data were analyzed in terms of the definition of interoperability, Interoperability Maturity Models, and levels of interoperability. The results provide a table showing the focus area of concern for each Maturity Model, based on the scoping review, in which only 24 of the 740 publications initially identified in the field were found to be suitable. This resulted in two Interoperability Maturity Models being the most discussed and recommended for consideration: the Information Systems Interoperability Maturity Model (ISIMM) and the Organizational Interoperability Maturity Model for C2 (OIM).
Keywords: interoperability, interoperability maturity model, school management system, scoping review
Procedia PDF Downloads 209
7517 Models, Methods and Technologies for Protection of Critical Infrastructures from Cyber-Physical Threats
Authors: Ivan Župan
Abstract:
Critical infrastructure is essential for the functioning of a country and is designated for special protection by governments worldwide. Due to the increase in smart technology usage in every facet of industry, including critical infrastructure, exposure to malicious cyber-physical attacks has grown in recent years. Proper security measures must be undertaken in order to defend against cyber-physical threats that can disrupt the normal functioning of critical infrastructure and, consequently, the functioning of the country. This paper reviews the scientific literature on models, methods and technologies used to protect industries from cyber-physical threats. The literature was observed from three aspects. The first aspect, resilience, concerns the robustness of a system's defense against threats, as well as preparation and education regarding potential future threats. The second aspect concerns security risk management for systems with cyber-physical aspects, and the third investigates the available testbed environments for testing developed models on scaled models of vulnerable infrastructure.
Keywords: critical infrastructure, cyber-physical security, smart industry, security methodology, security technology
Procedia PDF Downloads 77
7516 Inhalable Lipid-Coated-Chitosan Nano-Embedded Microdroplets of an Antifungal Drug for Deep Lung Delivery
Authors: Ranjot Kaur, Om P. Katare, Anupama Sharma, Sarah R. Dennison, Kamalinder K. Singh, Bhupinder Singh
Abstract:
Respiratory microbial infections, among the leading causes of death worldwide, are difficult to treat because the microbes reside deep inside the airways, where only a small fraction of a drug can reach after traditional oral or parenteral administration. As a result, high doses are required to maintain drug levels above the minimum inhibitory concentration (MIC) at the infection site, unfortunately leading to severe systemic side effects. Delivering antimicrobials directly to the respiratory tract therefore provides an attractive way out in such situations. In this context, the current study embarks on the systematic development of lung-lipid-modified chitosan nanoparticles for the inhalation of voriconazole. Following the principles of quality by design, the chitosan nanoparticles were prepared by the ionic gelation method and further coated with the major lung lipid by the precipitation method. Factor screening studies were performed with a fractional factorial design, followed by optimization of the nanoparticles with a Box-Behnken design. The optimized formulation has a particle size of 170-180 nm, PDI 0.3-0.4, zeta potential 14-17 mV, entrapment efficiency 45-50% and drug loading of 3-5%. The presence of the lipid coating was confirmed by FESEM, FTIR and XRD. Furthermore, the nanoparticles were found to be safe up to 40 µg/ml on A549 and Calu-3 cell lines, and quantitative and qualitative uptake studies revealed the uptake of the nanoparticles into lung epithelial cells. Moreover, the data from Spraytec and next-generation impactor studies confirmed the deposition of the nanoparticles in the lower airways, and the interaction of the nanoparticles with DPPC monolayers signifies their biocompatibility with the lungs. Overall, the study describes the methodology and potential of lipid-coated chitosan nanoparticles as a future inhalation nanomedicine for the management of pulmonary aspergillosis.
Keywords: dipalmitoylphosphatidylcholine, nebulization, DPPC monolayers, quality-by-design
Procedia PDF Downloads 143
7515 Comparative Analysis of Effecting Factors on Fertility by Birth Order: A Hierarchical Approach
Authors: Ali Hesari, Arezoo Esmaeeli
Abstract:
Given the dramatic changes in fertility and higher-order births in Iran during recent decades, knowledge about the factors affecting different birth orders is of crucial importance. In this study, in accordance with the hierarchical structure of much social science data, the effects of variables at different levels of the social phenomena that determine birth orders in the 365 days preceding the 1390 census are explored using a multilevel approach. In this paper, the 2% individual-level sample of the 1390 census is analyzed with the HLM software. Three different hierarchical linear regression models are estimated: for the first and second, the third, and the fourth and higher birth orders. The research results display different outcomes for the three models. The individual-level variables entered in the equation are region of residence (rural/urban), age, educational level and labor participation status, and the province-level variable is GDP per capita. Results show that the individual-level variables have different effects across the three models, and at the second level the random and fixed effects differ across the models.
Keywords: fertility, birth order, hierarchical approach, fixed effects, random effects
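As a rough illustration of the two-level structure described above (individuals nested in provinces), the following sketch simulates hypothetical data and separates a pooled fixed slope from province-level deviations; it is a crude stand-in for HLM estimation, not the authors' model.

```python
import random
import statistics

random.seed(42)

# Hypothetical two-level structure: individuals (level 1) nested in
# provinces (level 2). True model: y = 1.0 + 2.0*x + u_province + noise.
provinces = {p: random.gauss(0, 1.0) for p in range(10)}  # random intercepts
data = []
for p, u in provinces.items():
    for _ in range(50):
        x = random.uniform(0, 5)
        y = 1.0 + 2.0 * x + u + random.gauss(0, 0.5)
        data.append((p, x, y))

# Pooled OLS estimate of the fixed slope (closed form).
xs = [d[1] for d in data]
ys = [d[2] for d in data]
mx, my = statistics.fmean(xs), statistics.fmean(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

# Crude "random effect" per province: mean residual from the pooled fit.
intercept = my - slope * mx
resid = {p: [] for p in provinces}
for p, x, y in data:
    resid[p].append(y - (intercept + slope * x))
u_hat = {p: statistics.fmean(r) for p, r in resid.items()}
print(f"fixed slope ~ {slope:.2f}; province effects range "
      f"{min(u_hat.values()):.2f} to {max(u_hat.values()):.2f}")
```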
Procedia PDF Downloads 339
7514 An Analytical Survey of Construction Changes: Gaps and Opportunities
Authors: Ehsan Eshtehardian, Saeed Khodaverdi
Abstract:
This paper surveys studies on construction change and points to some potential future work. A full-scale investigation of the change literature, including change definitions, types, causes and effects, and change management systems, is carried out to explore coming change trends. The critical works in each area are selected in order to deduce a true timeline of construction change research. The findings show that the leap from best-practice guides in the late 1990s and generic process models in the early 2000s to very advanced modeling environments in the mid-2000s and early 2010s has left gaps, along with opportunities for change researchers to develop simpler and more applicable models. Another finding is that there is a compelling similarity between change and risk prediction models; therefore, integrating these two concepts, specifically from a proactive management point of view, may lead to a synergy and help project teams avoid rework. The findings also show that exploiting cause-effect relationship models in order to facilitate dispute resolution seems to be an interesting field for future work.
Keywords: construction change, change management systems, dispute resolutions, change literature
Procedia PDF Downloads 295
7513 Ground State Phases in Two-Mode Quantum Rabi Models
Authors: Suren Chilingaryan
Abstract:
We study two models describing a single two-level system coupled to two boson field modes in either a parallel or an orthogonal setup. Both models may be feasible for experimental realization through Raman adiabatic driving in cavity QED. We study their ground state configurations; that is, we find the quantum precursors of the corresponding semi-classical phase transitions. We find that the ground state configurations of both models present the same critical coupling as the quantum Rabi model. In the first model, around this critical coupling the ground state goes from the so-called normal configuration with no excitation (the qubit in the ground state and the fields in the quantum vacuum state) to a ground state with excitations (the qubit in a superposition of ground and excited states, while the fields are no longer in the vacuum). The second model shows a more complex ground state configuration landscape, where we find the normal configuration mentioned above, two single-mode configurations, in which just one of the fields and the qubit are excited, and a dual-mode configuration, in which both fields and the qubit are excited.
Keywords: quantum optics, quantum phase transition, cavity QED, circuit QED
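For context, a commonly written form of the two-mode quantum Rabi Hamiltonian in the parallel setup is (with ħ = 1; this notation is an assumption, since the abstract does not fix conventions):

```latex
H = \frac{\omega_0}{2}\,\sigma_z
  + \omega_1\, a^{\dagger} a + \omega_2\, b^{\dagger} b
  + g_1\,\sigma_x \left(a + a^{\dagger}\right)
  + g_2\,\sigma_x \left(b + b^{\dagger}\right),
```

with the orthogonal setup obtained by coupling one of the two modes through σy instead of σx.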
Procedia PDF Downloads 368
7512 Comparing the Gap Formation around Composite Restorations in Three Regions of Tooth Using Optical Coherence Tomography (OCT)
Authors: Rima Zakzouk, Yasushi Shimada, Yuan Zhou, Yasunori Sumi, Junji Tagami
Abstract:
Background and Purpose: Swept-source optical coherence tomography (OCT) is an interferometric imaging technique that has recently been used in cariology. In spite of the progress made in adhesive dentistry, composite restorations still fail due to secondary caries, which occur as a result of environmental factors in the oral cavity. A precise assessment of the marginal sealing of the restoration is therefore highly required. The aim of this study was to evaluate gap formation at the composite/cavity-wall interface, with or without phosphoric acid etching, using SS-OCT. Materials and Methods: Round tapered cavities (2×2 mm) were prepared at three locations, the mid-coronal, cervical, and root regions of bovine incisor teeth, in two groups (SE and PA). While the self-etching adhesive (Clearfil SE Bond) was applied in both groups, Group PA was pretreated with phosphoric acid etching (K-Etchant gel). Subsequently, both groups were restored with Estelite Flow Quick flowable composite resin. Following 5000 thermal cycles, three cross-sectional images were obtained from each cavity using OCT at a 1310-nm wavelength, at 0°, 60°, and 120°. Scanning was repeated after two months to monitor gap progression. The average percentage of gap length was then calculated using image analysis software, and the difference between the group means was statistically analyzed by t-test. The results were subsequently confirmed by sectioning and observing representative specimens under a confocal laser scanning microscope (CLSM). Results: The results showed that pretreatment with phosphoric acid etching (Group PA) led to significantly larger gaps in the mid-coronal and cervical cavities compared to the SE group, while in the root cavities no significant difference was observed between the groups. On the other hand, the gaps formed in the root cavities were significantly larger than those in the mid-coronal and cervical cavities within the same group.
This study investigated the effect of phosphoric acid on gap progression in composite restorations. In conclusion, phosphoric acid etching did not reduce gap formation in any region of the tooth. Significance: The cervical region of the tooth was more prone to gap formation than the mid-coronal region, especially when the pre-etching treatment was added.
Keywords: image analysis, optical coherence tomography, phosphoric acid etching, self-etch adhesives
Procedia PDF Downloads 221
7511 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, we solve the American option pricing problem under jump diffusion models using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). The partial fraction decomposition technique is applied to rational approximation schemes to avoid the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
Keywords: integral differential equations, jump-diffusion model, American options, rational approximation
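The O(M log M) matrix-vector product via the FFT rests on the fact that circulant matrices are diagonalized by the discrete Fourier transform; a minimal sketch of that trick follows (the paper's jump-integral operators would first be embedded in a circulant matrix, which is an assumption about its setup).

```python
import numpy as np

def circulant_matvec(c: np.ndarray, x: np.ndarray) -> np.ndarray:
    """y = C x, where C is circulant with first column c, in O(M log M).

    Uses the identity C x = IFFT( FFT(c) * FFT(x) ), valid because the
    DFT matrix diagonalizes every circulant matrix.
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# Verify against the dense O(M^2) product on a small random example.
rng = np.random.default_rng(0)
m = 8
c = rng.standard_normal(m)
x = rng.standard_normal(m)
C = np.array([[c[(i - j) % m] for j in range(m)] for i in range(m)])
assert np.allclose(C @ x, circulant_matvec(c, x))
print("FFT matvec matches the dense product")
```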
Procedia PDF Downloads 120
7510 Modelling and Optimization of Laser Cutting Operations
Authors: Hany Mohamed Abdu, Mohamed Hassan Gadallah, El-Giushi Mokhtar, Yehia Mahmoud Ismail
Abstract:
Laser beam cutting is a nontraditional machining process. This paper optimizes the parameters of laser beam cutting of stainless steel (316L) by considering the effect of the input parameters, viz. power, oxygen pressure, frequency and cutting speed. A statistical design of experiments is carried out at three levels, and process responses such as 'average kerf taper (Ta)' and 'surface roughness (Ra)' are measured accordingly. A quadratic mathematical model (RSM) for each of the responses is developed as a function of the process parameters. Responses predicted by the models (as per Taguchi's L27 OA) are employed to search for an optimal parametric combination that achieves the desired yield of the process. RSM models are developed for the mean responses, the S/N ratio, and the standard deviation of the responses. The optimization models are formulated as single-objective problems subject to process constraints, based on Analysis of Variance (ANOVA) in the MATLAB environment. The optimum solutions are compared with the Taguchi methodology results.
Keywords: optimization, laser cutting, robust design, kerf width, Taguchi method, RSM and DOE
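A quadratic RSM model of the kind described can be fitted by ordinary least squares on an expanded design matrix; the sketch below uses hypothetical coefficients and a coded three-level design in two factors, not the paper's measured responses.

```python
import numpy as np

# Full quadratic response surface in two coded factors (e.g. power x1 and
# cutting speed x2): y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
# The coefficients here are hypothetical, for illustration only.
true_beta = np.array([10.0, 2.0, -1.5, 0.8, 0.5, -0.3])

def design_matrix(x1: np.ndarray, x2: np.ndarray) -> np.ndarray:
    """Expand two factor columns into the full quadratic model terms."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

# Three-level factorial design (coded -1, 0, +1), as in three-level DOE.
grid = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
X = design_matrix(grid[:, 0], grid[:, 1])
y = X @ true_beta                       # noise-free synthetic responses

# Least-squares fit recovers the coefficients exactly (full-rank design).
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print("recovered RSM coefficients:", np.round(beta_hat, 3))
```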
Procedia PDF Downloads 620
7509 Hydration Matters: Impact on 3 km Running Performance in Trained Male Athletes Under Heat Conditions
Authors: Zhaoqi He
Abstract:
Research Context: Endurance performance in hot environments is influenced by the interplay of hydration status and physiological responses. This study aims to investigate how dehydration, up to 2.11% body weight loss, affects the 3 km running performance of trained male athletes under conditions mimicking high temperatures. Methodology: In a randomized crossover design, five male athletes participated in two trials – euhydrated (EU) and dehydrated (HYPO). Both trials included a 70-minute preload run at 55-60% VO2max in 32°C and 50% humidity, followed by a 3-kilometer time trial. Fluid intake was restricted in HYPO to induce a 2.11% body weight loss. Physiological metrics, including heart rate, core temperature, and oxygen uptake, were measured, along with perceptual metrics like perceived exertion and thirst sensation. Findings: The 3-kilometer run completion times showed no significant differences between EU and HYPO trials (p=0.944). Physiological indicators, including heart rate, core temperature, and oxygen uptake, did not significantly vary (p>0.05). Thirst sensation was markedly higher in HYPO (p=0.013), confirming successful induction of dehydration. Other perceptual metrics and gastrointestinal comfort remained consistent. Conclusion: Contrary to the hypothesis, the study reveals that dehydration, inducing up to 2.11% body weight loss, does not significantly impair 3 km running performance in trained male athletes under hot conditions. Thirst sensation was notably higher in the dehydrated state, emphasizing the importance of considering perceptual factors in hydration strategies. The findings suggest that trained runners can maintain performance despite moderate dehydration, highlighting the need for nuanced hydration guidelines in hot-weather running.
Keywords: hypohydration, euhydration, hot environment, 3km running time trial, endurance performance, trained athletes, perceptual metrics, dehydration impact, physiological responses, hydration strategies
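The dehydration level in such protocols is defined by percentage body-mass loss; a minimal sketch with hypothetical masses chosen to reproduce the reported 2.11% figure:

```python
# Percentage body-mass loss used to quantify dehydration.
# The masses below are hypothetical, chosen only to reproduce the
# study's 2.11% figure, not taken from its data.

def body_mass_loss_pct(pre_kg: float, post_kg: float) -> float:
    """Body-mass loss across a trial as a percentage of pre-trial mass."""
    return (pre_kg - post_kg) / pre_kg * 100

pre, post = 70.0, 68.523   # hypothetical pre/post-trial masses (kg)
print(f"dehydration = {body_mass_loss_pct(pre, post):.2f}% of body mass")
```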
Procedia PDF Downloads 66
7508 The Use of Performance Indicators for Evaluating Models of Drying Jackfruit (Artocarpus heterophyllus L.): Page, Midilli, and Lewis
Authors: D. S. C. Soares, D. G. Costa, J. T. S., A. K. S. Abud, T. P. Nunes, A. M. Oliveira Júnior
Abstract:
Mathematical models of drying are used for the purpose of understanding the drying process, in order to determine important parameters for the design and operation of the dryer. The jackfruit is a fruit with high consumption in the Northeast and high perishability, so techniques must be applied to preserve it for longer in order to distribute it to regions with low consumption. This study aimed to analyse several mathematical models (Page, Lewis, and Midilli) and indicate the one that best fits the conditions of the convective drying process, using the performance indicators associated with each model: the accuracy factor (Af), the bias factor (Bf), the root mean square error (RMSE) and the standard error of prediction (%SEP). Jackfruit drying was carried out in a convective tray dryer at a temperature of 50°C for 9 hours. The Midilli model was the most accurate, with Af: 1.39, Bf: 1.33, RMSE: 0.01%, and SEP: 5.34%. However, the Midilli model is not appropriate for process control purposes because it needs four tuning parameters. With the performance indicators used in this paper, the Page model showed similar results with only two parameters. It is concluded that the best correlation between the experimental and estimated data is given by the Page model.
Keywords: drying, models, jackfruit, biotechnology
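The four indicators can be computed as follows, assuming the common definitions of the accuracy and bias factors and of %SEP (the paper may use slightly different variants):

```python
import math

def indicators(observed, predicted):
    """Af, Bf, RMSE and %SEP for paired observed/predicted values.

    Af = 10^(mean |log10(pred/obs)|)   (accuracy factor)
    Bf = 10^(mean log10(pred/obs))     (bias factor)
    %SEP = 100 * RMSE / mean(observed)
    These are common formulations; the paper's exact variants are assumed.
    """
    n = len(observed)
    logs = [math.log10(p / o) for o, p in zip(observed, predicted)]
    af = 10 ** (sum(abs(l) for l in logs) / n)
    bf = 10 ** (sum(logs) / n)
    rmse = math.sqrt(sum((p - o) ** 2 for o, p in zip(observed, predicted)) / n)
    sep = 100 * rmse / (sum(observed) / n)
    return af, bf, rmse, sep

# Hypothetical moisture-ratio data: a perfect model gives Af = Bf = 1, RMSE = 0.
obs = [1.0, 0.8, 0.55, 0.30, 0.12]
af, bf, rmse, sep = indicators(obs, obs)
print(af, bf, rmse, sep)
```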
Procedia PDF Downloads 379
7507 Investigations of Flow Field with Different Turbulence Models on NREL Phase VI Blade
Authors: T. Y. Liu, C. H. Lin, Y. M. Ferng
Abstract:
Wind energy is one of the clean renewable energy sources. However, the low-frequency (20-200 Hz) noise generated by wind turbine blades, which bothers nearby residents, is a major problem to be addressed. Analysis of the flow field and pressure distribution on the blades is useful for predicting this aerodynamic noise. Therefore, the main objective of this study is to use different turbulence models to analyse the flow field and pressure distributions on the turbine blades. Three-dimensional Computational Fluid Dynamics (CFD) simulation of the flow field was used to calculate the flow phenomena for the National Renewable Energy Laboratory (NREL) Phase VI horizontal axis wind turbine rotor. Two flow cases with different wind speeds were investigated: 7 m/s at 72 rpm and 15 m/s at 72 rpm. Four RANS-based turbulence models, standard k-ε, realizable k-ε, SST k-ω, and v2f, were used to predict and analyse the results in the present work. The results show that the predictions of the pressure distributions with the SST k-ω and v2f turbulence models are in good agreement with the experimental data.
Keywords: horizontal axis wind turbine, turbulence model, noise, fluid dynamics
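All four RANS models listed close the Reynolds stresses through a turbulent eddy viscosity; in the standard k-ε model, for example, it takes the familiar form (standard model constant assumed):

```latex
\mu_t = \rho\, C_\mu \frac{k^2}{\varepsilon}, \qquad C_\mu = 0.09,
```

where k is the turbulent kinetic energy and ε its dissipation rate; the models differ mainly in the transport equations and near-wall treatment used to obtain these quantities.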
Procedia PDF Downloads 265
7506 EDTA Assisted Phytoremediation of Cadmium by Enhancing Growth and Antioxidant Defense System in Brassica napus L.
Authors: Mujahid Farid, Shafaqat Ali, Muhammad Bilal Shakoor
Abstract:
Heavy metal pollution of soil is a prevalent global problem, and oilseed rape (Brassica napus L.) is considered useful for the restoration of metal-contaminated soils. Phytoextraction is an in-situ, environment-friendly technique for the clean-up of contaminated soils. The response to cadmium (Cd) toxicity in combination with a chelator, ethylenediaminetetraacetic acid (EDTA), was studied in oilseed rape grown hydroponically under greenhouse conditions at three levels of Cd (0, 10, and 50 µM) and two levels of EDTA (0 and 2.5 mM). Cd decreased plant growth, biomass and chlorophyll concentrations, while the application of EDTA enhanced plant growth by reducing the Cd-induced effects in Cd-stressed plants. A significant decrease in photosynthetic parameters was found with Cd alone. The addition of EDTA improved the net photosynthetic and gas exchange capacity of plants under Cd stress. Cd at 10 and 50 μM significantly increased electrolyte leakage and the production of hydrogen peroxide (H2O2) and malondialdehyde (MDA), and a significant reduction was observed in the activities of catalase (CAT), guaiacol peroxidase (POD), ascorbate peroxidase (APX), and superoxide dismutase (SOD) in Cd-stressed plants. Application of EDTA at 2.5 mM, alone and in combination with Cd, increased the antioxidant enzyme activities and reduced the electrolyte leakage and the production of H2O2 and MDA. Oilseed rape actively accumulated Cd in roots, stems and leaves, and the addition of EDTA boosted the uptake and accumulation of Cd by dissociating Cd in the culture medium. The present results suggest that under 8 weeks of Cd-induced stress, the application of EDTA significantly improves plant growth, chlorophyll content, photosynthetic and gas exchange capacity and enzyme activities, and increases metal uptake in the roots, stems and leaves of oilseed rape (Brassica napus L.).
Keywords: antioxidant enzymes, cadmium, chelator, EDTA, growth, oilseed rape
Procedia PDF Downloads 392
7505 Effect of Silver Nanoparticles on Seed Germination of Crop Plants
Authors: Zainab M. Almutairi, Amjad Alharbi
Abstract:
The use of engineered nanomaterials has increased as a result of their positive impact on many sectors of the economy, including agriculture. Silver nanoparticles (AgNPs) are now used to enhance seed germination, plant growth, and photosynthetic quantum efficiency and as antimicrobial agents to control plant diseases. In this study, we examined the effect of AgNP dosage on the seed germination of three plant species: corn (Zea mays L.), watermelon (Citrullus lanatus [Thunb.] Matsum. & Nakai) and zucchini (Cucurbita pepo L.). This experiment was designed to study the effect of AgNPs on germination percentage, germination rate, mean germination time, root length and fresh and dry weight of seedlings for the three species. Seven concentrations (0.05, 0.1, 0.5, 1, 1.5, 2, and 2.5 mg/ml) of AgNPs were examined at the seed germination stage. The three species had different dose responses to AgNPs in terms of germination parameters and the measured growth characteristics. The germination rates of the three plants were enhanced in response to AgNPs. Significant enhancement of the germination percentage values was observed after treatment of the watermelon and zucchini plants with AgNPs in comparison with untreated seeds. AgNPs showed a toxic effect on corn root elongation, whereas watermelon and zucchini seedling growth were positively affected by certain concentrations of AgNPs. This study showed that exposure to AgNPs caused both positive and negative effects on plant growth and germination.
Keywords: citrullus lanatus, cucurbita pepo, seed germination, seedling growth, silver nanoparticles, zea mays
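The germination parameters listed can be computed from daily germination counts; the sketch below assumes the common textbook definitions (the paper's exact formulas are not stated) and uses hypothetical counts.

```python
# Germination parameters in their common textbook forms (assumed here,
# since the abstract does not give the formulas):
#   GP  = germinated / sown x 100
#   MGT = sum(n_i * t_i) / sum(n_i), with n_i seeds germinating on day t_i

def germination_percentage(germinated: int, sown: int) -> float:
    """Final germination as a percentage of seeds sown."""
    return germinated / sown * 100

def mean_germination_time(daily_counts: dict) -> float:
    """Count-weighted mean day of germination (days)."""
    total = sum(daily_counts.values())
    return sum(day * n for day, n in daily_counts.items()) / total

counts = {1: 2, 2: 4, 3: 4}    # hypothetical daily germination counts
gp = germination_percentage(sum(counts.values()), 10)
mgt = mean_germination_time(counts)
print(f"GP = {gp:.0f}%, MGT = {mgt:.1f} days")
```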
Procedia PDF Downloads 308
7504 Climate Change Effects on Agriculture
Authors: Abdellatif Chebboub
Abstract:
Agricultural production is sensitive to weather and thus directly affected by climate change. Plausible estimates of these climate change impacts require combined use of climate, crop, and economic models. Results from previous studies vary substantially due to differences in models, scenarios, and data. This paper is part of a collective effort to systematically integrate these three types of models. We focus on the economic component of the assessment, investigating how nine global economic models of agriculture represent endogenous responses to seven standardized climate change scenarios produced by two climate and five crop models. These responses include adjustments in yields, area, consumption, and international trade. We apply biophysical shocks derived from the Intergovernmental Panel on Climate Change's representative concentration pathway with end-of-century radiative forcing of 8.5 W/m². The mean biophysical yield effect with no incremental CO2 fertilization is a 17% reduction globally by 2050 relative to a scenario with unchanging climate. Endogenous economic responses reduce yield loss to 11%, increase area of major crops by 11%, and reduce consumption by 3%. Agricultural production, cropland area, trade, and prices show the greatest degree of variability in response to climate change, and consumption the lowest. The sources of these differences include model structure and specification; in particular, model assumptions about ease of land use conversion, intensification, and trade. This study identifies where models disagree on the relative responses to climate shocks and highlights research activities needed to improve the representation of agricultural adaptation responses to climate change.
Keywords: climate change, agriculture, weather change, danger of climate change
Procedia PDF Downloads 316