Search results for: components of the NHIS
1700 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios using Monte Carlo Methods
Authors: Vinayak Bassi, Rajpreet Singh
Abstract:
Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational prowess. Asset valuation is one of the key components of quantitative finance and has become one of the first steps in determining the risk related to a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation outputs generated by two stochastic dynamic models, namely the Black-Scholes model and Dupire's bi-dimensional model. Both models are formulated to compute the valuation function for a portfolio of European options using Monte Carlo simulation methods. Although Monte Carlo algorithms have a slower convergence rate than calculus-based simulation techniques (such as FDM), they work quite effectively for high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. In order to enhance the performance efficiency of the model, the study emphasizes the use of variance reduction methods and customized random number generators to implement parallelization. An attempt has also been made to implement Dupire's model on a GPU to achieve higher computational performance. Furthermore, ideas are discussed around performance enhancement and bottleneck identification in the implementation of options-pricing models on GPUs.
Keywords: monte carlo, stochastic models, computational finance, parallel programming, scientific computing
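As an illustration of the baseline approach compared in this abstract, the following is a minimal sketch (not the authors' code) of plain Monte Carlo valuation of a single European call under Black-Scholes GBM dynamics, with antithetic variates as a simple variance reduction step; all contract parameters are hypothetical.

```python
import numpy as np

def mc_european_call(s0, k, r, sigma, t, n_paths=100_000, seed=0):
    """Price a European call under Black-Scholes GBM by plain Monte Carlo,
    using antithetic variates as a basic variance-reduction technique."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths // 2)
    z = np.concatenate([z, -z])                       # antithetic pairs
    st = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * np.sqrt(t) * z)
    payoff = np.maximum(st - k, 0.0)                  # call payoff at maturity
    disc = np.exp(-r * t) * payoff                    # discount to today
    return disc.mean(), disc.std(ddof=1) / np.sqrt(len(disc))

# Hypothetical contract: spot 100, strike 105, 5% rate, 20% vol, 1-year maturity.
price, stderr = mc_european_call(100.0, 105.0, 0.05, 0.20, 1.0)
print(f"MC price = {price:.3f} (std. error {stderr:.3f})")
```

Summing such estimators over the options in a portfolio, and replacing the single GBM step with a local-volatility surface, gives the Dupire-style variant the abstract refers to.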
Procedia PDF Downloads 162
1699 The Relationship Between Hourly Compensation and Unemployment Rate Using the Panel Data Regression Analysis
Authors: S. K. Ashiquer Rahman
Abstract:
The paper concentrates on the importance of hourly compensation while emphasizing the significance of the unemployment rate. The unemployment rate and hourly compensation are two of the most important indicators for a nation: they are not merely statistics but have profound effects on individuals, families, and the economy, and they are inversely related to one another. The unemployment rate will probably decline as hourly compensation in manufacturing rises, since higher compensation can reduce unemployment and increase job prospects; increased hourly compensation in the manufacturing sector could therefore have a favorable effect on job mobility. Moreover, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, we use panel data regression models to evaluate the expected link between hourly compensation and the unemployment rate in order to determine the effect of hourly compensation on the unemployment rate. We estimate the fixed effects model, evaluate the error components, and determine which model (the FEM or ECM) is better by pooling all 60 observations. We then analyse the data by comparing three countries (the United States, Canada and the United Kingdom) using panel data regression models. Finally, we provide results, analysis and a summary of the research on how hourly compensation affects the unemployment rate. Additionally, this paper offers relevant and useful information to help the government and academic community use an econometric and social approach to lessen the effect of hourly compensation on the unemployment rate and alleviate the problem.
Keywords: hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, the linear regression model
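To make the fixed-effects step concrete, below is a minimal sketch of the least-squares dummy-variable form of the fixed effects model in Python with statsmodels; the balanced country-year panel is simulated and all variable names are hypothetical, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
countries = ["US", "CA", "UK"]
years = range(1996, 2016)

# Hypothetical balanced panel (3 countries x 20 years = 60 observations),
# mirroring the 60 pooled observations mentioned above; values are simulated.
rows = []
for c in countries:
    for y in years:
        comp = 20 + 0.8 * (y - 1996) + rng.normal(0, 1)     # hourly compensation
        unemp = 8 - 0.15 * comp + rng.normal(0, 0.5)        # inverse relationship
        rows.append({"country": c, "year": y,
                     "compensation": comp, "unemployment": unemp})
df = pd.DataFrame(rows)

# Fixed-effects model via country dummies: the dummies absorb time-invariant
# country heterogeneity, so the coefficient on `compensation` is the
# within-country effect on the unemployment rate.
fe = smf.ols("unemployment ~ compensation + C(country)", data=df).fit()
print(fe.params)
```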
Procedia PDF Downloads 81
1698 A Multi-Objective Gate Assignment Model Based on Airport Terminal Configuration
Authors: Seyedmirsajad Mokhtarimousavi, Danial Talebi, Hamidreza Asgari
Abstract:
Assigning aircraft activities to appropriate gates is one of the most challenging issues in airport authorities' multiple-criteria decision making. The potential financial loss due to imbalances of demand and supply in congested airports, higher gate occupation rates, and the existing restrictions on expanding facilities provide further evidence of the need for optimal supply allocation. Passenger walking distance, towing movements, extra fuel consumption (as a result of waiting longer to taxi when taxi conflicts happen in the apron area), etc. are the major traditional components involved in GAP models. In particular, the total cost associated with the gate assignment problem depends strongly on the airport terminal layout. The study herein presents a well-elaborated literature review on the topic focusing on major concerns, applicable variables and objectives, and proposes a three-objective mathematical model for the gate assignment problem. The model has been tested under different concourse layouts in order to check its performance in different scenarios. Results revealed that the terminal layout pattern is a significant parameter in airport gate assignment and that the proposed model is capable of dealing with key constraints and objectives, which supports its practical usability for future decision-making tools. Potential solution techniques were also suggested in this study for future work.
Keywords: airport management, terminal layout, gate assignment problem, mathematical modeling
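For orientation, a generic three-objective GAP formulation of the kind described above can be sketched as follows; this is a textbook-style illustration, not the authors' exact model, and all notation is assumed: x_{ik} = 1 if flight i is assigned to gate k, p_i the passengers on flight i, w_k the walking distance of gate k, t_i a towing/remote-stand penalty, and c_{ik} a taxi-conflict fuel cost.

```latex
\begin{align*}
\min\ & Z_1 = \sum_{i \in F}\sum_{k \in G} p_i\, w_k\, x_{ik} && \text{(total passenger walking distance)}\\
\min\ & Z_2 = \sum_{i \in F} t_i \Big(1 - \sum_{k \in G} x_{ik}\Big) && \text{(towing / remote-stand penalty)}\\
\min\ & Z_3 = \sum_{i \in F}\sum_{k \in G} c_{ik}\, x_{ik} && \text{(extra taxi fuel due to apron conflicts)}\\
\text{s.t.}\ & \sum_{k \in G} x_{ik} \le 1 \quad \forall i \in F,\\
& x_{ik} + x_{jk} \le 1 \quad \forall k \in G,\ \forall\, i,j \text{ with overlapping gate occupancy},\\
& x_{ik} \in \{0,1\}.
\end{align*}
```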
Procedia PDF Downloads 230
1697 Borate Crosslinked Fracturing Fluids: Laboratory Determination of Rheology
Authors: Lalnuntluanga Hmar, Hardik Vyas
Abstract:
Hydraulic fracturing has become an essential procedure to break apart rock and release the oil or gas trapped tightly within it by pumping fracturing fluids at high pressure down into the well. To open the fracture and to transport the propping agent along the fracture, proper selection of fracturing fluids is one of the most crucial components of fracturing operations, and the rheological properties of the fluids are usually considered the most important. Among the various fracturing fluids, borate crosslinked fluids have proved to be highly effective. Borate, in the form of boric acid or borate ion, is most commonly used to crosslink the hydrated polymers and to produce very viscous gels that remain stable at high temperature. Guar and HPG (hydroxypropyl guar) polymers are the ones most often used in these fluids. Borate gel rheology is known to be a function of polymer concentration, borate ion concentration, pH, and temperature. Crosslinking with borate is a function of pH, which means it can be formed or reversed simply by altering the pH of the fluid system. The fluid system was prepared by mixing the base polymer with water at pH values ranging between 8 and 11, and the optimum borate crosslinker efficiency was found at a pH of about 10. The rheology of the laboratory-prepared borate crosslinked fracturing fluid was determined using an Anton Paar rheometer and a Fann viscometer. The viscosity was measured at high temperatures ranging from 200ᵒF to 250ᵒF and at elevated pressures in order to partially simulate the downhole condition. Rheological measurements showed that crosslinking increases the viscosity and elasticity, and thus the fluid's capability to transport the propping agent.
Keywords: borate, crosslinker, Guar, Hydroxypropyl Guar (HPG), rheology
Procedia PDF Downloads 202
1696 High Pressure Delignification Process for Nanocrystalline Cellulose Production from Agro-Waste Biomass
Authors: Sakinul Islam, Nhol Kao, Sati Bhattacharya, Rahul Gupta
Abstract:
Nanocrystalline cellulose (NCC) has been widely used for miscellaneous applications due to its superior properties over other nanomaterials. However, the major problems associated with the production of NCC are long reaction times, low production rates and inefficient processes; mass production of NCC within a short period of time is still a great challenge. The main objective of this study is to produce NCC from rice husk agro-waste biomass by a high pressure delignification process (HPDP), followed by bleaching and hydrolysis. The HPDP has not previously been explored for NCC production from rice husk biomass (RHB). In order to produce NCC, powdered rice husk (PRH) was placed in a stainless steel reactor at 80 ˚C under 5 bar. An aqueous solution of NaOH (4M) was used for the dissolution of lignin and other amorphous impurities from the PRH. After set treatment times (1h, 3.5h and 6h), bleaching and hydrolysis were carried out on the delignified samples. NaOCl (20%) and H2SO4 (4M) solutions were used for the bleaching and hydrolysis processes, respectively. The NCC suspension from hydrolysis was sonicated and neutralized with buffer solution for the various characterisations. Finally, the NCC suspension was dried and analyzed by FTIR, XRD, SEM, AFM and TEM. The chemical composition of the NCC and PRH was estimated by TAPPI (Technical Association of the Pulp and Paper Industry) standard methods to assess product purity. It was found that 6h of HPDP was more efficient in producing good quality NCC than 1h and 3.5h, which gave lower separation of non-cellulosic components from the RHB. The analyses indicated an NCC crystallinity of 71%, with particles of 20-50 nm in diameter and 100-200 nm in length.
Keywords: nanocrystalline cellulose, NCC, high pressure delignification, bleaching, hydrolysis, agro-waste biomass
Procedia PDF Downloads 264
1695 Development of a Symbiotic Milk Chocolate Using Inulin and Bifidobacterium Lactis
Authors: Guity Karim, Valiollah Ayareh
Abstract:
Probiotic dairy products are those that contain biologically active components that may beneficially affect one or more target functions in the body, beyond their basic nutritional effects. Since chocolate milk is a popular dairy product in the country, especially among children and youth, production of a synbiotic (probiotic + prebiotic) product based on chocolate milk, Bifidobacterium lactis (DSM, Netherlands) and inulin (Bene, Belgium) would help to promote the nutritional and functional properties of this product. Bifidobacterium lactis is used as a probiotic in a variety of foods, particularly dairy products like yogurt, and as a probiotic bacterium it has beneficial effects on human health. Inulin, as a prebiotic agent, is considered a functional food ingredient; experimental studies have shown its use as a bifidogenic agent. Chocolate milk with different fat contents (1 and 2 percent), 6% sugar and 0.9% cocoa was made, sterilized (UHT) and supplemented with Bifidobacterium lactis and inulin (0.5%) after cooling. A sample was made without inulin as a control. The Bifidobacterium lactis population was enumerated at days 0, 4, 8 and 12, together with measurement of pH, acidity and viscosity of the samples. The sensory properties of the product were also evaluated by a panel of 15 testers. The number of live bacterial cells was maintained at the functional level of 10⁶-10⁸ cfu/ml after storage for 12 days at refrigerated temperature (4°C). Coliforms were found to be absent in the products during storage. Chocolate milk containing 1% fat and inulin had the best effect on the survival and number of B. lactis at day 8 and thereafter. Moreover, the addition of inulin did not affect the sensory quality of the product. In this work, chocolate has been evaluated as a potential protective carrier for oral delivery of B. lactis and inulin.
Keywords: chocolate milk, synbiotic, bifidobacterium lactis, inulin
Procedia PDF Downloads 360
1694 Feasibility Study of Friction Stir Welding Application for Kevlar Material
Authors: Ahmet Taşan, Süha Tirkeş, Yavuz Öztürk, Zafer Bingül
Abstract:
Friction stir welding (FSW) is a solid-state joining process that eliminates problems associated with material melting and solidification, such as cracks, residual stresses and distortions generated during conventional welding. Among the most important advantages of FSW are easy automation, less distortion, lower residual stress and good mechanical properties in the joining region. FSW is a relatively recent approach to metal joining and, although originally intended for aluminum alloys, it has been investigated for a variety of metallic materials. The basic concept of FSW is a rotating tool, made of non-consumable material, specially designed with a geometry consisting of a pin and a recess (shoulder). This tool is inserted while spinning on its axis at the adjoining edges of the two sheets or plates to be joined and then travels along the joining path. The tool rotation axis defines an angle of inclination with the components to be welded. This angle is used for receiving the material to be processed at the tool base and to promote the gradual forging effect imposed by the shoulder during the passage of the tool, which prevents plastic flow of material at the sides of the tool and ensures weld closure behind the pin. In this study, two 4 mm Kevlar® plates, produced from Kevlar® fabrics, are analyzed with COMSOL Multiphysics in order to investigate their weldability via FSW. Thereafter, experimental investigations are carried out on an appropriate workbench in order to compare the experiments with the analysis results.
Keywords: analytical modeling, composite materials welding, friction stir welding, heat generation
Procedia PDF Downloads 159
1693 Estimation of Genetic Diversity in Sorghum Accessions Using Agro-Morphological and Nutritional Traits
Authors: Maletsema Alina Mofokeng, Nemera Shargie
Abstract:
Sorghum is one of the most important cereal crops, grown as a source of calories for many people in the tropics and sub-tropics of the world. Proper characterisation and evaluation of crop germplasm is an important component of the effective management of genetic resources and their utilisation in the improvement of the crop through plant breeding. The objective of the study was to estimate the genetic diversity present in sorghum accessions grown in South Africa using agro-morphological traits and some nutritional contents. The experiment was carried out in Potchefstroom. Data were subjected to correlation, principal component analysis, and hierarchical clustering using GenStat statistical software. There were highly significant differences among the accessions based on agro-morphological and nutritional quality traits. Grain yield was highly positively correlated with panicle weight. Plant height was highly significantly correlated with internode length, leaf length, leaf number, stem diameter, number of nodes and starch content. The principal component analysis revealed three most important PCs, accounting for a total variation of 78.6%. Protein content ranged from 7.7 to 14.7%, and starch from 58.52 to 80.44%. The accessions with high protein and starch content were AS16cyc and MP4277. There was vast genetic diversity among the accessions assessed, which can be used by plant breeders to improve yield and nutritional traits.
Keywords: accessions, genetic diversity, nutritional quality, sorghum
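For readers who want to reproduce this kind of multivariate summary, the following is a minimal sketch of a standardised principal component analysis of an accession-by-trait matrix; the data and trait names are simulated placeholders, not the study's measurements.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical trait matrix: rows = sorghum accessions, columns = traits.
rng = np.random.default_rng(1)
traits = pd.DataFrame(
    rng.normal(size=(40, 8)),
    columns=["plant_height", "panicle_wt", "grain_yield", "leaf_len",
             "stem_diam", "node_count", "protein", "starch"],
)

# Standardise (traits have different units), then extract the leading PCs.
X = StandardScaler().fit_transform(traits)
pca = PCA(n_components=3).fit(X)
scores = pca.transform(X)                        # accession scores on PC1-PC3

# Share of variance captured by the first three PCs (cf. the ~78.6% reported).
print(pca.explained_variance_ratio_, pca.explained_variance_ratio_.sum())
```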
Procedia PDF Downloads 263
1692 Critical Design Futures: A Foresight 3.0 Approach to Business Transformation and Innovation
Authors: Nadya Patel, Jawn Lim
Abstract:
Foresight 3.0 is a synergistic methodology that encompasses systems analysis, future studies, capacity building, and forward planning. These components are interconnected, fostering a collective anticipatory intelligence that promotes societal resilience (Ravetz, 2020). However, traditional applications of these strands can often fall short, leading to missed opportunities and narrow perspectives. Therefore, Foresight 3.0 champions a holistic approach to tackling complex issues, focusing on systemic transformations and power dynamics. Businesses are pivotal in preparing the workforce for an increasingly uncertain and complex world. This necessitates the adoption of innovative tools and methodologies, such as Foresight 3.0, that can better equip young employees to anticipate and navigate future challenges. Firstly, the incorporation of its methodology into workplace training can foster a holistic perspective among employees. This approach encourages employees to think beyond the present and consider wider social, economic, and environmental contexts, thereby enhancing their problem-solving skills and resilience. This paper discusses our research on integrating Foresight 3.0's transformative principles with a newly developed Critical Design Futures (CDF) framework to equip organisations with the ability to innovate for the world's most complex social problems. This approach is grounded in 'collective forward intelligence,' enabling mutual learning, co-innovation, and co-production among a diverse stakeholder community, where business transformation and innovation are achieved.
Keywords: business transformation, innovation, foresight, critical design
Procedia PDF Downloads 81
1691 Compact LWIR Borescope Sensor for Surface Temperature of Engine Components
Authors: Andy Zhang, Awnik Roy, Trevor B. Chen, Bibik Oleksandr, Subodh Adhikari, Paul S. Hsu
Abstract:
The durability of a combustor in gas-turbine engines requires good control of its component temperatures. Since the temperature of the combustion gases frequently exceeds the melting point of the combustion liner walls, an efficient air-cooling system is critically important to prolong the lifetime of the liner walls. To determine the effectiveness of the air-cooling system, accurate 2D surface temperature measurement of combustor liner walls is crucial for advanced engine development. Traditional diagnostic techniques for temperature measurement, such as thermocouples, thermal wall paints, pyrometry, and phosphors, have shown disadvantages, including being intrusive and affecting local flame/flow dynamics, potential flame quenching, physical damage to instrumentation due to the harsh environment inside the combustor, and strong optical interference from combustion emission in the UV to mid-IR wavelengths. To overcome these drawbacks, a compact, small borescope long-wave-infrared (LWIR) sensor is developed to achieve two-dimensional, high-spatial-resolution, high-fidelity thermal imaging of the 2D surface temperature in gas-turbine engines, providing the desired engine component temperature distribution. The compact LWIR borescope sensor makes it feasible to improve the durability of the combustor in gas-turbine engines.
Keywords: borescope, engine, long-wave-infrared, sensor
Procedia PDF Downloads 138
1690 The Effect of Mathematical Modeling of Damping on the Seismic Energy Demands
Authors: Selamawit Dires, Solomon Tesfamariam, Thomas Tannert
Abstract:
Modern earthquake engineering and design encompass a performance-based design philosophy. The main objective in performance-based design is to achieve a system performing precisely to meet the design objectives, so as to reduce unintended seismic risks and associated losses. Energy-based earthquake-resistant design is one of the design methodologies that can be implemented in performance-based earthquake engineering. In energy-based design, the seismic demand is usually described as the ratio of the hysteretic to input energy. Once the hysteretic energy is known as a percentage of the input energy, it is distributed among the energy-dissipating components of a structure. The hysteretic-to-input energy ratio is highly dependent on the inherent damping of a structural system. In numerical analysis, damping can be modeled as stiffness-proportional, mass-proportional, or a linear combination of stiffness and mass. In this study, the effect of mathematical modeling of damping on the estimation of seismic energy demands is investigated by considering elastic-perfectly-plastic single-degree-of-freedom systems representing short- to long-period structures. Furthermore, the seismicity of Vancouver, Canada, is used in the nonlinear time history analysis. According to the preliminary results, the input energy demand is not sensitive to the type of damping model deployed; hence, consistent results are achieved regardless of the damping models utilized in the numerical analyses. On the other hand, the hysteretic-to-input energy ratios vary significantly for the different damping models.
Keywords: damping, energy-based seismic design, hysteretic energy, input energy
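For reference, the damping models compared above are special cases of classical Rayleigh damping, and the energy quantities come from the standard seismic energy balance; a compact statement in the usual notation (assumed here, not taken from the paper) is:

```latex
\mathbf{C} = a_0\,\mathbf{M} + a_1\,\mathbf{K},
\qquad
a_0 = \xi\,\frac{2\,\omega_i\,\omega_j}{\omega_i + \omega_j},
\qquad
a_1 = \xi\,\frac{2}{\omega_i + \omega_j},
\qquad
E_I = E_K + E_D + E_S + E_H .
```

Setting a_1 = 0 gives mass-proportional damping and a_0 = 0 gives stiffness-proportional damping, with the coefficients calibrated to a damping ratio ξ at two target frequencies ω_i and ω_j; E_I, E_K, E_D, E_S and E_H denote input, kinetic, viscous damping, recoverable strain and hysteretic energy, and the demand measure discussed above is the ratio E_H/E_I.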
Procedia PDF Downloads 168
1689 Steam Reforming of Acetic Acid over Microwave-Synthesized Ce0.75Zr0.25O2 Supported Ni Catalysts
Authors: Panumard Kaewmora, Thirasak Rirksomboon, Vissanu Meeyoo
Abstract:
Due to the globally growing demand for petroleum-based and other fossil fuels, the scarcity or even depletion of fossil fuel sources could become inevitable. As an alternative, the utilization of renewable sources such as biomass has become attractive. Biomass can be converted into bio-oil by fast pyrolysis. Acetic acid, one of the main components of the aqueous phase of bio-oil, can be converted to hydrogen with high selectivity over effective catalysts in the steam reforming process. Steam reforming of acetic acid as a model compound has been intensively investigated for hydrogen production using various metal-oxide-supported nickel catalysts, yet these catalysts tend to deactivate rapidly depending on the support utilized. A support such as the Ce1-xZrxO2 mixed oxide was proposed to alleviate this problem, with the anticipation of enhancing hydrogen yield. However, catalyst preparation methods play a significant role in the catalytic activity and performance of the catalysts. In this work, a Ce0.75Zr0.25O2 mixed oxide solid solution support was prepared by urea hydrolysis using microwave heating. Nickel metal was then incorporated at 15 wt% by the incipient wetness impregnation method. The catalysts were characterized by several techniques, including BET, XRD, H2-TPR, XRF, SEM, and TEM, and tested for the steam reforming of acetic acid at various operating conditions. Preliminary results showed that a hydrogen yield of ca. 32% with a relatively high acetic acid conversion was attained at 650°C.
Keywords: acetic acid, steam reforming, microwave, nickel, ceria, zirconia
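For context, the overall stoichiometry of acetic acid steam reforming, together with the water-gas shift reaction that consumes intermediate CO, is standard chemistry rather than a result of this paper:

```latex
\mathrm{CH_3COOH + 2\,H_2O \;\rightarrow\; 2\,CO_2 + 4\,H_2}
\quad\text{(overall reforming)},
\qquad
\mathrm{CO + H_2O \;\rightarrow\; CO_2 + H_2}
\quad\text{(water-gas shift)}.
```

The stoichiometric maximum is therefore 4 mol of H2 per mol of acetic acid, a common basis against which hydrogen yields such as the one reported here are expressed.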
Procedia PDF Downloads 174
1688 Investigating Dynamic Transition Process of Issues Using Unstructured Text Analysis
Authors: Myungsu Lim, William Xiu Shun Wong, Yoonjin Hyun, Chen Liu, Seongi Choi, Dasom Kim, Namgyu Kim
Abstract:
The amount of real-time data generated through various mass media has been increasing rapidly. In this study, we performed topic analysis using unstructured text data distributed through news articles. As one of the most prevalent applications of topic analysis, the issue tracking technique investigates changes in the social issues identified through topic analysis. Traditionally, issue tracking is conducted by identifying the main topics of documents covering an entire period at once and analyzing the occurrence of each topic by period. However, this traditional approach has the limitation that it cannot discover the dynamic mutation process of complex social issues. The purpose of this study is to overcome this limitation of the existing issue tracking method. We first derive the core issues of each period, and then discover the dynamic mutation process of the various issues. We further analyze the mutation process from the perspective of issue categories, in order to figure out the pattern of issue flow, including the frequency and reliability of the pattern. In other words, this study allows us to understand the components of complex issues by tracking their dynamic history. This methodology can facilitate a clearer understanding of complex social phenomena by providing the mutation history and related category information of the phenomena.
Keywords: data mining, issue tracking, text mining, topic analysis, topic detection, trend detection
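One way to obtain per-period core issues, assuming latent Dirichlet allocation as the topic model (the abstract does not name a specific algorithm), is sketched below; the corpus and period labels are toy placeholders, not the study's news data.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

def topics_for_period(docs, n_topics=3, n_top_words=5):
    """Fit LDA on one period's articles and return the top words of each topic."""
    vec = CountVectorizer(stop_words="english")
    dtm = vec.fit_transform(docs)                          # document-term matrix
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0).fit(dtm)
    vocab = vec.get_feature_names_out()
    return [[vocab[i] for i in topic.argsort()[-n_top_words:][::-1]]
            for topic in lda.components_]

# Toy corpora standing in for news articles grouped into consecutive periods;
# comparing the per-period topic word lists sketches the issue-mutation history.
periods = {
    "period_1": ["central bank raises interest rates",
                 "rate hike worries housing market"],
    "period_2": ["housing market slows after rate hike",
                 "unemployment claims rise as firms cut hiring"],
}
issue_history = {p: topics_for_period(docs) for p, docs in periods.items()}
print(issue_history["period_1"])
```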
Procedia PDF Downloads 408
1687 The Relevance of the U-Shaped Learning Model to the Acquisition of the Difference between C'est and Il Est in the English Learners of French Context
Authors: Pooja Booluck
Abstract:
A U-shaped learning curve entails a three-step process: a good performance, followed by a bad performance, followed by a good performance again. U-shaped curves have been observed not only in language acquisition but also in various fields such as temperature, face recognition and object permanence, to name a few. Building on previous studies of the curve in child language acquisition and Second Language Acquisition, this empirical study investigates the relevance of the U-shaped learning model to the acquisition of the difference between c'est and il est in the English Learners of French (ELF) context, and assesses whether older learners of French in this context follow the same acquisition pattern. The empirical study was conducted on 15 English learners of French and lasted six weeks. Compositions and questionnaires were collected from each subject at three time intervals (after one week, after three weeks and after six weeks), after which the students' work was graded as either correct or incorrect. The data indicate that there is evidence of a U-shaped learning curve in the acquisition of c'est and il est, and that the students followed the same acquisition pattern as children with regard to rote-learned terms and subject clitics. This paper also discusses the need to introduce modules on the U-shaped learning curve into teaching curricula, as many teachers are unaware of the trajectory learners undertake while acquiring core components of grammar. In addition, this study addresses the need for more research on the acquisition of rote-learned terms and subject clitics in SLA.
Keywords: child language acquisition, rote-learning, subject clitics, u-shaped learning model
Procedia PDF Downloads 293
1686 Influence of Synergistic/Antagonistic Mixtures of Oligomeric Stabilizers on the Biodegradation of γ-Sterilized Polyolefins
Authors: Sameh A. S. Thabit Alariqi
Abstract:
Our previous studies investigated the biodegradation of γ-sterilized polyolefins in composting and microbial culture environments at different doses and γ-dose rates. It was concluded from those studies that pretreatment with γ-irradiation can significantly accelerate the biodegradation of the neat polymer matrix under biotic conditions. Similar work was carried out to study the stabilization of γ-sterilized polyolefins using different mixtures of stabilizers that are approved for food-contact applications. Ethylene-propylene (EP) copolymer was melt-mixed with hindered amine stabilizers (HAS), phenolic antioxidants and hydroperoxide decomposers. Results were discussed by comparing the stabilizing efficiency, combination and consumption of the stabilizers, and the synergistic and antagonistic effects were explained through the interactions between the stabilizers. In this attempt, we aimed to study the influence of synergistic and antagonistic mixtures of oligomeric stabilizers on the biodegradation of γ-irradiated polyolefins in compost and microbial culture. Neat and stabilized films of EP copolymer were irradiated under γ-radiation and incubated in compost and fungal culture environments. The changes in functional groups, surface morphology, mechanical properties and intrinsic viscosity of the polymer chains were characterized by FT-IR spectroscopy, SEM, Instron testing, and viscometric measurements, respectively. Results were discussed by comparing the effects of the different stabilizers and stabilizer mixtures on the biodegradation of the γ-irradiated polyolefins. It was found that biodegradation significantly depends on the components of the stabilization system and on the mobility, interaction, and consumption of the stabilizers.
Keywords: biodegradation, γ-irradiation, polyolefins, stabilization
Procedia PDF Downloads 388
1685 Arsenic Removal from Drinking Water by Hybrid Hydrogel-Biochar Matrix: An Understanding of Process Parameters
Authors: Vibha Sinha, Sumedha Chakma
Abstract:
Arsenic (As) contamination in drinking water is a serious concern worldwide, resulting in severe health maladies. To tackle this problem, hydrogel-based matrices that selectively take up toxic metals from contaminated water have increasingly been examined as a potentially practical method for metal removal. The major concern with hydrogels is the low stability of the matrix, resulting in poor performance. In this study, the potential of a hybrid hydrogel-biochar matrix synthesized from natural plant polymers, specific for As removal, was explored. Various compositional and functional-group changes of the elements contained in the matrix due to the adsorption of As were identified. Moreover, to resolve the stability issue of the hydrogel matrix, optimal and effective mixing of the hydrogel with biochar was studied by mixing varied proportions of the matrix components at the digestion stage. Preliminary results suggest that partial premixing methods may increase stability and reduce cost. The addition of nanoparticles and specific catalysts, with different concentrations of As(III) and As(V) under batch conditions, was tested to study their role in enhancing the performance of the hydrogel matrix. Further, the effects of process parameters and the optimal uptake conditions were studied, and a detailed mechanism was derived from the experimental studies. This study provides an efficient, specific and low-cost As removal method with excellent regeneration ability, allowing the matrix to be reused.
Keywords: arsenic, catalysts, hybrid hydrogel-biochar, water purification
Procedia PDF Downloads 191
1684 Use of Front-Face Fluorescence Spectroscopy and Multiway Analysis for the Prediction of Olive Oil Quality Features
Authors: Omar Dib, Rita Yaacoub, Luc Eveleigh, Nathalie Locquet, Hussein Dib, Ali Bassal, Christophe B. Y. Cordella
Abstract:
The potential of front-face fluorescence coupled with chemometric techniques, namely parallel factor analysis (PARAFAC) and multiple linear regression (MLR), as a rapid analysis tool to characterize Lebanese virgin olive oils was investigated. Fluorescence fingerprints were acquired directly on 102 Lebanese virgin olive oil samples in the range of 280-540 nm in excitation and 280-700 nm in emission. A PARAFAC model with seven components was considered optimal, with a residual of 99.64% and a core consistency value of 78.65. The model revealed seven main fluorescence profiles in olive oil, mainly associated with tocopherols, polyphenols, chlorophyllic compounds and oxidation/hydrolysis products. Twenty-three MLR models based on the PARAFAC scores were generated, the majority of which showed a good correlation coefficient (R > 0.7 for 12 predicted variables) and thus satisfactory prediction performance. Acid value, peroxide value, and Delta K had the models with the highest predictive ability, with R values of 0.89, 0.84 and 0.81, respectively. Among the fatty acids, linoleic and oleic acids were also highly predicted, with R values of 0.8 and 0.76, respectively. The factors contributing to the models' construction were related to common fluorophores found in olive oil, mainly chlorophyll, polyphenols, and oxidation products. This study demonstrates the interest of front-face fluorescence as a promising tool for quality control of Lebanese virgin olive oils.
Keywords: front-face fluorescence, Lebanese virgin olive oils, multiple linear regressions, PARAFAC analysis
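The regression step can be illustrated as follows, assuming the PARAFAC sample-mode scores have already been extracted from the excitation-emission landscapes (for example with a CP decomposition); all numbers below are simulated stand-ins, not the study's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical inputs: `scores` is the (n_samples x 7) PARAFAC score matrix and
# `y` holds one measured quality variable per sample (acid value, peroxide
# value, Delta K, oleic acid, ...). Both are simulated here for illustration.
rng = np.random.default_rng(0)
scores = rng.normal(size=(102, 7))                 # 102 oils, 7 PARAFAC components
y = scores @ rng.normal(size=7) + rng.normal(0, 0.3, size=102)   # simulated target

# One MLR model per quality variable; the study built 23 such models.
mlr = LinearRegression().fit(scores, y)
r = np.corrcoef(mlr.predict(scores), y)[0, 1]      # correlation coefficient R
print(f"R = {r:.2f}")
```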
Procedia PDF Downloads 453
1683 Grating Assisted Surface Plasmon Resonance Sensor for Monitoring of Hazardous Toxic Chemicals and Gases in Underground Mines
Authors: Sanjeev Kumar Raghuwanshi, Yadvendra Singh
Abstract:
The objective of this paper is to develop and optimize a fiber Bragg grating (FBG) based surface plasmon resonance (SPR) sensor for monitoring hazardous toxic chemicals and gases in underground mines or any industrial area. A fully cladded telecommunication-standard FBG is proposed to be developed to produce surface plasmon resonance. A thin gold/silver film of a few nm (thickness subject to optimization) is proposed to be applied over the FBG sensing head using the e-beam deposition method. Sensitivity enhancement of the sensor will be achieved by adding a composite nanostructured graphene oxide (GO) sensing layer using the spin coating method. Both sensor configurations are expected to demonstrate high responsiveness to changes in the resonance wavelength. The GO-enhanced sensor may show a many-fold increase in sensitivity compared to the gold-coated traditional fibre optic sensor. Our work is focused on optimizing the GO and multilayer structure and on developing fibre coating techniques that will serve well for sensitive and multifunctional detection of hazardous chemicals. This research proposal shows great potential towards the future development of optical fiber sensors, using readily available components such as Bragg gratings, as highly sensitive chemical sensors in areas such as environmental sensing.
Keywords: surface plasmon resonance, fibre Bragg grating, sensitivity, toxic gases, MATRIX method
Procedia PDF Downloads 268
1682 Markowitz and Implementation of a Multi-Objective Evolutionary Technique Applied to the Colombia Stock Exchange (2009-2015)
Authors: Feijoo E. Colomine Duran, Carlos E. Peñaloza Corredor
Abstract:
The modeling of financial investment component selection (portfolio selection) presents a variety of problems that can be addressed with optimization techniques under evolutionary schemes. By its nature, the problem involves a dichotomous relationship between two opposing elements: the portfolio's performance and the risk incurred in choosing it. Markowitz modeled this relationship as a mean (performance)-variance (risk) problem, i.e., one must maximize performance and minimize risk. This research includes the study and implementation of multi-objective evolutionary techniques to solve these problems, taking as its experimental framework the equities of the Colombia Stock Exchange between 2009 and 2015. Comparisons of three multiobjective evolutionary algorithms, namely the Non-dominated Sorting Genetic Algorithm II (NSGA-II), the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and Indicator-Based Selection in Multiobjective Search (IBEA), were performed using two well-known performance measures, the hypervolume indicator and the R_2 indicator, and a nonparametric statistical analysis using the Wilcoxon rank-sum test was also carried out. The comparative analysis also includes an evaluation of the financial efficiency of the investment portfolio chosen by each algorithm through the Sharpe ratio. It is shown that the portfolios provided by the implementation of the algorithms mentioned above are very well positioned among the different stock indices provided by the Colombia Stock Exchange.
Keywords: finance, optimization, portfolio, Markowitz, evolutionary algorithms
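To make the mean-variance objectives and the Sharpe-ratio check concrete, here is a minimal sketch in which a portfolio weight vector is scored exactly as described above; the return data are simulated, and random sampling of long-only weights stands in for the NSGA-II/SPEA2/IBEA search rather than reproducing it.

```python
import numpy as np

# Hypothetical daily return matrix: rows = trading days, columns = equities.
rng = np.random.default_rng(42)
returns = rng.normal(0.0004, 0.01, size=(1500, 8))

mu = returns.mean(axis=0)                    # expected returns
cov = np.cov(returns, rowvar=False)          # covariance (risk) matrix

def portfolio_objectives(w, mu, cov, rf=0.0):
    """Markowitz objectives for weights w: maximise return, minimise risk."""
    ret = float(w @ mu)
    risk = float(np.sqrt(w @ cov @ w))
    sharpe = (ret - rf) / risk
    return ret, risk, sharpe

# Crude stand-in for the evolutionary search: sample random long-only weight
# vectors and report the best Sharpe ratio found in the sample.
candidates = rng.dirichlet(np.ones(8), size=5000)
evaluated = [portfolio_objectives(w, mu, cov) for w in candidates]
best = max(evaluated, key=lambda t: t[2])
print("best Sharpe ratio in random sample:", round(best[2], 3))
```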
Procedia PDF Downloads 302
1681 How Validated Nursing Workload and Patient Acuity Data Can Promote Sustained Change and Improvements within District Health Boards: The New Zealand Experience
Authors: Rebecca Oakes
Abstract:
In the New Zealand public health system, work has been taking place to use electronic systems to convey data from the 'floor to the board' that makes patient needs, and therefore nursing work, visible. For nurses, these developments in health information technology put us in a new and exciting position of being able to articulate the work of nursing through a language understood at all levels of an organisation: the language of acuity. Nurses increasingly have a considerable stake in patient acuity data. Patient acuity systems, when used well, can assist greatly in demonstrating how much work is required, the type of work, and when it will be required. The New Zealand Safe Staffing Unit is supporting New Zealand nurses to create a culture of shared governance, where nursing data informs policies, staffing methodologies and forecasting within their organisations. Assisting organisations to understand their acuity data, strengthening user confidence in electronic patient acuity systems, and ensuring that nursing and midwifery workload is accurately reflected are critical to the success of the safe staffing programme. Via an acuity tool, nurses and midwives have the capacity to become key informants for organisational planning. Quality patient care, best use of health resources and a quality work environment are essential components of a safe, resilient and well-resourced organisation, and nurses are the key informants of this information. In New Zealand, a national-level approach is paving the way for significant changes to the understanding and use of patient acuity and nursing workload information.
Keywords: nursing workload, patient acuity, safe staffing, New Zealand
Procedia PDF Downloads 382
1680 The Effectiveness of Treating Anxiety with Reiki
Authors: Erika Humphreys
Abstract:
The effectiveness of treating anxiety with Reiki is explored within ten quantitative studies. The methodology utilized for a critical appraisal and systematic review of the literature is explained with inclusion and exclusion criteria. The theoretical framework for the project is grounded in the work of Hildegard Peplau, whose nursing theory based on the therapeutic use of self is foundational for Reiki implementation. A thorough critique of the literature is conducted for key components of robustness and believability. This critique is conducted using a structured guide addressing synthesized strengths and weaknesses of the body of literature. A synthesis of the literature explores the findings of the studies. This synthesis reports on Reiki's effectiveness in treating anxiety within a variety of patient settings and populations, its effect on subscales of anxiety, physiological manifestations of anxiety, and pain associated with anxiety. Cultural considerations affecting Reiki's potential effectiveness are discussed. Gaps in the literature are examined, including the studies' narrow sample population, lack of participant exclusionary factors for controlled outcome data, and the lack of studies across time. Implications for future research are discussed with recommendations for expanded research that includes a broader variety of settings, age groups, and patient diagnoses, including anxiety disorders, for research data that is transferable. Implications for further practice for the advanced practice registered nurse (APRN) are explored, with the potential benefits for both providers and patients, including improved patient satisfaction and expansion of provider treatment modalities.
Keywords: Reiki, anxiety, complementary alternative medicine, pandemic
Procedia PDF Downloads 164
1679 Mediating and Moderating Function of Corporate Governance on Firm Tax Planning and Firm Tax Disclosure Relationship
Authors: Mahfoudh Hussein Mgammal
Abstract:
The purpose of this paper is to investigate the moderating and mediating effects of corporate governance mechanism proxies on the relationship between tax planning, measured by effective tax rate components, and tax disclosure. The hypotheses were tested with a three-step hierarchical regression on Malaysian-listed non-financial firms from 2010 to 2012. We found that companies positively value tax-planning activities. This indicates that tax planning is seen as a source of companies' wealth creation, as the results show a highly significant association between tax disclosure and the extent of tax planning. Examination of the implications of corporate governance mechanisms for the tax disclosure-tax planning association showed the lack of a significant coefficient for any of the interactive variables, which makes it hard to interpret the nature of the association. Finally, we studied the sensitivity of the results; the outcomes were examined for the robustness and strength of the model specification utilizing OLS-effect estimators and the absence of tax-planning-related factors (GRTH, LEVE, and CAPNT). The findings of these tests show no effect on the tax planning-tax disclosure association. The outcomes of the annual regressions show that the panel regression results differ over time, because there is a time-difference impact on the associations and the different models are not completely proportionate as a whole. Moreover, our paper lends some support to recent theory on the importance of taxes to corporate governance by demonstrating how the agency costs of tax planning allow certain shareholders to benefit from firm activities at the expense of others.
Keywords: tax disclosure, tax planning, corporate governance, effective tax rate
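A minimal sketch of the three-step hierarchical (moderation) regression described above is given below, assuming a firm-level dataset with hypothetical variable names (TD for tax disclosure, ETR for the tax-planning proxy, CG for a governance index); the data are simulated, not the Malaysian sample.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated firm-level data standing in for the 2010-2012 sample.
rng = np.random.default_rng(7)
n = 300
df = pd.DataFrame({"ETR": rng.normal(size=n), "CG": rng.normal(size=n)})
df["TD"] = 0.4 * df["ETR"] + 0.2 * df["CG"] + rng.normal(0, 1, size=n)

# Three-step hierarchical regression for the moderation test.
step1 = smf.ols("TD ~ ETR", data=df).fit()           # main predictor only
step2 = smf.ols("TD ~ ETR + CG", data=df).fit()      # add the governance moderator
step3 = smf.ols("TD ~ ETR * CG", data=df).fit()      # add the interaction term
for i, m in enumerate([step1, step2, step3], start=1):
    print(f"step {i}: R2 = {m.rsquared:.3f}")

# A significant ETR:CG coefficient in step 3 would indicate moderation;
# in the study, the interactive terms were not significant.
```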
Procedia PDF Downloads 151
1678 Second Generation Biofuels: A Futuristic Green Deal for Lignocellulosic Waste
Authors: Nivedita Sharma
Abstract:
The global demand for fossil fuels is very high, but their use is not sustainable, since their reserves are declining. Additionally, fossil fuels are responsible for the accumulation of greenhouse gases. The emission of greenhouse gases from the transport sector can be reduced by substituting fossil fuels with biofuels; thus, renewable fuels capable of sequestering carbon dioxide are in high demand. Second-generation biofuels, which require lignocellulosic biomass as a substrate and ultimately yield ethanol, fall largely into this category. Bioethanol is a favorable and near carbon-neutral renewable biofuel, leading to reductions in tailpipe pollutant emissions and improving ambient air quality. Lignocellulose consists of three main components, cellulose, hemicellulose and lignin, which can be converted to ethanol with the help of microbial enzymes. Enzymatic hydrolysis of lignocellulosic biomass, the first step of the process, is considered one of the most efficient and least polluting methods for generating fermentable hexose and pentose sugars, which are subsequently fermented to power alcohol by yeasts in the second step. In the present work, a complete bioconversion process is presented in detail: microorganisms producing the key hydrolytic enzymes, cellulase and xylanase, were isolated from different niches, screened for enzyme production and identified using phenotyping and genotyping; the enzymes were then produced, purified and applied for saccharification of different lignocellulosic biomasses, followed by fermentation of the hydrolysate to ethanol with high yield.
Keywords: cellulase, xylanase, lignocellulose, bioethanol, microbial enzymes
Procedia PDF Downloads 98
1677 Catalytic Pyrolysis of Barley Straw for the Production of Fuels and Chemicals
Authors: Funda Ates
Abstract:
Primary energy sources such as petroleum, coal and natural gas are principally responsible for the world's energy consumption. However, these sources are being depleted rapidly worldwide, and they also have damaging environmental effects. Renewable energy sources are capable of providing a considerable fraction of world energy demand in this century. Biomass is one of the most abundant and most utilized sources of renewable energy in the world. It can be converted into commercial fuels suitable to substitute for fossil fuels, and a large number of biomass types can be converted through thermochemical processes into solid, liquid or gaseous fuels. Pyrolysis is the thermal decomposition of biomass in the absence of air or oxygen. In this study, barley straw has been investigated as an alternative feedstock to obtain fuels and chemicals via pyrolysis in a fixed-bed reactor. The influence of pyrolysis temperature in the range 450-750 °C, as well as the effect of a catalyst on the products, was investigated and the results were compared. The results indicated that a maximum oil yield of 20.4% was obtained at a moderate temperature of 550 °C; the oil yield decreased when the catalyst was used. The pyrolysis oils were examined by instrumental analysis and GC/MS. The analyses revealed that the pyrolysis oils were chemically very heterogeneous at all temperatures, and that the most abundant compounds composing the bio-oil were phenolics. The catalyst decreased the reaction temperature: most of the components obtained using a catalyst at moderate temperatures were close to those obtained at high temperatures without a catalyst. Moreover, the use of a catalyst also decreased the amount of oxygenated compounds produced.
Keywords: Barley straw, pyrolysis, catalyst, phenolics
Procedia PDF Downloads 226
1676 Bayesian Flexibility Modelling of the Conditional Autoregressive Prior in a Disease Mapping Model
Authors: Davies Obaromi, Qin Yongsong, James Ndege, Azeez Adeboye, Akinwumi Odeyemi
Abstract:
The basic model usually used in disease mapping is the Besag, York and Mollie (BYM) model, which combines spatially structured and spatially unstructured priors as random effects. The Bayesian conditional autoregressive (CAR) model is commonly used for smoothing the relative risk of a disease, as in the BYM model. This CAR model, usually assigned as a prior to one of the spatial random effects in the BYM model, successfully uses information from adjacent sites to improve estimates for individual sites. To our knowledge, there are some unrealistic or counter-intuitive consequences of the posterior covariance matrix of the CAR prior for the spatial random effects. In the conventional BYM model, the spatially structured and unstructured random components cannot be identified independently, which challenges the prior definitions for the hyperparameters of the two random effects. Therefore, the main objective of this study is to construct and utilize an extended Bayesian spatial CAR model for studying tuberculosis patterns in the Eastern Cape Province of South Africa, and then to compare its flexibility with some existing CAR models. The results of the study revealed the flexibility and robustness of this alternative extended CAR model relative to the commonly used CAR models, by comparison using the deviance information criterion. The extended Bayesian spatial CAR model is shown to be a useful and robust tool for disease modeling and as a prior for the structured spatial random effects, because of the inclusion of an extra hyperparameter.
Keywords: Besag2, CAR models, disease mapping, INLA, spatial models
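For reference, the conventional BYM model referred to above is usually written as follows (standard notation, not specific to this paper), with an intrinsic CAR prior on the structured component u and an exchangeable normal prior on the unstructured component v:

```latex
\begin{aligned}
O_i \mid \rho_i &\sim \mathrm{Poisson}(E_i\,\rho_i), \qquad \log \rho_i = \mu + u_i + v_i,\\
u_i \mid u_{-i} &\sim \mathcal{N}\!\left(\frac{1}{n_i}\sum_{j \sim i} u_j,\ \frac{\sigma_u^2}{n_i}\right)
\quad \text{(intrinsic CAR, spatially structured)},\\
v_i &\sim \mathcal{N}(0, \sigma_v^2) \quad \text{(spatially unstructured)}.
\end{aligned}
```

Here O_i and E_i are the observed and expected counts in area i, j ∼ i indexes the neighbours of area i and n_i is their number; the identifiability tension between σ_u² and σ_v² is the issue the extended CAR formulation addresses.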
Procedia PDF Downloads 281
1675 Proximate Composition, Colour and Sensory Properties of Akara egbe Prepared from Bambara Groundnut (Vigna subterranea)
Authors: Samson A. Oyeyinka, Taiwo Tijani, Adewumi T. Oyeyinka, Mutiat A. Balogun, Fausat L. Kolawole, John K. Joseph
Abstract:
Bambara groundnut is an underutilised leguminous crop that has a similar composition to cowpea. Hence, it could be used in making a traditional snack usually produced from cowpea paste. In this study, akara egbe, a traditional snack, was prepared from Bambara groundnut flour or paste, with cowpea included as the reference sample. The proximate composition and functional properties of the flours were studied, as well as the proximate composition and sensory properties of the resulting akara egbe. Protein and carbohydrate were the main components of the Bambara groundnut and cowpea grains; ash, fat and fibre contents were low. Bambara groundnut flour had a higher protein content (23.71%) than cowpea (19.47%). In terms of functional properties, the oil absorption capacity (0.75 g oil/g flour) of Bambara groundnut flour was significantly (p ≤ 0.05) lower than that of cowpea (0.92 g oil/g flour), whereas cowpea flour absorbed more water (1.59 g water/g flour) than Bambara groundnut flour (1.12 g/g). The packed bulk density (0.92 g/mL) of Bambara groundnut was significantly (p ≤ 0.05) higher than that of cowpea flour (0.82 g/mL). Akara egbe prepared from Bambara groundnut flour showed a significantly (p ≤ 0.05) higher protein content (23.41%) than the sample made from Bambara groundnut paste (19.35%). Akara egbe prepared from cowpea paste had higher ratings in aroma, colour, taste, crunchiness and overall acceptability than those made from cowpea flour or Bambara groundnut paste or flour. Bambara groundnut can produce akara egbe with nutritional and sensory properties comparable to those made from cowpea.
Keywords: Bambara groundnut, Cowpea, Snack, Sensory properties
Procedia PDF Downloads 264
1674 Barriers to Yoga and Yoga-Based Therapy for Black and Brown Individuals in the United States: Implications for Social Work Practice
Authors: Jessica Gladden
Abstract:
Yoga has been accepted in the majority of communities in the United States as a method of assisting individuals in improving their physical health. Both community yoga classes and yoga-based therapy have been shown to be highly useful for individuals' mental health. Yoga-based therapy is supported by research as an evidence-based practice for individuals experiencing anxiety, depression, and disordered eating, and for those experiencing post-traumatic stress disorder in the wake of trauma. Many individuals who have experienced trauma, as well as other mental health diagnoses, are either very disconnected from their physical bodies or feel unsafe in their bodies. Yoga can be a method of creating safety and control in the body. This is recommended by some of the leading researchers in trauma therapy as a first step towards finding safety in the body, in order to begin to work on additional mental health challenges before addressing other long-term challenges. Unfortunately, yoga for physical and mental health is underutilized in black and brown communities despite the research regarding its benefits, and very few studies have examined the barriers to access to yoga for black, brown, and indigenous individuals. This study interviewed 15 yoga practitioners who identified as black or brown and explored the barriers they see in their communities related to accessing yoga and yoga-based services. Several of the themes reported include not feeling welcome, cost of services, time, and cultural/religious components. Methods for reducing these barriers are also discussed.
Keywords: yoga, sport, barrier, black
Procedia PDF Downloads 93
1673 Cocrystal of Mesalamine for Enhancement of Its Biopharmaceutical Properties, Utilizing Supramolecular Chemistry Approach
Authors: Akshita Jindal, Renu Chadha, Maninder Karan
Abstract:
Supramolecular chemistry has gained recent eminence through a flurry of research reports demonstrating the formation of new crystalline forms with potentially advantageous characteristics. Mesalamine (5-aminosalicylic acid) belongs to the anti-inflammatory class of drugs and is used to treat ulcerative colitis and Crohn's disease. Unfortunately, mesalamine suffers from poor solubility and therefore very low bioavailability. This work is focused on the preparation and characterization of a cocrystal of mesalamine with nicotinamide (MNIC), a coformer with GRAS status. Cocrystallisation was achieved by solvent-drop grinding in a 1:1 stoichiometric ratio using acetonitrile as the solvent, and the product was characterized by various techniques, including DSC (differential scanning calorimetry), PXRD (powder X-ray diffraction), and FTIR (Fourier transform infrared spectroscopy). The cocrystal exhibited a single endothermic transition (254°C) distinct from the melting peaks of both the drug (288°C) and the coformer (128°C), indicating the formation of a new solid phase. XRPD patterns and FTIR spectra of the cocrystal differing from those of the individual components confirm the formation of the new phase. Enhancements in the apparent solubility and intrinsic dissolution studies showed the effectiveness of this cocrystal, and a further improvement in the pharmacokinetic profile was observed, with a two-fold increase in bioavailability. To conclude, our results show that the use of nicotinamide as a coformer is a viable approach towards the preparation of cocrystals of a potential drug molecule with limited solubility.
Keywords: cocrystal, mesalamine, nicotinamide, solvent drop grinding
Procedia PDF Downloads 177
1672 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection
Authors: S. Shankar Bharathi
Abstract:
Inspection of surface defects on metallic components has always been challenging due to their specular nature. Defects such as scratches, rust and pitting are very common on metallic surfaces during the manufacturing process, and if unchecked they can hamper the performance and reduce the lifetime of such components. Many of the conventional image processing algorithms for detecting surface defects involve segmentation techniques based on thresholding, edge detection, watershed segmentation and textural segmentation, and later employ other suitable algorithms based on morphology, region growing, shape analysis or neural networks for classification. In this paper, the work is focused only on detecting scratches. Global and other thresholding techniques were used to extract the defects, but they proved to be inaccurate in extracting the defects alone. However, this paper does not focus on a comparison of different segmentation techniques; rather, it describes a novel approach towards segmentation combined with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of the intensity levels, that is, whether a certain gray level is concentrated or evenly distributed, and on the extraction of such concentrated pixels. Defective images showed a high concentration of some gray levels, whereas in non-defective images the gray levels appeared evenly distributed, with no concentration. This formed the basis for detecting the defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
Keywords: metallic surface, scratches, segmentation, hausdorff dilation distance, machine vision
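As a sketch of the morphological distance mentioned above (assuming binary masks and a unit structuring element; the usage example is hypothetical, not the paper's pipeline), the Hausdorff dilation distance can be computed as the number of unit dilations needed for each set to cover the other, taking the larger of the two directions.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def hausdorff_dilation_distance(a, b, max_iter=500):
    """Hausdorff distance between two binary masks via morphological dilation."""
    def directed(x, y):
        d, grown = 0, x.copy()
        while not np.all(grown[y]):            # y not yet covered by dilated x
            grown = binary_dilation(grown)
            d += 1
            if d > max_iter:
                raise RuntimeError("masks too far apart or empty support")
        return d
    return max(directed(a, b), directed(b, a))

# Hypothetical use: compare the segmented candidate-defect mask of an inspected
# surface against a reference mask; a large distance flags a likely scratch.
ref = np.zeros((64, 64), bool); ref[30:34, 10:50] = True
seg = np.zeros((64, 64), bool); seg[30:34, 10:55] = True
print(hausdorff_dilation_distance(seg, ref))
```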
Procedia PDF Downloads 428
1671 Cleaning Performance of High-Frequency, High-Intensity 360 kHz Frequency Operating in Thickness Mode Transducers
Authors: R. Vetrimurugan, Terry Lim, M. J. Goodson, R. Nagarajan
Abstract:
This study investigates the cleaning performance of a high-intensity 360 kHz frequency in the removal of nano-dimensional and sub-micron particles from various surfaces, the uniformity of the cleaning tank, and the run-to-run variation of the cleaning process. The uniformity of the cleaning tank was measured by two different methods: (1) a ppb™ meter and (2) the liquid particle counting (LPC) technique. In the second method, aluminium metal spacer components were placed at various locations in the cleaning tank (centre, top left corner, bottom left corner, top right corner, bottom right corner) and the particles removed by the 360 kHz frequency were measured. The results indicate that the energy was distributed more uniformly throughout the entire cleaning vessel, even at the corners and edges of the tank, when megasonic sweeping technology was applied. The results also show that rinsing the parts with the 360 kHz frequency in the final rinse gives lower particle counts, and hence higher cleaning efficiency, compared to other frequencies. When megasonic sweeping technology is applied, each piezoelectric transducer operates at its optimum resonant frequency and generates a stronger acoustic cavitational force and a higher acoustic streaming velocity. These combined forces help to enhance particle removal and at the same time improve the overall cleaning performance. A multiple-extraction study was also carried out for various frequencies to measure the cleaning potential and asymptote value.
Keywords: power distribution, megasonic sweeping, cavitation intensity, particle removal, laser particle counting, nano, submicron
Procedia PDF Downloads 418