Search results for: cadaveric estimation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1931

731 Phytochemical Screening, Anti-Microbial and Mineral Determination of Brysocarpus coccineus Root

Authors: I. L. Ibrahim, A. Mann, A. Ndanaimi

Abstract:

Phytochemical screening, antibacterial testing and mineral determination by flame photometry of the crude extract of Brysocarpus coccineus Schum were carried out. Phytochemical screening revealed that saponins, alkaloids, cardiac glycosides and anthraquinones were present, suggesting that the plant extract could be used as an anti-inflammatory and anti-bleeding agent. Estimation of mineral content showed that the crude extract of B. coccineus contains 0.73 (Na+), 1.06 (K+) and 1.98 (Ca+), which suggests its use is safe for hypertensive patients and that it could be used to lower blood pressure. The antibacterial properties of the aqueous and ethanol extracts were studied against Pseudomonas aeruginosa, Escherichia coli, Bacillus subtilis and Klebsiella pneumoniae by the disc diffusion method. The aqueous extract showed significant activity against the organisms, while the ethanol extract at concentrations of 5-10 mg/ml showed significant zones of inhibition against E. coli (19 mm), B. cereus (12 mm), P. aeruginosa (11 mm) and K. pneumoniae (11 mm). Minimum inhibitory concentration (MIC) testing showed considerable inhibition of the organisms, with observed values of 1, 24, 16 and 19 mm against E. coli, B. cereus, P. aeruginosa and K. pneumoniae, respectively. Therefore, the plant could be a potential source of antibacterial agents, although further pharmacological and clinical studies are recommended.

Keywords: phytochemicals, microorganisms, screenings, mineral ions

Procedia PDF Downloads 412
730 Developing Critical-Process Skills Integrated Assessment Instrument as Alternative Assessment on Electrolyte Solution Matter in Senior High School

Authors: Sri Rejeki Dwi Astuti, Suyanta

Abstract:

The demands on assessment in the learning process have been affected by policy changes. Nowadays, assessment emphasizes not only knowledge but also skills and attitude; in practice, however, there are many obstacles to measuring them. This paper describes the development of an integrated assessment instrument as an alternative assessment for measuring critical thinking skills and science process skills in electrolyte solution matter, and characterizes the instrument in terms of logic validity and construct validity. Instrument development followed the test development model of McIntire. Data from the development process were acquired at each development step and analyzed qualitatively. The initial product was reviewed by three peer reviewers and six expert judges (two subject matter experts, two evaluation experts and two chemistry teachers) to establish logic validity, which was analyzed using Aiken's formula. Construct validity was estimated by exploratory factor analysis. Results showed that the integrated assessment instrument has an Aiken's value of 0.90 and that all items were found valid according to the construct validity analysis.
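
For reference, logic-validity coefficients of the kind quoted above are commonly computed with Aiken's V; the sketch below is a minimal illustration, assuming a 1-5 rating scale and hypothetical scores from the nine validators (neither the scale nor the individual ratings are reported in the abstract).

```python
# Minimal sketch of Aiken's V for one item; the scale and ratings are assumed.
def aikens_v(ratings, lo=1, hi=5):
    """Aiken's V = sum(r - lo) / (n * (hi - lo)) for one item rated by n judges."""
    n = len(ratings)
    return sum(r - lo for r in ratings) / (n * (hi - lo))

item_ratings = [5, 4, 5, 5, 4, 5, 4, 5, 5]  # hypothetical scores from 9 validators
print(f"Aiken's V = {aikens_v(item_ratings):.2f}")  # values near 0.90 indicate strong logic validity
```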

Keywords: construct validity, critical thinking skills, integrated assessment instrument, logic validity, science process skills

Procedia PDF Downloads 262
729 Plasma Treatment of a Lignite Using Water-Stabilized Plasma Torch at Atmospheric Pressure

Authors: Anton Serov, Alan Maslani, Michal Hlina, Vladimir Kopecky, Milan Hrabovsky

Abstract:

Recycling of organic waste has become an increasingly hot topic in recent years. This issue becomes even more interesting if raw material for fuel production can be obtained as a result of that recycling. The high-temperature decomposition of a lignite (a non-hydrolysable complex organic compound) was studied in the plasma gasification reactor PLASGAS, where a water-stabilized plasma torch was used as a source of high-enthalpy plasma. The plasma torch power was 120 kW and allowed heating of the reactor to more than 1000 °C. The material feeding rate into the gasification reactor was set to 30 and 60 kg per hour, which is comparable to small-scale industrial production. The efficiency of the thermal decomposition process was estimated. The energy balance of the torch was studied, as well as the influence of lignite particle size and of methane (CH4) addition to the reaction volume on the syngas composition (H2+CO). It was found that the H2:CO ratio had values in the range of 1.5 to 2.5, depending on the experimental conditions. The recycling process occurred at atmospheric pressure, which is one of its important benefits because no expensive vacuum pump systems are required. The work was supported by the Grant Agency of the Czech Republic under the project GA15-19444S.

Keywords: atmospheric pressure, lignite, plasma treatment, water-stabilized plasma torch

Procedia PDF Downloads 372
728 Applying an Adaptive Neuro-Fuzzy Inference System (ANFIS) for Estimation of Flood Hydrographs

Authors: Amir Ahmad Dehghani, Morteza Nabizadeh

Abstract:

This paper presents the application of an Adaptive Neuro-Fuzzy Inference System (ANFIS) to flood hydrograph modeling of the Shahid Rajaee reservoir dam, located in Iran. This was carried out using 11 flood hydrographs recorded at the Tajan River gauging station. From this dataset, 9 flood hydrographs were chosen to train the model and 2 flood hydrographs to test it. Different architectures of the neuro-fuzzy model, according to the membership function and learning algorithm, were designed and trained with different numbers of epochs. The results were evaluated against the observed hydrographs, and the best model structure was chosen according to the lowest RMSE in each run. To evaluate the efficiency of the neuro-fuzzy model, various statistical indices such as the Nash-Sutcliffe coefficient and the flood peak discharge error criterion were calculated. In this simulation, the coordinates of a flood hydrograph, including peak discharge, were estimated using the discharge values of earlier time steps as inputs to the neuro-fuzzy model. These results indicate the satisfactory efficiency of the neuro-fuzzy model for flood simulation and demonstrate the suitability of the implemented approach for flood management projects.
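
The evaluation metrics named above are standard; a minimal sketch of how RMSE, the Nash-Sutcliffe efficiency and the peak discharge error would be computed for a simulated versus observed hydrograph is given below (the discharge series are illustrative, not the Tajan River data).

```python
# Minimal sketch of the skill scores named in the abstract; hydrograph values are illustrative.
import numpy as np

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def peak_discharge_error(obs, sim):
    return (max(sim) - max(obs)) / max(obs) * 100.0  # percent error at the flood peak

observed  = [120, 310, 560, 820, 640, 410, 230]   # m^3/s, hypothetical flood hydrograph
simulated = [115, 295, 585, 790, 655, 395, 240]
print(rmse(observed, simulated), nash_sutcliffe(observed, simulated), peak_discharge_error(observed, simulated))
```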

Keywords: adaptive neuro-fuzzy inference system, flood hydrograph, hybrid learning algorithm, Shahid Rajaee reservoir dam

Procedia PDF Downloads 477
727 Estimation of Optimum Parameters of Non-Linear Muskingum Model of Routing Using Imperialist Competition Algorithm (ICA)

Authors: Davood Rajabi, Mojgan Yazdani

Abstract:

The non-linear Muskingum model is an efficient method for flood routing; however, its efficiency is influenced by its three parameters. This study therefore assesses the ability of the Imperialist Competition Algorithm (ICA) to determine the optimum parameters of the non-linear Muskingum model. In addition to ICA, the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) were also used to provide a benchmark against which to judge ICA. ICA was first applied to the Wilson flood routing; then, the routing of two flood events of the Doab Samsami River was investigated. For the Wilson flood, the objective function was the sum of squared deviations (SSQ) between observed and calculated discharges. For the routing of the two other floods, in addition to SSQ, a second objective function, the sum of absolute deviations (SAD) between observed and calculated discharges, was also considered. For the first flood, GA showed the best performance based on SSQ, whereas ICA ranked first based on SAD. For the second flood, ICA performed better on both objective functions. According to the obtained results, ICA can be regarded as an appropriate method for evaluating the parameters of the non-linear Muskingum model.
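
As a point of reference, the sketch below shows one common three-parameter non-linear Muskingum form, S = K[xI + (1-x)O]^m, routed explicitly, together with the two objective functions mentioned above; the parameter values and inflow series are illustrative, not those of the Wilson or Doab Samsami floods.

```python
# Minimal sketch: explicit routing with a three-parameter nonlinear Muskingum model
# and the SSQ / SAD objective functions. All numbers are illustrative.
import numpy as np

def route_muskingum(inflow, K, x, m, dt=1.0):
    O = [inflow[0]]                                   # assume initial outflow equals initial inflow
    S = K * (x * inflow[0] + (1 - x) * O[0]) ** m     # storage from S = K[xI + (1-x)O]^m
    for t in range(1, len(inflow)):
        S += dt * (inflow[t - 1] - O[t - 1])          # continuity: dS/dt = I - O
        Ot = ((S / K) ** (1.0 / m) - x * inflow[t]) / (1.0 - x)  # invert the storage relation
        O.append(max(Ot, 0.0))
    return np.array(O)

def ssq(obs, calc):   # sum of squared deviations
    return float(np.sum((np.asarray(obs) - np.asarray(calc)) ** 2))

def sad(obs, calc):   # sum of absolute deviations
    return float(np.sum(np.abs(np.asarray(obs) - np.asarray(calc))))

inflow = [22, 23, 35, 71, 103, 111, 109, 100, 86, 71, 59, 47, 39, 32, 28, 24, 22]  # m^3/s
routed = route_muskingum(inflow, K=0.9, x=0.25, m=1.2)
print(np.round(routed, 1))
```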

Keywords: Doab Samsami river, genetic algorithm, imperialist competition algorithm, meta-exploratory algorithms, particle swarm optimization, Wilson flood

Procedia PDF Downloads 502
726 Do Trade and Institutional Quality Play Any Significant Role in Environmental Quality in Sub-Saharan Africa?

Authors: Luqman Afolabi

Abstract:

This paper measures the impact of trade and institutions on environmental quality in Sub-Saharan Africa (SSA). To examine the direction and magnitude of the effects, the study employs the pooled mean group (PMG) estimation technique on panel data obtained from the World Bank's World Development and Governance Indicators between 1996 and 2018. The empirical estimates validate the environmental Kuznets curve (EKC) hypothesis for the region, even though results on the environment-growth nexus have been inconclusive. Similarly, a positive coefficient is obtained for the impact of trade on the environment, while the institutional indicators produce mixed results. A significant policy implication is that the governments of the SSA countries should pursue policies that tend to increase economic growth so that pollution may be reduced. Such policies may include the provision of incentives for sustainable growth-driven industries in the region. In addition, governance infrastructure should be improved in such a way that appropriate penalties are imposed on polluters, while advanced technologies that have the potential to reduce environmental degradation should be encouraged. Finally, these findings suggest that the governments of the region should promote their trade relations and the competitiveness of their local industries in order to keep pace with global markets.

Keywords: environmental quality, institutional quality, sustainable development goals, trade

Procedia PDF Downloads 141
725 Oxidative Stress Markers in Sports Related to Training

Authors: V. Antevska, B. Dejanova, L. Todorovska, J. Pluncevic, E. Sivevska, S. Petrovska, S. Mancevska, I. Karagjozova

Abstract:

Introduction: The aim of this study was to optimise laboratory oxidative stress (OS) markers in soccer players. Material and methods: Plasma samples were taken from 37 soccer players (21±3 years old) and 25 control subjects (sedentary) for determination of d-ROMs (reactive oxygen metabolites) and NO (nitric oxide). The d-ROMs test was performed by measurement of hydroperoxide levels (Diacron, Italy). For NO determination, the method of enzymatic nitrate reduction with the Griess reagent was used (OXIS, USA). The parameters were measured after the training of the soccer players and were compared with the control group. Training was considered as a maximal exercise treadmill test, with the criterion of maximum loading for each subject established as >95% of maximal heart rate. Results: The level of d-ROMs was increased in the soccer players vs. the control group, but no significant difference was noticed. After training, d-ROMs in soccer players showed an increased value of 299±44 UCarr (p<0.05). NO showed increased levels in all soccer players vs. controls, with a significant difference found after training (102±29 μmol, p<0.05). Conclusion: Based on these results, we suggest that measuring these OS markers in sports medicine may be useful for better estimation and evaluation of the training program. Further oxidative stress studies are needed to clarify the optimization of the training intensity program.

Keywords: oxidative stress markers, soccer players, training, sport

Procedia PDF Downloads 446
724 Channel Sounding and PAPR Reduction in OFDM for WiMAX Using Software Defined Radio

Authors: B. Siva Kumar Reddy, B. Lakshmi

Abstract:

WiMAX is a high-speed broadband wireless access technology that adopts OFDM/OFDMA techniques to supply higher data rates with high spectral efficiency. However, OFDM suffers from a high Peak-to-Average Power Ratio (PAPR) and high sensitivity to synchronization errors. In this paper, the high-PAPR problem is solved by using phase modulation to obtain Constant Envelope Orthogonal Frequency Division Multiplexing (CE-OFDM). Synchronization failures are reduced by employing a frequency lock loop, a polyphase clock synchronizer, a Costas loop and blind equalizers such as the Constant Modulus Algorithm (CMA) equalizer and the Sign Kurtosis Maximization Adaptive Algorithm (SKMAA) equalizer. The WiMAX physical layer is implemented on a Software Defined Radio (SDR) prototype, using the USRP N210 as the hardware platform and GNU Radio as the software platform. SNR estimation is performed on the signal received through the USRP N210. To characterize wireless propagation in specific environments, a sliding correlator channel sounding system is designed using the SDR testbed.
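
To make the PAPR issue concrete, the sketch below computes the PAPR of a plain OFDM symbol and of its constant-envelope (phase-modulated) counterpart; the subcarrier count, QPSK mapping and modulation index are assumed for illustration and are not the WiMAX profile used in the paper.

```python
# Minimal sketch: PAPR of an OFDM symbol vs. its constant-envelope (CE-OFDM) version.
import numpy as np

rng = np.random.default_rng(0)
N = 256                                            # number of subcarriers (assumed)
qpsk = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
time_signal = np.fft.ifft(qpsk) * np.sqrt(N)       # OFDM symbol in the time domain

papr_db = 10 * np.log10(np.max(np.abs(time_signal) ** 2) / np.mean(np.abs(time_signal) ** 2))
print(f"PAPR of the OFDM symbol: {papr_db:.1f} dB")

# CE-OFDM maps the (real) OFDM waveform onto the phase of a carrier, so the
# envelope is constant and the PAPR of the transmitted signal is 0 dB:
ce_ofdm = np.exp(1j * 2 * np.pi * 0.7 * np.real(time_signal))   # 0.7 = assumed modulation index
papr_ce = 10 * np.log10(np.max(np.abs(ce_ofdm) ** 2) / np.mean(np.abs(ce_ofdm) ** 2))
print(f"PAPR after phase modulation (CE-OFDM): {papr_ce:.1f} dB")
```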

Keywords: BER, CMA equalizer, Kurtosis equalizer, GNU Radio, OFDM/OFDMA, USRP N210

Procedia PDF Downloads 347
723 Whole Coding Genome Inter-Clade Comparison to Predict Global Cancer-Protecting Variants

Authors: Lamis Naddaf, Yuval Tabach

Abstract:

In this research, we identified missense genetic variants that have the potential to enhance resistance against cancer. This field has not been widely explored, as researchers tend to investigate mutations that cause diseases, in response to the suffering of patients, rather than mutations that protect from them. In conjunction with the genomic revolution and the advances in genetic engineering and synthetic biology, identifying protective variants will increase the power of genotype-phenotype predictions and can have significant implications for improved risk estimation, diagnostics, prognosis and even personalized therapy and drug discovery. To approach our goal, we systematically investigated the sites of the coding genomes and selected the alleles that showed a correlation with the species' cancer resistance. We predicted 250 protecting variants (PVs) with a 0.01 false discovery rate and more than 20 thousand PVs with a 0.25 false discovery rate. Cancer resistance in mammals and reptiles was significantly predicted by the number of PVs a species has. Moreover, genes enriched with protecting variants are enriched in pathways relevant to tumor suppression, such as the Hedgehog signaling and silencing pathways, whose improper activation is associated with the most common form of cancer malignancy. We also showed that the PVs are more abundant in healthy people than in cancer patients within different human races.

Keywords: comparative genomics, machine learning, cancer resistance, cancer-protecting alleles

Procedia PDF Downloads 95
722 Modeling of Diurnal Pattern of Air Temperature in a Tropical Environment: Ile-Ife and Ibadan, Nigeria

Authors: Rufus Temidayo Akinnubi, M. O. Adeniyi

Abstract:

Existing diurnal air temperature models simulate night-time air temperature over Nigeria with large biases. An improved parameterization is presented for modeling the diurnal pattern of air temperature (Ta), applicable to the calculation of turbulent heat fluxes in global climate models, based on surface-layer observations from the Nigeria Micrometeorological Experimental site (NIMEX). Five diurnal Ta models for estimating hourly Ta from the daily maximum, daily minimum and daily mean air temperature were validated using the root-mean-square error (RMSE), the mean bias error (MBE) and scatter plots. The original Fourier series model performed better for unstable air temperature parameterizations, while stable Ta was strongly overestimated with a large error. The model was improved by including the atmospheric cooling rate, which accounts for the temperature inversion that occurs under nocturnal boundary layer conditions. The MBE and RMSE of the modified Fourier series model were reduced by 4.45 °C and 3.12 °C during the transitional period from dry to wet stable atmospheric conditions. The modified Fourier series model gave a good estimate of the diurnal pattern of Ta when compared with other existing models for a tropical environment.
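
The first-harmonic idea behind such models can be sketched as below; this is only an illustrative reading, with an assumed peak hour, sunset time and exponential cooling term standing in for the calibrated NIMEX parameterization, which is not reproduced in the abstract.

```python
# Minimal sketch: first-harmonic diurnal Ta from daily mean/max/min, with a crude
# nocturnal cooling correction. All coefficients are assumed, not the fitted values.
import numpy as np

def diurnal_ta(t_hours, t_mean, t_max, t_min, t_peak=14.0, sunset=19.0, cooling_rate=0.08):
    amplitude = (t_max - t_min) / 2.0
    ta = t_mean + amplitude * np.cos(2 * np.pi * (t_hours - t_peak) / 24.0)  # daytime harmonic
    night = t_hours >= sunset
    # exponential decay toward the daily minimum after sunset (stands in for the cooling-rate term)
    ta[night] = ta[night] - (ta[night] - t_min) * (1 - np.exp(-cooling_rate * (t_hours[night] - sunset)))
    return ta

hours = np.arange(0, 24, 1.0)
print(np.round(diurnal_ta(hours, t_mean=27.0, t_max=33.0, t_min=22.0), 1))
```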

Keywords: air temperature, mean bias error, Fourier series analysis, surface energy balance

Procedia PDF Downloads 228
721 Increasing Business Competitiveness in Georgia in Terms of Globalization

Authors: Badri Gechbaia, Levan Gvarishvili

Abstract:

Although many Georgian scientists have worked on the issue of business competitiveness, we believe it is necessary to deepen the work in this sphere: to refine the methodology for estimating business competitiveness, to identify the main factors that define competitive advantage in business, to establish the interconnections between the level of business competitiveness and the quality of the state's involvement in international economic processes, and to define ways to raise business competitiveness and its role in upgrading the country's economic development. The introduction justifies the relevance of the topic and the thesis; it defines the survey subject, object and goals with relevant objectives; the theoretical-methodological and informational-statistical base of the survey; what is new in the survey; and its theoretical and practical value. The study is an effort to raise public awareness of this issue. It analyses the fundamental conditions for the efficient functioning of business in Georgia and identifies reserves for increasing its efficiency based on an assessment of the strengths and weaknesses of the business sector. Methods of system analysis, abstract-logical reasoning, induction and deduction, synthesis and generalization, and positive, normative and comparative analysis are used in the research process. Specific regularities of the impact of the globalization process on the determinants of business competitiveness are established, and the reasons for the current level of business competitiveness in Georgia are identified.

Keywords: competitiveness, methodology, georgian, economic

Procedia PDF Downloads 112
720 Different Sampling Schemes for Semi-Parametric Frailty Model

Authors: Nursel Koyuncu, Nihal Ata Tutkun

Abstract:

The frailty model is a survival model that takes into account unobserved heterogeneity when exploring the relationship between the survival of an individual and several covariates. In recent years, proposed survival models have become more complex, and this causes convergence problems, especially in large data sets. Therefore, the selection of a sample from these big data sets is very important for parameter estimation. In the sampling literature, some authors have defined new sampling schemes to estimate parameters correctly. To this end, we examine the effect of the sampling design in a semi-parametric frailty model. We conducted a simulation study in R to estimate the parameters of the semi-parametric frailty model for different sample sizes and censoring rates under classical simple random sampling and ranked set sampling schemes. In the simulation study, we used as the population a data set recording 17,260 male civil servants aged 40-64 years with complete 10-year follow-up. Time to death from coronary heart disease is treated as the survival time, and age and systolic blood pressure are used as covariates. We selected 1000 samples from the population using the different sampling schemes and estimated the parameters. From the simulation study, we concluded that the ranked set sampling design performs better than simple random sampling in each scenario.

Keywords: frailty model, ranked set sampling, efficiency, simple random sampling

Procedia PDF Downloads 209
719 Development and Validation of a HPLC Method for 6-Gingerol and 6-Shogaol in Joint Pain Relief Gel Containing Ginger (Zingiber officinale)

Authors: Tanwarat Kajsongkram, Saowalux Rotamporn, Sirinat Limbunruang, Sirinan Thubthimthed

Abstract:

A High-Performance Liquid Chromatography (HPLC) method was developed and validated for the simultaneous estimation of 6-gingerol (6G) and 6-shogaol (6S) in a joint pain relief gel containing ginger extract. Chromatographic separation was achieved on a C18 column (Luna, 150 x 4.6 mm i.d., 5 μm) with a mobile phase containing acetonitrile and water (gradient elution). The flow rate was 1.0 ml/min and the absorbance was monitored at 282 nm. The proposed method was validated in terms of analytical parameters such as specificity, accuracy, precision, linearity, range, limit of detection (LOD) and limit of quantification (LOQ), determined according to the International Conference on Harmonization (ICH) guidelines. Linearity was obtained over the ranges 20-60 and 6-18 µg/ml for 6G and 6S, respectively. Good linearity was observed over these ranges, with linear regression equations Y = 11016x - 23778 for 6G and Y = 19276x - 19604 for 6S (where x is the analyte concentration in μg/ml and Y is the peak area). The correlation coefficient was 0.9994 for both markers. The LOD and LOQ were 0.8567 and 2.8555 µg/ml for 6G, and 0.3672 and 1.2238 µg/ml for 6S, respectively. The recovery ranges for 6G and 6S were 91.57-102.36% and 84.73-92.85% for all three spiked levels. The RSD values from repeated extractions were 3.43 and 3.09% for 6G and 6S, respectively. Overall, the developed method showed well-accepted results for precision, accuracy, specificity, linearity and range.
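
A minimal sketch of the calibration-curve regression and the ICH-style LOD/LOQ estimates (LOD = 3.3·σ/slope, LOQ = 10·σ/slope) is given below; the concentration levels and the noise added to the peak areas are illustrative, with only the slope and intercept taken from the reported 6G regression line.

```python
# Minimal sketch: calibration regression and ICH-style LOD/LOQ; data are illustrative.
import numpy as np

def calibration(conc, area):
    conc, area = np.asarray(conc, float), np.asarray(area, float)
    slope, intercept = np.polyfit(conc, area, 1)
    residuals = area - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)                 # SD of the regression residuals
    r = np.corrcoef(conc, area)[0, 1]
    return slope, intercept, r, 3.3 * sigma / slope, 10 * sigma / slope

# hypothetical 6-gingerol calibration over 20-60 ug/ml, built from the reported line
conc_6g = np.array([20.0, 30.0, 40.0, 50.0, 60.0])
area_6g = 11016 * conc_6g - 23778 + np.random.default_rng(1).normal(0, 3000, conc_6g.size)
slope, intercept, r, lod, loq = calibration(conc_6g, area_6g)
print(f"slope={slope:.0f}, intercept={intercept:.0f}, r={r:.4f}, LOD={lod:.2f} ug/ml, LOQ={loq:.2f} ug/ml")
```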

Keywords: ginger, 6-gingerol, HPLC, 6-shogaol

Procedia PDF Downloads 440
718 Studying the Effects of Economic and Financial Development as Well as Institutional Quality on Environmental Destruction in the Upper-Middle Income Countries

Authors: Morteza Raei Dehaghi, Seyed Mohammad Mirhashemi

Abstract:

The current study explores the effect of economic development, financial development and institutional quality on environmental destruction in upper-middle-income countries during the period 1999-2011. The dependent variable is the logarithm of carbon dioxide emissions, which can be considered an index of environmental destruction or quality given its effects on the environment. Financial development and institutional development variables, as well as some control variables, were considered. In order to study cross-sectional correlation among the countries under study, the Pesaran and Friz tests were used. Since the results of both tests show cross-sectional correlation among the countries under study, the seemingly unrelated regression method was utilized for model estimation. The results disclosed that the environmental Kuznets curve hypothesis is confirmed in upper-middle-income countries and that financial development and institutional quality have a significant effect on environmental quality. The results of this study can be considered by policy makers in countries in different income groups aiming for growth accompanied by improved environmental quality.

Keywords: economic development, environmental destruction, financial development, institutional development, seemingly unrelated regression

Procedia PDF Downloads 345
717 Estimation of Probabilistic Fatigue Crack Propagation Models of AZ31 Magnesium Alloys under Various Load Ratio Conditions by Using the Interpolation of a Random Variable

Authors: Seon Soon Choi

Abstract:

The essential purpose of this study is to present a good fatigue crack propagation model describing the stochastic fatigue crack growth behavior of a rolled magnesium alloy, AZ31, under various load ratio conditions. Fatigue crack propagation experiments were carried out in laboratory air under four load ratio (R) conditions using AZ31 to investigate the crack growth behavior. The stochastic fatigue crack growth behavior was analyzed using the interpolation of a random variable, Z, introduced into an empirical fatigue crack propagation model. The empirical fatigue models used in this study are the Paris-Erdogan, Walker, Forman and modified Forman models. It was found that the random variable is useful in describing the stochastic fatigue crack growth behavior under various load ratio conditions. A good probabilistic model describing the stochastic fatigue crack growth behavior under various load ratio conditions is also proposed.
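
As an illustration of how a multiplicative random variable can be attached to such a law, the sketch below integrates the Paris-Erdogan relation da/dN = Z·C·(ΔK)^m for a set of virtual specimens; the constants C and m, the geometry factor and the lognormal scatter of Z are assumed values, not the fitted AZ31 parameters.

```python
# Minimal sketch: Paris-Erdogan growth with a multiplicative random variable Z;
# material constants and load are illustrative, not the fitted AZ31 values.
import numpy as np

def cycles_to_failure(a0, a_crit, delta_sigma, C, m, Z=1.0, geometry=1.12, dN=100, n_max=5_000_000):
    """Integrate da/dN = Z * C * (dK)^m in blocks of dN cycles until a reaches a_crit."""
    a, cycles = a0, 0
    while a < a_crit and cycles < n_max:
        dK = geometry * delta_sigma * np.sqrt(np.pi * a)   # stress intensity factor range, MPa*sqrt(m)
        a += Z * C * dK ** m * dN                          # crack growth over the dN-cycle block
        cycles += dN
    return cycles

rng = np.random.default_rng(0)
scatter = rng.lognormal(mean=0.0, sigma=0.2, size=20)      # specimen-to-specimen variability
lives = [cycles_to_failure(0.002, 0.02, delta_sigma=60, C=2e-11, m=3.0, Z=z) for z in scatter]
print(f"median life = {int(np.median(lives))} cycles, CoV = {np.std(lives) / np.mean(lives):.2f}")
```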

Keywords: magnesium alloys, fatigue crack propagation model, load ratio, interpolation of random variable

Procedia PDF Downloads 409
716 Use of Gaussian-Euclidean Hybrid Function Based Artificial Immune System for Breast Cancer Diagnosis

Authors: Cuneyt Yucelbas, Seral Ozsen, Sule Yucelbas, Gulay Tezel

Abstract:

Because only a small number of artificial immune system (AIS) approaches can solve nonlinear problems, nonlinear AIS approaches need to be developed among the well-known solution techniques. The Gaussian function is usually used for similarity estimation in classification problems and pattern recognition. In this study, diagnosis of breast cancer, the second most widespread cancer in women, was performed with different distance calculation functions, Euclidean, Gaussian and a Gaussian-Euclidean hybrid, in the clonal selection model of classical AIS on the Wisconsin Breast Cancer Dataset (WBCD), taken from the University of California, Irvine Machine Learning Repository. We used 3-fold cross-validation to train and test on the dataset. According to the results, the maximum test classification accuracy was 97.35%, obtained with the Gaussian-Euclidean hybrid function on fold 3. The mean test classification accuracies over all folds were 94.78%, 94.45% and 95.31% for the Euclidean, Gaussian and Gaussian-Euclidean hybrid functions, respectively. With these results, the Gaussian-Euclidean hybrid function appears to be a promising distance calculation method and may be considered an alternative for hard nonlinear classification problems.
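
For orientation, the sketch below writes out the Euclidean distance, a Gaussian affinity, and one plausible Gaussian-Euclidean blend; the exact hybrid form used by the authors is not given in the abstract, so the combination shown is an assumption, as are the feature vectors.

```python
# Minimal sketch of the three affinity measures; the hybrid blend is an assumed form.
import numpy as np

def euclidean(a, b):
    return float(np.linalg.norm(np.asarray(a, float) - np.asarray(b, float)))

def gaussian(a, b, sigma=1.0):
    return float(np.exp(-euclidean(a, b) ** 2 / (2 * sigma ** 2)))   # similarity in (0, 1]

def gaussian_euclidean_hybrid(a, b, sigma=1.0):
    d = euclidean(a, b)
    return 0.5 * (np.exp(-d ** 2 / (2 * sigma ** 2)) + 1.0 / (1.0 + d))  # assumed blend

antibody = [0.42, 0.17, 0.80]      # illustrative normalized WBCD-like feature vectors
antigen  = [0.40, 0.25, 0.75]
print(euclidean(antibody, antigen), gaussian(antibody, antigen), gaussian_euclidean_hybrid(antibody, antigen))
```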

Keywords: artificial immune system, breast cancer diagnosis, Euclidean function, Gaussian function

Procedia PDF Downloads 433
715 Tool Wear Monitoring of High Speed Milling Based on Vibratory Signal Processing

Authors: Hadjadj Abdechafik, Kious Mecheri, Ameur Aissa

Abstract:

The objective of this study is to develop a method for processing the vibratory signals generated during horizontal high-speed milling without coolant, in order to establish a monitoring system able to improve machining performance. Many tests were carried out on a horizontal high-speed machining centre (PCI Météor 10) under given cutting conditions, using a milling cutter with a single insert, whose frontal wear was measured from its new state, taken as the reference, to a worn state considered unsuitable for further use. The results show that the first harmonic follows the evolution of frontal wear well. In addition, a wavelet transform was used for signal processing and was found useful for observing the evolution of the wavelet approximations through the cutting tool's life. The power and Root Mean Square (RMS) values of the wavelet-transformed signal gave the best results and can be used for tool wear estimation. All these features can constitute suitable indicators for effective detection of tool wear and can then be used as input parameters for an online monitoring system. We also noted the remarkable influence of the machining cycle on the quality of the measurements through the introduction of a bias in the signal; this phenomenon appears in particular in horizontal milling and is ignored in the majority of studies.
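
A minimal sketch of the wavelet-approximation features (RMS and power) described above is shown below, using PyWavelets; the synthetic vibration records, wavelet family and decomposition level are illustrative assumptions.

```python
# Minimal sketch: RMS and power of the wavelet approximation of a vibration record.
import numpy as np
import pywt  # PyWavelets

def wavelet_features(signal, wavelet="db4", level=4):
    approx = pywt.wavedec(signal, wavelet, level=level)[0]   # approximation coefficients
    rms = float(np.sqrt(np.mean(approx ** 2)))
    power = float(np.sum(approx ** 2) / approx.size)
    return rms, power

fs = 10_000                                  # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)
# toy vibration: a tooth-passing harmonic whose amplitude grows as the insert wears
new_tool  = np.sin(2 * np.pi * 333 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
worn_tool = 1.8 * np.sin(2 * np.pi * 333 * t) + 0.3 * np.random.default_rng(1).normal(size=t.size)
print(wavelet_features(new_tool), wavelet_features(worn_tool))
```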

Keywords: flank wear, vibration, milling, signal processing, monitoring

Procedia PDF Downloads 596
714 IPO Valuation and Profitability Expectations: Evidence from the Italian Exchange

Authors: Matteo Bonaventura, Giancarlo Giudici

Abstract:

This paper analyses the valuation process of companies listed on the Italian Exchange in the period 2000-2009 at their Initial Public Offering (IPO). One of the most common valuation techniques declared in the IPO prospectus to determine the offer price is the Discounted Cash Flow (DCF) method. We develop a 'reverse engineering' model to discover the short-term profitability implied in the offer prices. We show that there is a significant optimistic bias in the estimation of future profitability compared to the ex-post actual realization, and that the mean forecast error is substantially large. Yet we show that such error also characterizes the estimates carried out by analysts evaluating non-IPO companies. The forecast error is larger the faster the company's recent growth has been, the higher the leverage of the IPO firm, and the more companies have issued equity on the market. IPO companies generally exhibit better operating performance before the listing than comparable listed companies, while after flotation they do not perform significantly differently in terms of return on invested capital. Pre-IPO book building activity plays a significant role in partially reducing the forecast error and revising expectations, while the market price on the first day of trading does not contain information for further reducing forecast errors.

Keywords: initial public offerings, DCF, book building, post-IPO profitability drop

Procedia PDF Downloads 351
713 Jamun Juice Extraction Using Commercial Enzymes and Optimization of the Treatment with the Help of Physicochemical, Nutritional and Sensory Properties

Authors: Payel Ghosh, Rama Chandra Pradhan, Sabyasachi Mishra

Abstract:

Jamun (Syzygium cuminii L.) is an important indigenous minor fruit with high medicinal value. Jamun cultivation is unorganized, and there is a huge loss of this fruit every year; the perishable nature of the fruit makes its postharvest management further difficult. Due to the strong cell wall structure of pectin-protein bonds and hard seeds, extraction of juice is difficult. Enzymatic treatment has been used commercially to improve juice quality and yield. The objective of the study was to identify the best treatment for juice extraction. Enzymes (pectinase and tannase) from different strains were used, and for each enzyme the best result was obtained using response surface methodology. Optimization was carried out on the basis of physicochemical properties, nutritional properties, sensory quality and cost estimation. According to the quality aspects, cost analysis and sensory evaluation, the optimum enzymatic treatment was obtained with pectinase from an Aspergillus aculeatus strain. The optimum conditions for the treatment were 44 °C for 80 minutes at an enzyme concentration of 0.05% (w/w). Under these conditions, a 75% yield was obtained, with a turbidity of 32.21 NTU, clarity of 74.39 %T, polyphenol content of 115.31 mg GAE/g and protein content of 102.43 mg/g, with a significant difference in overall acceptability.

Keywords: enzymatic treatment, Jamun, optimization, physicochemical property, sensory analysis

Procedia PDF Downloads 295
712 Modelling Home Appliances for Energy Management System: Comparison of Simulation Results with Measurements

Authors: Aulon Shabani, Denis Panxhi, Orion Zavalani

Abstract:

This paper presents the modelling and development of a simulator for residential electrical appliances. The simulator is developed in MATLAB, providing the possibility to analyze and simulate the energy consumption of frequently used home appliances in Albania. The modelling of the devices considers the impact of different factors, notably occupant behavior and climatic conditions. Most devices are modeled as an electric circuit, and the electric energy consumption is estimated from the solutions of the governing differential equations. The provided models cover devices such as a dishwasher, oven, water heater, air conditioner, light bulbs, television, refrigerator and water pump. The proposed model allows us to simulate in advance the energy behavior of the highest-consumption home devices in order to estimate peak consumption and improve its reduction. Simulated home prototype results are compared to real measurements from a typical home. The results obtained from the simulator framework, compared with a typical household monitored using EmonTxV3, show the effectiveness of the proposed simulation. This conclusion will help future simulations of a large group of typical households for a better understanding of peak consumption.

Keywords: electrical appliances, energy management, modelling, peak estimation, simulation, smart home

Procedia PDF Downloads 161
711 Whole Coding Genome Inter-Clade Comparisons to Predict Global Cancer-Protecting Variants

Authors: Lamis Naddaf, Yuval Tabach

Abstract:

We identified missense genetic variants with the potential to enhance resistance against cancer. Such a field has not been widely explored, as researchers tend to investigate the mutations that cause diseases, in response to the suffering of patients, rather than those mutations that protect from them. In conjunction with the genomic revolution and the advances in genetic engineering and synthetic biology, identifying the protective variants will increase the power of genotype-phenotype predictions and have significant implications for improved risk estimation, diagnostics, prognosis, and even personalized therapy and drug discovery. To approach our goal, we systematically investigated the sites of the coding genomes and selected the alleles that showed a correlation with the species' cancer resistance. Interestingly, we found several amino acids that are generally preferred (like proline) or avoided (like cysteine) by the resistant species. Furthermore, cancer resistance in mammals and reptiles is significantly predicted by the number of predicted protecting variants (PVs) a species has. Moreover, PV-enriched genes are enriched in pathways relevant to tumor suppression; for example, they are enriched in the Hedgehog signaling and silencing pathways, whose improper activation is associated with the most common form of cancer malignancy. We also showed that the PVs are mostly more abundant in healthy people than in cancer patients within different human races.

Keywords: cancer resistance, protecting variant, naked mole rat, comparative genomics

Procedia PDF Downloads 109
710 Estimation of Lung Physiological Motion for Patients Undergoing External Lung Irradiation

Authors: Yousif Mohamed Y. Abdallah

Abstract:

This experimental study deals with the detection, measurement and analysis of periodic physiological organ motion during external beam radiotherapy, with the aim of improving the accuracy of radiation field placement and reducing the exposure of healthy tissue during radiation treatments. The importance of this study lies in detecting the maximum path of the mobile structures during radiotherapy delivery, defining the planning target volume (PTV) and irradiated volume during both the inspiration and expiration periods, and verifying the target volume. It also highlights the importance of applying image-guided radiotherapy (IGRT) methods in the field of radiotherapy. The results showed a displacement of 3.17 ± 0.23 mm for the body contour, 2.56 ± 0.99 mm for the left lung and 2.42 ± 0.77 mm for the right lung, which allows the radiation oncologist to take suitable countermeasures in case of significant errors. In addition, the use of the image registration technique for automatic position control predicted the potential motion. The motion ranged between 2.13 mm and 12.2 mm (low and high). In conclusion, individualized assessment of tumor mobility can improve the accuracy of target area definition in patients undergoing stereotactic RT for stage I, II and III lung cancer (NSCLC). Definition of the target volume based on a single CT scan with a margin of 10 mm is clearly inappropriate.

Keywords: respiratory motion, external beam radiotherapy, image processing, lung

Procedia PDF Downloads 533
709 Factors Affecting the Profitability of Commercial Banks: An Empirical Study of Indian Banking Sector

Authors: Neeraj Gupta, Jitendra Mahakud

Abstract:

The banking system plays a major role in the Indian economy and is the payment gateway for most financial transactions. Banking has undergone a major transition that is still in progress. Recent banking reforms after the liberalization of 1991 have led to the establishment of foreign banks in the country. The foreign banks are not listed on the Indian stock markets and have increased competition, capturing a significant share of revenue from the public sector banks, which are still the major players in the Indian banking sector. The performance of the banking sector depends on internal (bank-specific) as well as external (market-specific and macroeconomic) factors. Profitability in the banking sector is affected by numerous factors, which can be internal or external. The present study examines these internal and external factors, which are likely to affect the profitability of Indian banks. The sample consists of a panel dataset of 64 commercial banks in India, comprising 1088 observations over the years 1998 to 2016. The GMM dynamic panel estimator of Arellano and Bond has been used. The study revealed that capital adequacy ratio, deposits, age, labour productivity, non-performing assets, inflation and concentration have a significant effect on the performance measures.

Keywords: banks in India, bank performance, bank productivity, banking management

Procedia PDF Downloads 271
708 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand

Authors: Esma Birisci, Ronald McGarvey

Abstract:

One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and avoiding shortfalls (leaving some customers hungry) need to be considered as two contradictory objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for the utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food with a minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
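
The kernel density estimation step mentioned above can be sketched as follows: fit a joint Gaussian KDE to historical demands for correlated items and resample demand scenarios from it; the two-item demand history below is illustrative, not the Campus Dining Services data.

```python
# Minimal sketch: correlated demand scenarios via a Gaussian kernel density estimate.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
burgers = rng.normal(400, 60, size=200)                  # historical daily demand, item 1
fries = 0.8 * burgers + rng.normal(0, 25, size=200)      # item 2 demand tracks item 1
history = np.vstack([burgers, fries])                    # shape (2, n_observations)

kde = gaussian_kde(history)                              # joint density of the two demands
scenarios = kde.resample(1000)                           # 1000 correlated demand scenarios
print(np.round(np.corrcoef(scenarios)[0, 1], 2), np.round(scenarios.mean(axis=1), 1))
```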

Keywords: environmental studies, food waste, production planning, uncertain and correlated demand

Procedia PDF Downloads 372
707 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data

Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini

Abstract:

A considerable part of the rainfall data used in hydrological practice is available in aggregated form over constant time intervals. This can produce undesirable effects, such as the underestimation of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is affecting extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d=ta, the estimate of a single Hd value can be affected by an underestimation error of up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation errors follow an exponential probability density function; 3) every very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations in Central Italy, may overcome this issue; 5) these equations should improve the Hd estimates and the associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
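
The aggregation effect described above can be illustrated with a small sketch: the maximum d-duration depth computed from fixed, non-overlapping ta-length totals is compared with the "true" maximum from a sliding window over the same record; the synthetic one-minute rainfall series is illustrative only.

```python
# Minimal sketch: underestimation of the maximum d-duration depth caused by coarse aggregation.
import numpy as np

rng = np.random.default_rng(3)
rain_1min = rng.exponential(0.02, size=60 * 24) * (rng.random(60 * 24) < 0.05)  # mm per minute, one day

def max_depth_sliding(rain, d):            # "true" Hd: maximum over all windows of length d
    csum = np.concatenate(([0.0], np.cumsum(rain)))
    return np.max(csum[d:] - csum[:-d])

def max_depth_aggregated(rain, d, ta):     # Hd estimated from non-overlapping ta-minute totals
    blocks = rain[: rain.size // ta * ta].reshape(-1, ta).sum(axis=1)
    k = max(d // ta, 1)
    csum = np.concatenate(([0.0], np.cumsum(blocks)))
    return np.max(csum[k:] - csum[:-k])

d = 60                                     # 1-hour duration, in minutes
true_hd = max_depth_sliding(rain_1min, d)
coarse_hd = max_depth_aggregated(rain_1min, d, ta=60)   # worst case ta = d
print(f"underestimation: {100 * (true_hd - coarse_hd) / true_hd:.1f}%")
```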

Keywords: central Italy, extreme events, rainfall data, underestimation errors

Procedia PDF Downloads 189
706 Qualitative and Quantitative Traits of Processed Farmed Fish in N. W. Greece

Authors: Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou

Abstract:

The filleting yield and chemical composition of farmed sea bass (Dicentrarchus labrax), rainbow trout (Oncorhynchus mykiss) and meagre (Argyrosomus regius) were investigated in fish farmed in NW Greece. The results provide an estimate of the quantity of fish required to produce one kilogram of fillet, an estimate needed for the operational management of fish processing companies. Furthermore, in this work the ratio of feed input required to produce one kilogram of fish fillet (FFCR) is presented for the first time as a useful indicator of the ecological footprint of consuming farmed fish. The lowest lipid content appeared in meagre (1.7%) and the highest in trout (4.91%). The lowest fillet yield and fillet-yield feed conversion ratio (FYFCR) were found in meagre (FY = 42.17%, FFCR = 2.48), while the best fillet yield (FY = 53.8%) and FYFCR (2.10) were exhibited by farmed rainbow trout. This research has been co-financed by the European Union (European Social Fund - ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: ARCHIMEDES III. Investing in knowledge society through the European Social Fund.
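
If the fillet-yield feed conversion ratio is read as the whole-fish feed conversion ratio divided by the fillet yield (an assumption, since the abstract does not spell out the formula), the reported figures can be reproduced roughly as in the sketch below; the FCR inputs are back-calculated guesses.

```python
# Minimal sketch, assuming FYFCR = FCR / fillet yield; the FCR inputs are assumed.
def fyfcr(feed_conversion_ratio, fillet_yield_fraction):
    """kg of feed needed to produce 1 kg of fillet."""
    return feed_conversion_ratio / fillet_yield_fraction

print(round(fyfcr(1.13, 0.538), 2))    # rainbow trout: about 2.10 kg feed per kg fillet
print(round(fyfcr(1.05, 0.4217), 2))   # meagre: about 2.49 kg feed per kg fillet
```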

Keywords: farmed fish, flesh quality, filleting yield, lipid

Procedia PDF Downloads 308
705 An Approximate Formula for Calculating the Fundamental Mode Period of Vibration of Practical Building

Authors: Abdul Hakim Chikho

Abstract:

Most international codes allow the use of an equivalent lateral load method for designing practical buildings to withstand earthquake actions. This method requires calculating an approximation to the fundamental mode period of vibration of these buildings. Several empirical equations have been suggested for calculating approximations to the fundamental periods of different types of structures. Most of these equations are known to provide only a crude approximation to the required fundamental periods, and repeating the calculation using a more accurate formula is usually required. In this paper, a new formula for calculating a satisfactory approximation of the fundamental period of a practical building is proposed. This formula takes into account the mass and the stiffness of the building; therefore, it is more logical than the conventional empirical equations. In order to verify the accuracy of the proposed formula, several examples have been solved, in which the fundamental mode periods of several framed buildings were calculated using the proposed formula and the conventional empirical equations. Comparing the obtained results with those obtained from a dynamic computer analysis has shown that the proposed formula provides a more accurate estimate of the fundamental periods of practical buildings. Since the proposed method is still simple to use and requires only minimal computing effort, it is believed to be ideally suited for design purposes.
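
The abstract does not reproduce the proposed formula itself; the sketch below only illustrates the mass-and-stiffness reasoning it builds on, using the single-degree-of-freedom relation T = 2π√(M/K) with illustrative values for an equivalent lumped mass and lateral stiffness.

```python
# Minimal sketch: single-degree-of-freedom period T = 2*pi*sqrt(M/K); values are illustrative.
import math

def fundamental_period(mass_kg, lateral_stiffness_n_per_m):
    return 2 * math.pi * math.sqrt(mass_kg / lateral_stiffness_n_per_m)

# a hypothetical 5-storey frame lumped into one equivalent mass and stiffness
print(f"T = {fundamental_period(mass_kg=1.2e6, lateral_stiffness_n_per_m=9.0e7):.2f} s")
```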

Keywords: earthquake, fundamental mode period, design, building

Procedia PDF Downloads 281
704 Verification of Simulated Accumulated Precipitation

Authors: Nato Kutaladze, George Mikuchadze, Giorgi Sokhadze

Abstract:

Precipitation forecasts are one of the most demanding applications in numerical weather prediction (NWP). Georgia, like the whole Caucasus region, is characterized by very complex topography. The country's territory is prone to flash floods and mudflows, so quantitative precipitation estimation (QPE) and quantitative precipitation forecasting (QPF) at any lead time are very important for Georgia. In this study, the skill of the Advanced Research WRF model in QPF is investigated over Georgia's territory. We have analyzed several combinations of convection parameterization and microphysical schemes for different rainy episodes and heavy rain events. We estimate errors and biases in accumulated 6 h precipitation at different spatial resolutions during model performance verification for 12-hour and 24-hour lead times against corresponding rain gauge observations and satellite data. Various statistical parameters have been calculated for the 8-month comparison period, and several aspects of the model's skill have been evaluated. Our focus is on the formation and organization of convective precipitation systems in a low-mountain region. Several problems in connection with QPF have been identified for mountain regions, which include the overestimation and underestimation of precipitation on the windward and lee sides of the mountains, respectively, and a phase error in the diurnal cycle of precipitation leading to the onset of convective precipitation in model forecasts several hours too early.

Keywords: extremal dependence index, false alarm, numerical weather prediction, quantitative precipitation forecasting

Procedia PDF Downloads 146
703 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution

Authors: Saleem Z. Ramadan

Abstract:

This paper discusses the effects of using progressive Type-I right censoring on the design of simple step-stress accelerated life testing using a Bayesian approach for Weibull life products under the assumption of the cumulative exposure model. The optimization criterion used in this paper is to minimize the expected pre-posterior variance of the Pth percentile time to failure. The model variables are the stress changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results have shown that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, the results have shown that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. Moreover, the results have shown that using direct or indirect priors affects the precision of the test.

Keywords: reliability, accelerated life testing, cumulative exposure model, Bayesian estimation, progressive type-I censoring, Weibull distribution

Procedia PDF Downloads 503
702 Compression Index Estimation by Water Content and Liquid Limit and Void Ratio Using Statistics Method

Authors: Lizhou Chen, Abdelhamid Belgaid, Assem Elsayed, Xiaoming Yang

Abstract:

The compression index is essential in foundation settlement calculations. The traditional method for determining the compression index is the consolidation test, which is expensive and time-consuming. Many researchers have used regression methods to develop empirical equations for predicting the compression index from soil properties. Based on a large number of compression index data collected from consolidation tests, the accuracy of some popular empirical equations was assessed. It was found that the primary compression index is significantly overestimated by some equations and underestimated by others. Sensitivity analyses of soil parameters, including water content, liquid limit and void ratio, were performed. The results indicate that the compression index obtained from the void ratio is the most accurate. The ANOVA (analysis of variance) demonstrates that equations with multiple soil parameters cannot provide better predictions than equations with a single soil parameter; in other words, it is not necessary to develop relationships between the compression index and multiple soil parameters. Meanwhile, it was noted that the secondary compression index is approximately 0.7-5.0% of the primary compression index, with an average of 2.0%. Finally, prediction equations based on the power regression technique are proposed that provide more accurate predictions than the existing equations.
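
A minimal sketch of the single-parameter power regression referred to above, fitting Cc = a·e0^b to (void ratio, compression index) pairs, is shown below; the data points and the resulting coefficients are illustrative, not the collected consolidation-test database.

```python
# Minimal sketch: power-law regression Cc = a * e0**b on illustrative data.
import numpy as np

e0 = np.array([0.6, 0.8, 1.0, 1.2, 1.5, 1.8, 2.2])          # initial void ratio
cc = np.array([0.12, 0.18, 0.25, 0.32, 0.43, 0.55, 0.72])   # measured compression index

# linearize: ln(Cc) = ln(a) + b*ln(e0), then fit by ordinary least squares
b, ln_a = np.polyfit(np.log(e0), np.log(cc), 1)
a = np.exp(ln_a)
print(f"Cc = {a:.3f} * e0^{b:.2f}")
print("predicted:", np.round(a * e0 ** b, 3))
```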

Keywords: compression index, clay, settlement, consolidation, secondary compression index, soil parameter

Procedia PDF Downloads 158