Search results for: Cumulative conformance count
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 270

240 Effect of Gating Sprue Height on Mechanical Properties of Thin Wall Ductile Iron

Authors: E. F. Ochulor, S. O. Adeosun, S. A. Balogun

Abstract:

The effect of sprue/metal head height on mould filling, microstructure and mechanical properties of thin wall ductile iron (TWDI) castings is studied. Results show that a metal/sprue height of 50 mm is not sufficient to push the melt through the gating channel, but as it is increased from 100 mm to 350 mm, proper mould filling is achieved. However, at heights between 200 mm and 350 mm, defects associated with incomplete solidification, carbide precipitation and turbulent flow are evident. This research shows that superior UTS, hardness, nodularity and nodule count are obtained at a sprue height of 100 mm.

Keywords: Melt pressure and velocity, nodularity, nodule count, sprue height.

239 Plug and Play Interferometer Configuration using Single Modulator Technique

Authors: Norshamsuri Ali, Hafizulfika, Salim Ali Al-Kathiri, Abdulla Al-Attas, Suhairi Saharudin, Mohamed Ridza Wahiddin

Abstract:

We demonstrate single-photon interference over 10 km using a plug and play system for quantum key distribution. The quality of the interferometer is measured by its visibility. The signal is encoded using phase coding, and the visibility is derived from the interference effect, which yields a photon count at each detector. The setup gives full control of the polarization inside the interferometer. The quality of the interferometer is assessed from the number of counts per second, and the system produces 94% visibility in one of the detectors.
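
For reference, the fringe visibility referred to above is conventionally computed from the maximum and minimum detector count rates observed as the relative phase is scanned; the following is a textbook restatement rather than an equation taken from the paper:

```latex
% Fringe visibility in terms of the maximum and minimum count rates per second
V = \frac{C_{\max} - C_{\min}}{C_{\max} + C_{\min}}
```

With V = 0.94, the minimum count rate is roughly 3% of the maximum, since (1 - 0.94)/(1 + 0.94) ≈ 0.031.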

Keywords: single photon, interferometer, quantum key distribution.

238 Cumulative Learning based on Dynamic Clustering of Hierarchical Production Rules (HPRs)

Authors: Kamal K. Bharadwaj, Rekha Kandwal

Abstract:

An important structuring mechanism for knowledge bases is building clusters based on the content of their knowledge objects. The objects are clustered based on the principle of maximizing the intraclass similarity and minimizing the interclass similarity. Clustering can also facilitate taxonomy formation, that is, the organization of observations into a hierarchy of classes that group similar events together. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rules (HPRs) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision If <condition> Generality <generality information> Specificity <specificity information>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. In this paper, a set of related HPRs is called a cluster and is represented by an HPR-tree. This paper discusses an algorithm, based on a cumulative learning scenario, for the dynamic structuring of clusters. The proposed scheme incrementally incorporates new knowledge into the set of clusters from previous episodes and also maintains a summary of the clusters, called a Synopsis, to be used in future episodes. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested incremental structuring of clusters would be useful in mining data streams.
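
As an illustration only (not code from the paper), a minimal Python sketch of how an HPR with generality and specificity links, and a cluster represented as an HPR-tree, might be encoded; the rule names and contents are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class HPR:
    """Hierarchical Production Rule of the form:
    Decision If <condition> Generality <more general HPRs> Specificity <more specific HPRs>."""
    decision: str
    condition: List[str]
    generality: List["HPR"] = field(default_factory=list)   # links towards more general rules
    specificity: List["HPR"] = field(default_factory=list)  # links towards more specific rules

def cluster_size(root: HPR) -> int:
    """Size of a cluster (HPR-tree) reached by following specificity links from its root."""
    return 1 + sum(cluster_size(child) for child in root.specificity)

# Hypothetical two-rule cluster
animal = HPR("can_move", ["is_animal"])
bird = HPR("can_fly", ["is_bird"], generality=[animal])
animal.specificity.append(bird)
print(cluster_size(animal))  # 2
```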

Keywords: Cumulative learning, clustering, data mining, hierarchical production rules.

237 A New Method for Multiobjective Optimization Based on Learning Automata

Authors: M. R. Aghaebrahimi, S. H. Zahiri, M. Amiri

Abstract:

The need to solve complicated multidimensional scientific problems, together with the need to optimize several objective functions at once, is a major motivation behind the development of artificial intelligence and heuristic methods. In this paper, we introduce a new method for multiobjective optimization based on learning automata. In the proposed method, the search space is divided into separate hyper-cubes and each cube is considered as an action. After combining all the objective functions with separate weights, the resulting cumulative function is taken as the fitness function. By applying all the cubes to the cumulative function, we calculate the reinforcement (amplification) of each action, and the algorithm continues in this way to find the best solutions. In this method, a lateral memory is used to gather the significant points of each iteration of the algorithm. Finally, by considering the domination factor, the Pareto front is estimated. Results of several experiments show the effectiveness of this method in comparison with a genetic-algorithm-based method.
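
A minimal sketch of the two ingredients described above, written under assumptions not stated in the abstract (hypothetical objective functions on [0, 1]^2, a fixed grid of hyper-cube centres as actions, and a linear reward-inaction update as one possible reinforcement rule):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical objectives to minimize, and fixed weights for the cumulative (fitness) function
objectives = [lambda x: np.sum(x**2), lambda x: np.sum((x - 1.0)**2)]
weights = [0.5, 0.5]
cumulative = lambda x: sum(w * f(x) for w, f in zip(weights, objectives))

# Divide the search space into hyper-cubes; each cube (represented by its centre) is one action
centres = np.array([[i, j] for i in np.linspace(0.1, 0.9, 5) for j in np.linspace(0.1, 0.9, 5)])
p = np.full(len(centres), 1.0 / len(centres))   # action probabilities of the automaton
worst = max(cumulative(c) for c in centres)     # used only to scale the reward to [0, 1]

for _ in range(500):
    a = rng.choice(len(centres), p=p)
    reward = 1.0 - cumulative(centres[a]) / worst     # smaller cumulative value -> larger reward
    p[a] += 0.05 * reward * (1.0 - p[a])              # linear reward-inaction update
    p[np.arange(len(p)) != a] *= 1.0 - 0.05 * reward
    p /= p.sum()

print(centres[np.argmax(p)])  # the hyper-cube the automaton reinforces most
```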

Keywords: Function optimization, Multiobjective optimization, Learning automata.

236 A Comparison of Marginal and Joint Generalized Quasi-likelihood Estimating Equations Based On the Com-Poisson GLM: Application to Car Breakdowns Data

Authors: N. Mamode Khan, V. Jowaheer

Abstract:

In this paper, we apply and compare two generalized estimating equation approaches to the analysis of car breakdown data in Mauritius. The number of breakdowns experienced by a machine is a highly under-dispersed count random variable, and its value can be attributed to factors related to the mechanical input and output of that machine. Analyzing such under-dispersed count observations as a function of explanatory factors has been a challenging problem. In this paper, we aim at estimating the effects of various factors on the number of breakdowns experienced by a passenger car, based on a study performed in Mauritius over a year. We note that the number of passenger car breakdowns is highly under-dispersed. These data are therefore modelled and analyzed using a Com-Poisson regression model. We use two types of quasi-likelihood estimation approaches to estimate the parameters of the model: the marginal and the joint generalized quasi-likelihood estimating equation approaches. The under-dispersion parameter is estimated to be around 2.14, justifying the appropriateness of the Com-Poisson distribution for modelling the under-dispersed count responses recorded in this study.
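
For reference, the Conway-Maxwell-Poisson (Com-Poisson) probability mass function underlying the regression model has the standard textbook form (not an equation copied from the paper):

```latex
P(Y = y) = \frac{\lambda^{y}}{(y!)^{\nu}\, Z(\lambda,\nu)}, \qquad
Z(\lambda,\nu) = \sum_{j=0}^{\infty} \frac{\lambda^{j}}{(j!)^{\nu}}, \qquad y = 0, 1, 2, \ldots
```

Values of the dispersion parameter ν larger than 1 correspond to under-dispersion, consistent with the estimate of about 2.14 reported above.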

Keywords: Breakdowns, under-dispersion, com-poisson, generalized linear model, marginal quasi-likelihood estimation, joint quasi-likelihood estimation.

235 Image Mapping with Cumulative Distribution Function for Quick Convergence of Counter Propagation Neural Networks in Image Compression

Authors: S. Anna Durai, E. Anna Saro

Abstract:

In general, the images used for compression are of different types, such as dark images, high-intensity images, etc. When these images are compressed using a Counter Propagation Neural Network, it takes a long time to converge. The reason is that a given image may contain a number of distinct gray levels with only a narrow difference from their neighbouring pixels. If the gray levels of the pixels in an image and of their neighbours are mapped in such a way that the difference in gray level between a pixel and its neighbours is minimized, then both the compression ratio and the convergence of the network can be improved. To achieve this, a Cumulative Distribution Function is estimated for the image and used to map the image pixels. When the mapped image pixels are used, the Counter Propagation Neural Network yields a high compression ratio and converges quickly.
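
A minimal numpy sketch of the pixel-mapping step only (estimate the empirical CDF of the grey levels and remap every pixel through it, which amounts to histogram equalization); the network training is omitted and the details here are assumptions rather than the authors' code:

```python
import numpy as np

def cdf_map(image: np.ndarray, levels: int = 256) -> np.ndarray:
    """Map pixel grey levels through the image's empirical CDF so that
    neighbouring grey levels are spread more evenly before compression."""
    hist, _ = np.histogram(image.flatten(), bins=levels, range=(0, levels))
    cdf = hist.cumsum() / hist.sum()                # empirical cumulative distribution
    mapped = np.round(cdf[image] * (levels - 1))    # remap every pixel through the CDF
    return mapped.astype(np.uint8)

# Usage on a hypothetical 8-bit grayscale image
img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
print(cdf_map(img).min(), cdf_map(img).max())
```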

Keywords: Correlation, Counter Propagation Neural Networks, Cumulative Distribution Function, Image compression.

234 Design Alternatives for Lateral Force-Resisting Systems of Tall Buildings in Dubai, UAE

Authors: Mohammad AlHamaydeh, Sherif Yehia, Nader Aly, Ammar Douba, Layane Hamzeh

Abstract:

Four design alternatives for lateral force-resisting systems of tall buildings in Dubai, UAE are presented, and quantitative comparisons between the different designs are made. This paper is intended to provide different feasible lateral systems to be used in Dubai in light of the available seismic hazard studies of the UAE. The different lateral systems are chosen in conformance with the International Building Code (IBC). Moreover, the expected behavior of each system is highlighted, and some of the cost implications associated with lateral system selection are discussed.

Keywords: Concrete, Dual, Dubai UAE Seismicity, Special Moment-Resisting Frames (SMRF), Special Shear Wall, Steel

233 A New Method in Short-Term Heart Rate Variability — Five-Class Density Histogram

Authors: Liping Li, Ke Li, Changchun Liu, Chengyu Liu, Yuanyang Li

Abstract:

A five-class density histogram with an index named cumulative density was proposed to analyze short-term HRV. 150 subjects participated in the test, falling into three groups of equal size: the healthy young group (Young), the healthy old group (Old), and the group of patients with congestive heart failure (CHF). Results of multiple comparisons showed significant differences in the cumulative density among the three groups, with values of 0.0238 for Young, 0.0406 for Old and 0.0732 for CHF (p<0.001). After 7 days and 14 days, 46 subjects from the Young and Old groups were retested twice following the same test protocol. Results showed good-to-excellent intraclass correlations (ICC=0.783, 95% confidence interval 0.676-0.864). Bland-Altman plots were used to reexamine the test-retest reliability. In conclusion, the proposed method could be a valid and reliable method for short-term HRV assessment.

Keywords: Autonomic nervous system, congestive heart failure, heart rate variability, histogram.

232 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano

Authors: Guo Wenyu, Qu Youli

Abstract:

A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for the CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the existing data in terms of compression ratio, count time and locate time, except for evenly distributed data such as the proteins data. The experiments show that the distribution of φ matters more to the compression ratio than the alphabet size: unevenly distributed φ values yield a better compression effect, and the larger the number of hits, the longer the count and locate times.

Keywords: Compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA.

231 Underlying Cognitive Complexity Measure Computation with Combinatorial Rules

Authors: Benjapol Auprasert, Yachai Limpiyakorn

Abstract:

Measuring the complexity of software has been an insoluble problem in software engineering. Complexity measures can be used to predict critical information about the testability, reliability and maintainability of software systems from automatic analysis of the source code. During the past few years, many complexity measures have been invented based on the emerging Cognitive Informatics discipline. These software complexity measures, including cognitive functional size, are based on the total cognitive weights of basic control structures such as loops and branches. This paper shows that the existing calculation method can generate different results that are algebraically equivalent. However, analysis of the combinatorial meaning of this calculation method reveals a significant flaw of the measure, which also explains why it does not satisfy Weyuker's properties. Based on the findings, improvement directions, such as measure fusion and a cumulative variable counting scheme, are suggested to enhance the effectiveness of cognitive complexity measures.
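
As an illustration of the kind of calculation being critiqued, a minimal sketch following the commonly used convention that weights of sequentially adjacent basic control structures are added while those of nested structures are multiplied; the weight values are the usual published ones and the toy program structure is hypothetical:

```python
# Commonly published cognitive weights for basic control structures (illustrative subset)
WEIGHTS = {"sequence": 1, "branch": 2, "case": 3, "for": 3, "while": 3, "call": 2, "recursion": 3}

def cognitive_weight(node):
    """node = (bcs_name, [nested children]); nesting multiplies, adjacency adds."""
    name, children = node
    inner = sum(cognitive_weight(c) for c in children) if children else 1
    return WEIGHTS[name] * inner

# A for-loop containing a branch followed by a plain sequence: 3 * (2 + 1) = 9
program = ("for", [("branch", []), ("sequence", [])])
print(cognitive_weight(program))
```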

Keywords: Cognitive Complexity Measure, Cognitive Weight of Basic Control Structure, Counting Rules, Cumulative Variable Counting Scheme.

230 Effects of Microwave Heating on Biogas Production, Chemical Oxygen Demand and Volatile Solids Solubilization of Food Residues

Authors: Ackmez Mudhoo, Pravish Rye Moorateeah, Romeela Mohee

Abstract:

This paper presents the results of a preliminary investigation of microwave (MW) irradiation pretreatments on the anaerobic digestion of food residues using biochemical methane potential (BMP) assays. Low-solids systems with a total solids (TS) content ranging from 5.0% to 10.0% were analyzed. The inoculum to bulk mass of substrates to water ratio was 1:2:2 (mass basis). The experimental conditions for the pretreatments were as follows: a control (no MW irradiation), two runs with MW irradiation for 15 and 30 minutes at 320 W, and another two runs with MW irradiation at 528 W for 30 and 60 minutes. The cumulative biogas productions were 6.3 L and 8.7 L for the 15 min/320 W and 30 min/320 W MW irradiation conditions, respectively, and 10.5 L and 11.4 L for 30 min/528 W and 60 min/528 W, respectively, compared to the control, which gave 5.8 L of biogas. Both an increase in irradiation exposure time and an increase in MW power increased the rate and yield of biogas. Single-factor ANOVA tests (p<0.05) indicated that the variations in VS, TS, COD and cumulative biogas generation were significantly different across the pretreatment conditions. Results from this study indicated that MW irradiation enhanced the biogas production and the degradation of total solids, with a significant improvement in VS and COD solubilization.

Keywords: microwave irradiation, pretreatment, anaerobic digestion, food residues.

229 A Cumulative Learning Approach to Data Mining Employing Censored Production Rules (CPRs)

Authors: Rekha Kandwal, Kamal K. Bharadwaj

Abstract:

Knowledge is indispensable, but voluminous knowledge becomes a bottleneck for efficient processing. A great challenge for the data mining activity is the generation of a large number of potential rules as a result of the mining process; in fact, the result is sometimes comparable in size to the original data. Traditional data mining pruning activities, such as support, do not sufficiently reduce the huge rule space. Moreover, many practical applications are characterized by continual change of data and knowledge, thereby making the knowledge voluminous with each change. The most predominant representation of the discovered knowledge is the standard Production Rule (PR) of the form 'If P Then D'. Michalski and Winston proposed Censored Production Rules (CPRs) as an extension of production rules that exhibit variable precision and support an efficient mechanism for handling exceptions. A CPR is an augmented production rule of the form 'If P Then D Unless C', where C (the censor) is an exception to the rule. Such rules are employed in situations in which the conditional statement 'If P Then D' holds frequently and the assertion C holds rarely. By using a rule of this type, we are free to ignore the exception condition when the resources needed to establish its presence are tight or when there is simply no information available as to whether it holds or not. Thus the 'If P Then D' part of the CPR expresses important information, while the 'Unless C' part acts only as a switch that changes the polarity of D to ~D. In this paper, a scheme based on a Dempster-Shafer Theory (DST) interpretation of a CPR is suggested for discovering CPRs from the discovered flat PRs. The discovery of CPRs from flat rules results in a considerable reduction of the already discovered rules. The proposed scheme incrementally incorporates new knowledge and also reduces the size of the knowledge base considerably with each episode. Examples are given to demonstrate the behaviour of the proposed scheme. The suggested cumulative learning scheme would be useful in mining data streams.
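
A minimal sketch of the CPR form described above and of the switch-like behaviour of the censor (illustrative only; the DST-based discovery scheme itself is not reproduced here, and the example rule is hypothetical):

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class CPR:
    """Censored Production Rule: If P Then D Unless C."""
    premise: Callable[[dict], bool]          # P
    decision: str                            # D
    censor: Callable[[dict], bool]           # C (the exception)

    def fire(self, facts: dict, check_censor: bool = True) -> Optional[str]:
        if not self.premise(facts):
            return None
        # When resources are tight, the censor may be ignored (variable precision)
        if check_censor and self.censor(facts):
            return "not " + self.decision    # the censor flips the polarity of D
        return self.decision

# Hypothetical rule: If it is a bird Then it flies Unless it is a penguin
rule = CPR(lambda f: f.get("bird", False), "flies", lambda f: f.get("penguin", False))
print(rule.fire({"bird": True}))                                        # flies
print(rule.fire({"bird": True, "penguin": True}))                       # not flies
print(rule.fire({"bird": True, "penguin": True}, check_censor=False))   # flies (censor skipped)
```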

Keywords: Censored production rules, cumulative learning, data mining, machine learning.

228 Microneedles-Mediated Transdermal Delivery

Authors: M. Petchsangsai, N. Wonglertnirant, T. Rojanarata, P. Opanasopit, T. Ngawhirunpat

Abstract:

The objective of the present study was to evaluate the potential of hollow microneedles for enhancing the transdermal delivery of a Bovine Serum Albumin (MW ~66,000 Da)-Fluorescein Isothiocyanate (BSA-FITC) conjugate, a hydrophilic compound of large molecular weight. Moreover, the effect of different formulations was evaluated. A series of binary mixtures composed of propylene glycol (PG) and pH 7.4 phosphate buffer solution (PBS) was prepared and used as the medium for BSA-FITC. The results showed that there was no permeation of the BSA-FITC solution across neonatal porcine skin without hollow microneedles, whereas the cumulative amount of BSA-FITC released through the skin at 8 h was about 60-70% when hollow microneedles were used. Furthermore, the results demonstrated that the higher the proportion of PG in the injected binary mixture, the lower the cumulative amount of BSA-FITC released and the lower its release rate from the skin. These release profiles of BSA-FITC in the binary mixtures were described by Fick's law of diffusion. These results suggest the utility of hollow microneedles for enhancing the transdermal delivery of proteins and provide useful information for designing an effective hollow microneedle system.
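
For reference, the relation invoked above is Fick's first law of diffusion, which for steady-state permeation across a membrane of thickness h gives the familiar flux expression (a textbook statement, not an equation reproduced from the paper):

```latex
J = -D\,\frac{\partial C}{\partial x}
\qquad\Longrightarrow\qquad
J_{\mathrm{ss}} = \frac{D\,K\,\Delta C}{h}
```

where D is the diffusion coefficient, K the partition coefficient and ΔC the concentration difference across the skin.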

Keywords: Hydrophilic macromolecules, Microneedles, Propylene glycol, Transdermal drug delivery

227 Contrast Enhancement of Color Images with Color Morphing Approach

Authors: Javed Khan, Aamir Saeed Malik, Nidal Kamel, Sarat Chandra Dass, Azura Mohd Affandi

Abstract:

Low-contrast images can result from incorrect image acquisition settings or poor illumination conditions. Such images may not be visually appealing, and feature extraction from them can be difficult. Contrast enhancement of color images can be useful for visual inspection in the medical domain. In this paper, a new technique is proposed to improve the contrast of color images. The RGB (red, green, blue) color image is transformed into the normalized RGB color space. An adaptive histogram equalization technique is applied to each of the three channels of the normalized RGB color space. The corresponding channels in the original (low-contrast) image and in the image enhanced with adaptive histogram equalization (AHE) are morphed together in proper proportions. The proposed technique is tested on seventy color images of acne patients. The results of the proposed technique are analyzed using the cumulative variance and contrast improvement factor measures, and are also compared with decorrelation stretch. Both subjective and quantitative analyses demonstrate that the proposed technique outperforms the other techniques.
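
A minimal sketch of the pipeline as described (normalized RGB, adaptive histogram equalization per channel, then morphing of the original and enhanced channels); the use of scikit-image's CLAHE and the blending weight alpha are assumptions, not the authors' implementation:

```python
import numpy as np
from skimage import exposure

def enhance(rgb: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """rgb: float image in [0, 1]; alpha: proportion of the AHE-enhanced image in the morph."""
    # Normalized RGB: each channel divided by the sum of the three channels
    s = rgb.sum(axis=2, keepdims=True) + 1e-8
    norm = rgb / s
    # Adaptive histogram equalization applied to each normalized channel
    ahe = np.stack([exposure.equalize_adapthist(norm[..., c]) for c in range(3)], axis=2)
    # Morph (blend) the original and the enhanced channels in the chosen proportion
    return np.clip((1 - alpha) * rgb + alpha * ahe, 0, 1)

img = np.random.rand(64, 64, 3)      # stand-in for a low-contrast color image
print(enhance(img).shape)
```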

Keywords: Contrast enhancement, normalized RGB, adaptive histogram equalization, cumulative variance.

226 Comparison of Frequency Converter Outages: A Case Study on the Swedish TPS System

Authors: Y. A. Mahmood, A. Ahmadi, R. Karim, U. Kumar, A.K. Verma, N. Fransson

Abstract:

The purpose of this paper is to analyze the unavailability of the two main types of converters in the Swedish traction power supply (TPS) system, i.e. the rotary converter and the static converter. The number of outages and the outage durations are used to analyze and compare the unavailability of the converters. The mean cumulative function (MCF) is applied to analyze the number of outages and the unavailability, while the forced outage rate (FOR) concept is used to analyze the outage rates. The study shows that outages due to failure occur at a constant rate by calendar time in most converter stations, while very few stations have a decreasing rate. It has also been found that the static converters have a higher number of outages and a higher outage rate compared to the rotary converter type. The results indicate that combining the number of outages and the forced outage rate gives a better view of converter performance and supports the maintenance decision; using either index alone does not reflect reality. Comparing these two indexes helps in identifying the areas where extra resources are needed for maintenance planning and where improvements can reduce outages in the TPS system.

Keywords: Frequency Converter, Forced Outage Rate, Mean Cumulative Function, Traction Power Supply, Electrified Railway Systems.

225 A Simple QoS Scheduler for Mobile WiMAX

Authors: Komala Kalyanam, Pushpam Indumathi

Abstract:

WiMAX, defined as Worldwide Interoperability for Microwave Access by the WiMAX Forum (formed in June 2001 to promote conformance and interoperability of the IEEE 802.16 standard, officially known as WirelessMAN), offers the attractive features of very high throughput and broadband wireless access over long distances. A detailed simulation environment is demonstrated with the UGS, nrtPS and ertPS service classes, evaluating throughput, delay and packet delivery ratio for a mixed environment of fixed and mobile WiMAX. A simple mobility aspect is considered for the mobile WiMAX, and the PMP mode of transmission is considered in TDD mode. The Network Simulator 2 (NS-2) is used to simulate the WiMAX network scenario. A simple Priority Scheduler and a Weighted Round Robin Scheduler are the WiMAX schedulers used in this research work.
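
A minimal sketch of one weighted round robin pass over per-class queues (purely illustrative; the queue contents and weights are hypothetical and do not come from the NS-2 scenarios used in the paper):

```python
from collections import deque

# Hypothetical per-service-class queues and their WRR weights
queues = {"UGS": deque(["u1", "u2", "u3"]), "ertPS": deque(["e1", "e2"]), "nrtPS": deque(["n1"])}
weights = {"UGS": 3, "ertPS": 2, "nrtPS": 1}

def wrr_round(queues, weights):
    """One weighted round robin cycle: serve up to `weight` packets from each class queue."""
    served = []
    for cls, w in weights.items():
        for _ in range(w):
            if queues[cls]:
                served.append((cls, queues[cls].popleft()))
    return served

print(wrr_round(queues, weights))
# [('UGS', 'u1'), ('UGS', 'u2'), ('UGS', 'u3'), ('ertPS', 'e1'), ('ertPS', 'e2'), ('nrtPS', 'n1')]
```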

Keywords: ertPS, Mobile WiMAX, scheduler.

224 Power and Delay Optimized Graph Representation for Combinational Logic Circuits

Authors: Padmanabhan Balasubramanian, Karthik Anantha

Abstract:

Structural representation and technology mapping of a Boolean function is an important problem in the design of non-regenerative digital logic circuits (also called combinational logic circuits). Library-aware function manipulation offers a solution to this problem. Compact multi-level representations of binary networks based on simple circuit structures, such as AND-Inverter Graphs (AIG) [1] [5], NAND Graphs, OR-Inverter Graphs (OIG), AND-OR Graphs (AOG), AND-OR-Inverter Graphs (AOIG), AND-XOR-Inverter Graphs and Reduced Boolean Circuits [8], do exist in the literature. In this work, we discuss a novel and efficient graph realization for combinational logic circuits, represented using a NAND-NOR-Inverter Graph (NNIG), which is composed of only two-input NAND (NAND2), NOR (NOR2) and inverter (INV) cells. The networks are constructed on the basis of irredundant disjunctive and conjunctive normal forms, after factoring, comprising terms with minimum support. Construction of an NNIG for a non-regenerative function in normal form is straightforward, whereas for the complementary phase it is developed by considering a virtual instance of the function. However, the choice of the best NNIG for a given function is based upon the literal count, cell count and DAG node count of the implementation at the technology-independent stage. In case of a tie, the final decision is made after extracting the physical design parameters. We have considered the AIG representation for the reduced disjunctive normal form and the best of OIG/AOG/AOIG for the minimized conjunctive normal forms. This is necessitated by the nature of certain functions, such as Achilles' heel functions. NNIGs are found to exhibit a 3.97% lower node count compared to AIGs and OIG/AOG/AOIGs, and to consume 23.74% and 10.79% fewer library cells than AIGs and OIG/AOG/AOIGs, respectively, for the various samples considered. We compare the power efficiency and delay improvement achieved by optimal NNIGs over minimal AIGs and OIG/AOG/AOIGs for various case studies. In comparison with functionally equivalent, irredundant and compact AIGs, NNIGs report mean savings in power and delay of 43.71% and 25.85%, respectively, after technology mapping with a 0.35 micron TSMC CMOS process. In comparison with OIG/AOG/AOIGs, NNIGs demonstrate average savings in power and delay of 47.51% and 24.83%. With respect to the device count needed for implementation in static CMOS logic style, NNIGs utilize 37.85% and 33.95% fewer transistors than their AIG and OIG/AOG/AOIG counterparts.

Keywords: AND-Inverter Graph, OR-Inverter Graph, Directed Acyclic Graph, Low power design, Delay optimization.

223 Energy Loss Reduction in Oil Refineries through Flare Gas Recovery Approaches

Authors: Majid Amidpour, Parisa Karimi, Marzieh Joda

Abstract:

In recent years, the release of undesirable combustion by-products has become a challenging issue in the oil industry. Flaring, one of the main sources of air contamination, has detrimental and long-lasting effects on human health and is considered a substantial source of energy losses worldwide. This research studies the implications of two main flare gas recovery methods at three oil refineries, all in Iran, referred to as case I, case II and case III, whose production capacities increase in that order. In the proposed methods, flare gases are converted into more valuable products before being combusted by the flare networks. The first approach involves collecting, compressing and converting the flare gas into smokeless fuel that can be used in the fuel gas system of the refineries. The other scenario consists of utilizing the flare gas as a feed to a liquefied petroleum gas (LPG) production unit already established in the refineries. The processes of these scenarios are simulated, and the capital investment is calculated for each procedure. The cumulative profits of the scenarios are evaluated using the Net Present Value method. Furthermore, a sensitivity analysis based on the total propane and butane mole fraction is carried out to make a rational comparison for the LPG production approach, and the results are illustrated for different mole fractions of propane and butane. As the mole fractions of propane and butane contained in LPG differ between the summer and winter seasons, the results corresponding to the LPG scenario are presented for each season. The simulations show that the cumulative profit in the fuel gas production scenario and the LPG production rate increase with the capacity of the refineries. Moreover, the investment return time in the LPG production method experiences a decline, followed by a rising trend, with an increase in C3 and C4 content. The minimum return time occurs at propane and butane sum concentration values of 0.7, 0.6 and 0.7 in cases I, II and III, respectively. Based on a comparison of investment return time and cumulative profit, fuel gas production is the superior scenario for the three case studies.
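
For reference, a minimal sketch of the Net Present Value calculation used to compare the cumulative profits of the scenarios; the discount rate and cash flows below are hypothetical, not figures from the study:

```python
def npv(rate: float, cash_flows: list[float]) -> float:
    """Net Present Value: cash_flows[0] is the initial (usually negative) capital investment."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical flare-gas-recovery scenario: capital cost followed by yearly profits
print(round(npv(0.10, [-1_000_000, 300_000, 300_000, 300_000, 300_000, 300_000]), 2))
```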

Keywords: Flare gas reduction, liquefied petroleum gas, fuel gas, net present value method, sensitivity analysis.

222 The Effect of Nano-Silver Packaging on Quality Maintenance of Fresh Strawberry

Authors: Naser Valipour Motlagh, Majid Aliabadi, Elnaz Rahmani, Samira Ghorbanpour

Abstract:

Strawberry is one of the most favored fruits around the world. However, due to its vulnerability to microbial contamination and its short storage life, there are many problems in the industrial production and transportation of this fruit. Therefore, many approaches have been tried to increase the storage life of strawberries, especially through proper packaging. This paper also addresses efficient packaging. The primary material used is produced by simple mixing of low-density polyethylene (LDPE) and silver nanoparticles in weight fractions of 0.5% and 1%, in the presence of dicumyl peroxide as a cross-linking agent. The final packages were made in a twin-screw extruder. Their effect on the quality maintenance of strawberries is then evaluated. SEM images of the nano-silver packages show the distribution of silver nanoparticles in the packages. Total bacteria count, mold, yeast and E. coli are measured for the microbial evaluation of all samples. Texture, color, appearance, odor, taste and overall acceptance of the various samples are evaluated by trained panelists based on the 9-point hedonic scale method. The results show a decrease in total bacteria count and mold in the nano-silver packages compared to samples packed in plain polyethylene packages for the same storage time. The optimum concentration of silver nanoparticles for the lowest bacteria count and mold is predicted to be around 0.5%, which also attained the highest acceptance from the panelists. Moreover, the organoleptic properties of the strawberries are preserved for a longer period in the nano-silver packages. It can be concluded that using silver nanoparticles in strawberry packages improves the storage life and quality maintenance of the fruit.

Keywords: Antimicrobial properties, polyethylene, silver nanoparticles, strawberry.

221 Effect of Progressive Type-I Right Censoring on Bayesian Statistical Inference of Simple Step–Stress Acceleration Life Testing Plan under Weibull Life Distribution

Authors: Saleem Z. Ramadan

Abstract:

This paper discusses the effects of using progressive Type-I right censoring on the design of simple step-stress accelerated life testing using a Bayesian approach for Weibull life products under the assumption of the cumulative exposure model. The optimization criterion used in this paper is to minimize the expected pre-posterior variance of the Pth percentile of the time to failure. The model variables are the stress-changing time and the stress value for the first step. A comparison between conventional and progressive Type-I right censoring is provided. The results show that progressive Type-I right censoring reduces the cost of testing at the expense of test precision when the sample size is small. Moreover, the results show that using strong priors or a large sample size reduces the sensitivity of the test precision to the censoring proportion. Hence, progressive Type-I right censoring is recommended in these cases, as it reduces the cost of the test without greatly affecting its precision. Moreover, the results show that the use of direct or indirect priors affects the precision of the test.
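
For reference, under the cumulative exposure model a simple (two-step) step-stress test with Weibull lifetimes, common shape β, step scale parameters θ1 and θ2 and stress-change time τ has the standard distribution function (a textbook restatement, not an equation copied from the paper):

```latex
F(t)=
\begin{cases}
1-\exp\!\left[-\left(\dfrac{t}{\theta_1}\right)^{\beta}\right], & 0 \le t < \tau,\\[2ex]
1-\exp\!\left[-\left(\dfrac{t-\tau}{\theta_2}+\dfrac{\tau}{\theta_1}\right)^{\beta}\right], & t \ge \tau,
\end{cases}
```

so that the exposure accumulated during the first step carries over into the second step.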

Keywords: Reliability, Accelerated life testing, Cumulative exposure model, Bayesian estimation, Progressive Type-I censoring, Weibull distribution.

220 Leachate Generation from Landfill Lysimeter using Different Types of Soil Cover

Authors: S. Karnchanawong, P. Yongpisalpop

Abstract:

The objective of this study is to determine the effects of soil cover type on the characteristics of leachates generated from landfill lysimeters. Four lysimeters with a diameter of 0.15 m and a height of 3.00 m were prepared. Three lysimeters were filled with municipal waste and covered with three different soil types, i.e. sandy loam soil, silty loam soil and clay soil, while the fourth lysimeter was filled solely with municipal waste. The study was conducted in the rainy season. Leachate quantities were measured every day and leachate characteristics were determined once a week. The cumulative leachate quantity from the lysimeter filled solely with municipal waste was found to be around 27% higher than that from the lysimeters using cover soils. There were no differences in the cumulative leachate amounts generated from the lysimeters using the three types of soil. A comparison of the total mass of pollutants generated from all lysimeters showed that the lysimeter filled solely with municipal waste generated the largest quantities of pollutants. Among the lysimeters using different soil types, the lysimeter using sandy loam soil generated the lowest amounts of most pollutants, compared with the lysimeters using silty loam and clay soils. It can be concluded that, in terms of pollutant attenuation in the leachate, sandy loam is the most suitable soil to be used as a cover soil in a landfill.

Keywords: cover soil, leachate, sandy loam soil, silty loam soil, clay soil.

219 Revisiting the Concept of Risk Analysis within the Context of Geospatial Database Design: A Collaborative Framework

Authors: J. Grira, Y. Bédard, S. Roche

Abstract:

The aim of this research is to design a collaborative framework that integrates risk analysis activities into the geospatial database design (GDD) process. Risk analysis is rarely undertaken iteratively as part of present GDD methods, in conformance with requirements engineering (RE) guidelines and risk standards. Accordingly, when risk analysis is performed during the GDD, some foreseeable risks may be overlooked and may not reach the output specifications, especially when user intentions are not systematically collected. This may lead to ill-defined requirements and ultimately to higher risks of geospatial data misuse. The adopted approach consists of 1) reviewing the risk analysis process within the scope of RE and GDD, 2) analyzing the challenges of risk analysis within the context of GDD, and 3) presenting the components of a risk-based collaborative framework that improves the collection of the intended/forbidden usages of the data and helps geo-IT experts discover implicit requirements and risks.

Keywords: Collaborative risk analysis, intention of use, Geospatial database design, Geospatial data misuse.

218 Physicochemical and Microbiological Properties of Kefir, Kefir Yogurt and Chickpea Yogurt

Authors: Nuray Güzeler, Elif Ari, Gözde Konuray, Çağla Özbek

Abstract:

The consumption of functional foods is very common. For this reason, many probiotic, prebiotic, energy-reduced and fat-reduced products have been developed. In this research, the physicochemical and microbiological properties of functional kefir, kefir yogurt and chickpea yogurt were examined. For this purpose, the pH values, titration acidities, viscosity values, water holding capacities, serum separation values, acetaldehyde contents, tyrosine contents, aerobic mesophilic bacteria counts, lactic acid bacteria counts and mold-yeast counts were determined. According to the analyses performed, the differences between the titration acidities, serum separation values, water holding capacities, acetaldehyde contents and tyrosine contents of the samples were statistically significant (p < 0.05). There were no significant differences in the pH values, viscosities and microbiological properties of the samples (p > 0.05). Consequently, the industrial production of functional kefir yogurt and chickpea yogurt may be recommended.

Keywords: Chickpea yogurt, kefir, kefir yogurt, milk.

217 The Relations of Volatile Compounds, Some Parameters and Consumer Preference of Commercial Fermented Milks in Thailand

Authors: Suttipong Phosuksirikul, Rawichar Chaipojjana, Arunsri Leejeerajumnean

Abstract:

The aim of this research was to define the relations between volatile compounds, some parameters (pH, titratable acidity (TA), total soluble solids (TSS), lactic acid bacteria (LAB) count) and consumer preference for commercial fermented milks. These relations are intended to be used for controlling and developing new fermented milk products. Three leading commercial brands of fermented milk in Thailand were evaluated by consumers (n=71) using a hedonic scale for four attributes (sweetness, sourness, flavour and overall liking), and characterized for volatile compounds using headspace solid-phase microextraction (HS-SPME) GC-MS, pH, TA, TSS and LAB count. The relations were then analyzed by principal component analysis (PCA). The PCA showed that the liking scores of all four attributes were related to each other, and that they were also related to TA, TSS and the volatile compounds. The related volatile compounds were mainly fermentation products, including acetic acid, furanmethanol, furfural and octanoic acid, together with volatiles known as artificial fruit flavours (beta-pinene, limonene, vanillin and ethyl vanillin). These compounds provide information about flavour additions in commercial fermented milk in Thailand.

Keywords: Fermented milk, volatile compounds, preference, PCA.

216 On the Numbers of Various Young Tableaux

Authors: Hsuan-Chu Li

Abstract:

We demonstrate a way to count the number of Young tableaux of shape λ = (k, k, ..., k) with |λ| = lk by expanding the Schur function. This result gives an answer to a question posed by Jenny Buontempo and Brian Hopkins.
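
For a related reference point, the hook length formula also counts the standard Young tableaux of a given shape; a minimal sketch for the rectangular shape λ = (k, ..., k) with l rows is given below as background, not as the Schur-function expansion used in the paper:

```python
from math import factorial

def syt_rectangle(l: int, k: int) -> int:
    """Number of standard Young tableaux of the l x k rectangular shape, via hook lengths."""
    hooks = 1
    for i in range(l):                # hook of cell (i, j) = arm + leg + 1
        for j in range(k):
            hooks *= (k - 1 - j) + (l - 1 - i) + 1
    return factorial(l * k) // hooks

print(syt_rectangle(2, 2))  # 2
print(syt_rectangle(2, 3))  # 5 (two-row rectangles give Catalan numbers)
```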

Keywords: Young tableau, Schur function.

215 Study of Sickle Cell Syndromes in the Population of the Region of Batna

Authors: K. Belhadi, H. Bousselsela, M. Yahia, A. Zidani, S. Benbia

Abstract:

Sickle cell anemia is a recessive genetic disease caused by the presence, in the red blood cells, of an abnormal hemoglobin called hemoglobin S. It results from the replacement of glutamic acid by valine at position 6 of the beta chain. Subjects may be homozygous (SS) or heterozygous (AS), the latter most often asymptomatic. Other mutations result in compound heterozygotes: synthesis of hemoglobin C, a mutation at the sixth codon in which glutamic acid is replaced by lysine (heterozygous SC), and ß-thalassemia (heterozygous S-ß thalassemia). Homozygous SS, heterozygous SC and S-ß thalassemia are grouped under the major sickle cell syndromes. To perform a laboratory diagnosis of hemoglobinopathies in a portion of the population of the Batna region, our study was conducted on 115 patients with suspected sickle cell anemia. All cases benefited from hematological tests, including a blood count (RBC count, calculated erythrocyte indices MCV and MCHC, measurement of the hemoglobin concentration), and a biochemical test, in this case capillary hemoglobin electrophoresis (Capillarys). The results showed that 27 cases of sickle cell anemia were found among the 115 suspected cases, with 73.03% homozygous sickle cell disease and 59.25% sickle cell trait. Finally, the double heterozygotes S/C represent an incidence rate of 3.70%.

Keywords: Hemoglobin, sickle cell syndromes, laboratory diagnosis

214 Effect of Bacillus subtilis Pb6 on Growth and Gut Microflora in Clostridium perfringens Challenged Broilers

Authors: A. Khalique, T. Naseem, N. Haque, Z. Rasool

Abstract:

The objective of the current study was to investigate the effect of Bacillus subtilis PB6 (CloSTAT) as a probiotic in broilers. The corn-soybean based diet was divided into four treatment groups: T1 (basal diet with no probiotic and no Clostridium perfringens); T2 (basal diet challenged with C. perfringens, without probiotic); T3 (basal diet challenged with C. perfringens, with 0.05% probiotic); T4 (basal diet challenged with C. perfringens, with 0.1% probiotic). Every treatment group had four replicates with 24 birds each. Body weight and feed intake were measured on a weekly basis, while the ileal bacterial count was recorded on day 28 following the Clostridium perfringens challenge. The 0.1% probiotic treatment showed a 7.2% increase in average feed intake (P=0.05) and an 8% increase in body weight compared to T2. In the 0.1% treatment, body weight was 5% higher than in T3 (P=0.02). It was also observed that the 0.1% treatment had an improved feed conversion ratio (1.77) in the 6th week. No effect of treatment was observed on mortality or ileal bacterial count. The current study indicated that the use of 0.1% probiotic produced a positive response in C. perfringens challenged broilers.

Keywords: Bacillus subtilis PB6, antibiotic growth promoters, Clostridium perfringens, CloSTAT, broilers.

213 Exploring Students’ Self-Evaluation on Their Learning Outcomes through an Integrated Cumulative Grade Point Average Reporting Mechanism

Authors: Suriyani Ariffin, Nor Aziah Alias, Khairil Iskandar Othman, Haslinda Yusoff

Abstract:

The Integrated Cumulative Grade Point Average (iCGPA) is a mechanism and strategy to ensure that the curriculum of an academic programme is constructively aligned to the expected learning outcomes, and that student performance, based on the attainment of those learning outcomes, is reported objectively in a spider web. Much effort and time has been spent to develop a viable mechanism and to train academics to utilize the platform for reporting. The question is: how well do learners conceive the idea of their achievement via the iCGPA, and have quality learner attributes been nurtured through the iCGPA mechanism? This paper presents the architecture of an integrated CGPA mechanism purported to address a holistic evaluation, from the evaluation of course learning outcomes to the attainment of aligned programme learning outcomes. The paper then discusses the students' understanding of the mechanism and their evaluation of their achievement from the generated spider web. A set of questionnaires was distributed to a group of students with iCGPA reporting, and frequency analysis was used to compare the students' perspectives on their performance. In addition, the questionnaire explored how they conceive the idea of an integrated, holistic reporting and how it generates their motivation to improve. The iCGPA group was found to be receptive to what they had achieved throughout their study period. They agreed that the achievement levels shown in their spider webs allow them to develop interventions and enhance the programme learning outcomes before they graduate.

Keywords: Learning outcomes attainment, iCGPA, programme learning outcomes, spider web, iCGPA reporting skills.

212 Working Children and Adolescents and the Vicious Circle of Poverty from the Perspective of Gunnar Myrdal’s Theory of Circular Cumulative Causation: Analysis and Implementation of a Probit Model to Brazil

Authors: J. Leige Lopes, L. Aparecida Bastos, R. Monteiro da Silva

Abstract:

The objective of this paper is to study the work of children and adolescents and the vicious circle of poverty from the perspective of Gunnar Myrdal's Theory of Circular Cumulative Causation. The aim is to show that if a person starts working in the juvenile phase of life, they are more likely to be classified as poor or extremely poor when they are an adult, which can be observed in the case of Brazil, more specifically in the north and northeast. To do this, the methodology used was statistical and econometric analysis through the application of a probit model. The main results show that if people reside in the northeastern region of Brazil, have a low educational level and started their professional life before the age of 18, the likelihood that they will be poor or extremely poor increases. There is a consensus in the literature that one of the causes of the intergenerational transmission of poverty is related to child labor, because when people start their professional life while still in childhood or adolescence, they end up sacrificing their studies. Because of their low level of education, these children or adolescents are forced to perform low-paid jobs and to abandon school, becoming, in the future, adults who will be classified as poor or extremely poor. As a result of poverty, parents may be forced to send their children out to work when they are young, so that in the future they will also become poor adults, a process that is characterized as the "vicious circle of poverty."
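
A minimal sketch of the kind of probit estimation described, using simulated data; the variable names, coefficients and specification are hypothetical and do not reproduce the Brazilian survey data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 1000

# Hypothetical covariates: started working before age 18, years of schooling, lives in the Northeast
child_labor = rng.integers(0, 2, n)
schooling = rng.integers(0, 16, n)
northeast = rng.integers(0, 2, n)
latent = 0.8 * child_labor - 0.15 * schooling + 0.5 * northeast + rng.normal(size=n)
poor = (latent > 0).astype(int)              # 1 = classified as poor or extremely poor as an adult

X = sm.add_constant(np.column_stack([child_labor, schooling, northeast]))
model = sm.Probit(poor, X).fit(disp=False)
print(model.params)                          # positive signs on child_labor and northeast expected
```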

Keywords: Children, adolescents, Gunnar Myrdal, poverty, vicious circle.

211 The Effect of Magnetite Particle Size on Methane Production by Fresh and Degassed Anaerobic Sludge

Authors: E. Al-Essa, R. Bello-Mendoza, D. G. Wareham

Abstract:

Anaerobic batch experiments were conducted to investigate the effect of magnetite supplementation (7 mM) on methane production from digested sludge undergoing two different microbial growth phases, namely fresh sludge (exponential growth phase) and degassed sludge (endogenous decay phase). Three different particle sizes were assessed: small (50-150 nm), medium (168-490 nm) and large (800 nm - 4.5 µm) particles. Results show that, in the case of the fresh sludge, magnetite significantly enhanced the methane production rate (by up to 32%) and reduced the lag phase (by 15%-41%) compared to the control, regardless of the particle size used. However, the cumulative methane produced at the end of the incubation was comparable in all treatment and control bottles. In the case of the degassed sludge, only the medium-sized magnetite particles significantly increased the methane production rate (12% higher) compared to the control. Small and large particles had little effect on the methane production rate but did result in an extended lag phase, which led to significantly lower cumulative methane production at the end of the incubation period. These results suggest that magnetite produces a clear and positive effect on methane production only when an active and balanced microbial community is present in the anaerobic digester. It is concluded that (i) the effect of magnetite particle size on increasing the methane production rate and reducing the lag phase duration is strongly influenced by the initial metabolic state of the microbial consortium, and (ii) particle size positively affects methane production when the particles are provided within the nanometer size range.

Keywords: Anaerobic digestion, iron oxide (Fe3O4), methanogenesis, nanoparticle.
