Search results for: minimum quantity lubrication
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1201

271 Structure and Power Struggle in Contemporary Nollywood: An Ethnographic Evaluation

Authors: Ezinne M. Igwe

Abstract:

Nollywood, the segment of the Nigerian film industry that has in recent times become phenomenal largely because of its quantity of production and its specific production style, has attracted many statements of fact. In the face of recent transformations reshaping the industry, issues have arisen that have not received due academic attention from an industry-player perspective. While re-addressing issues such as structure, policy and informality, this study benefits from a new perspective: that of a community member adopting participant observation to research a familiar culture. With data drawn from an extensive ethnographic study of the industry, the paper examines these matters with an emphasis on structure and the industry's overall political economy. Drawing on discourses around the new and old Nollywood labels and other current developments within the industry, such as the MOPICON bill redraft, corporate financing and the possibilities of regeneration, the paper examines structure and power struggle within Nollywood. These developments are championing regenerative processes that bring about formalization, professionalism and the quest for a transnational presence, processes which have so far been evaluated only superficially. Focused essentially on Nollywood's political economy, the study critically analyses the transforming face of an informal industry, the consistent quest for structure, quality and standards, and issues of corporate sponsorship as possible trends of regeneration. It evaluates them as indicators of regeneration, questioning whether they can be sustained in an industry experiencing increased interaction with the formal economy and an influx of young professionals. With findings that make sustained regeneration appear both certain (because of increased interaction with the formal economy) and uncertain (because of the dysfunctionality of the society and its political system), the paper concludes that the transforming face of the industry points to its impending gentrification.

Keywords: Formalization, MOPICON, Nollywood, Structure.

270 Shape Optimization of Permanent Magnet Motors Using the Reduced Basis Technique

Authors: A. Jabbari, M. Shakeri, A. Nabavi

Abstract:

In this paper, a tooth shape optimization method for cogging torque reduction in Permanent Magnet (PM) motors is developed by using the Reduced Basis Technique (RBT) coupled with Finite Element Analysis (FEA) and Design of Experiments (DOE) methods. The primary objective of the method is to reduce the enormous number of design variables otherwise required to define the tooth shape. In RBT, the tooth shape is expressed as a weighted combination of several basis shapes, and the aim is to find the best combination by using the weight of each basis shape as a design variable. A multi-level design process is developed to find suitable basis (trial) shapes at each level for use in the reduced basis technique. Each level is treated as a separate optimization problem until the required objective, minimum cogging torque, is achieved. The process starts with geometrically simple basis shapes defined by their shape coordinates. Taguchi's experimental design method is used to build the approximation model and to perform the optimization. The method is demonstrated on the tooth shape optimization of an 8-pole/12-slot PM motor.
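
As a rough illustration of the reduced basis idea described above, the minimal sketch below combines a few hypothetical basis tooth profiles with weight design variables and evaluates a placeholder cogging-torque objective; the basis shapes, the objective function and the candidate weight levels are illustrative assumptions, not the authors' data or FEA model.

```python
import numpy as np
from itertools import product

# Hypothetical basis tooth profiles (radial coordinate vs. angular position).
theta = np.linspace(0.0, 1.0, 50)
basis_shapes = [
    np.ones_like(theta),                 # flat tooth tip
    0.5 * (1 + np.cos(np.pi * theta)),   # tapered tip
    theta ** 2,                          # asymmetric bulge
]

def combined_shape(weights):
    """Reduced basis: a candidate shape is a weighted sum of the basis shapes."""
    return sum(w * b for w, b in zip(weights, basis_shapes))

def cogging_torque(shape):
    """Placeholder objective standing in for the FEA evaluation used in the paper."""
    return np.std(np.diff(shape))        # penalise abrupt profile changes

# Coarse search over a few weight levels per basis shape (a Taguchi orthogonal
# array would sample these levels more economically than full enumeration).
levels = [0.0, 0.5, 1.0]
candidates = [w for w in product(levels, repeat=len(basis_shapes))
              if abs(sum(w) - 1.0) < 1e-9]   # keep the nominal tooth size
best = min(candidates, key=lambda w: cogging_torque(combined_shape(w)))
print("best weights:", best)
```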

Keywords: PM motor, cogging torque, tooth shape optimization, RBT, FEA, DOE.

269 Characterization of Biocomposites Based on Mussel Shell Wastes

Authors: Suheyla Kocaman, Gulnare Ahmetli, Alaaddin Cerit, Alize Yucel, Merve Gozukucuk

Abstract:

Shell wastes represent a considerable quantity of by-products in shellfish aquaculture. From the viewpoint of eco-friendly and economical disposal, it is highly desirable to convert these residues into high value-added products for industrial applications. So far, the utilization of shell wastes has been confined to relatively low-value uses, e.g. wastewater decontaminant, soil conditioner, fertilizer constituent, feed additive and liming agent. Shell wastes consist of calcium carbonate and organic matrices, with the former accounting for 95-99% by weight. Being the richest source of biogenic CaCO3, shell wastes are suitable for preparing high-purity CaCO3 powders, which are extensively applied in various industrial products such as paper, rubber, paints and pharmaceuticals. Furthermore, the shell waste can be further processed into a filler for polymer composites. This paper presents a study on the potential use of mussel shell waste as a biofiller to produce composite materials with different epoxy matrices, such as bisphenol-A type, CTBN-modified and polyurethane-modified epoxy resins. The morphology and mechanical properties of the shell-particle-reinforced epoxy composites were evaluated to assess the possibility of using them as a new material, and the effects of shell particle content on the mechanical properties of the composites were investigated. In all composites, the tensile strength and Young's modulus increase as the mussel shell particle content rises from 10 wt% to 50 wt%, while the elongation at break decreases compared to the pure epoxy resin. The highest Young's modulus values were determined for the bisphenol-A type epoxy composites.

Keywords: Biocomposite, epoxy resin, mussel shell, mechanical properties.

268 Reliability Analysis of P-I Diagram Formula for RC Column Subjected to Blast Load

Authors: Masoud Abedini, Azrul A. Mutalib, Shahrizan Baharom, Hong Hao

Abstract:

This study investigates the reliability of the pressure-impulse (P-I) equations for reinforced concrete (RC) columns proposed in previous studies. The equations involve three damage criterion levels, D = 0.2, D = 0.5 and D = 0.8: damage between 0 and 0.2 is classified as minor, 0.2-0.5 as moderate, 0.5-0.8 as high, and 0.8-1 as failure of the structure. Two types of reliability analysis were conducted. The first uses the pressure-impulse equation with different parameters: concrete strength; column depth, width and height; longitudinal reinforcement ratio; and transverse reinforcement ratio. From this first analysis, a new equation is derived to improve on the previous equations. The second reliability analysis uses three types of columns, whose P-I diagrams are derived with the new equation and compared with the equations of other researchers and with the Federal Emergency Management Agency (FEMA) graph of minimum standoff versus weapon yield. The results show that the derived equation agrees more closely with the FEMA standard than those of previous researchers.
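
For reference, the damage classification stated above maps directly onto a small helper like the sketch below; the function name and the inclusive boundary handling are ours, not the authors'.

```python
def damage_category(d):
    """Classify the damage index D of an RC column under blast load.

    Boundaries follow the abstract: 0-0.2 minor, 0.2-0.5 moderate,
    0.5-0.8 high damage, 0.8-1 failure.
    """
    if not 0.0 <= d <= 1.0:
        raise ValueError("damage index must lie in [0, 1]")
    if d <= 0.2:
        return "minor"
    if d <= 0.5:
        return "moderate"
    if d <= 0.8:
        return "high"
    return "failure"

print(damage_category(0.65))  # -> "high"
```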

Keywords: Blast load, RC column, P-I curve, Analytical formulae, Standard FEMA.

267 Transient Stability Assessment Using Fuzzy SVM and Modified Preventive Control

Authors: B. Dora Arul Selvi, N. Kamaraj

Abstract:

Transient stability is an important issue in power system planning, operation and extension. The objective of transient stability analysis is not merely to detect or evaluate transient instability; it is equally important to complement detection with fast and efficient control measures that ensure system security. This paper presents a new Fuzzy Support Vector Machine (FSVM) approach to investigate the stability status of power systems, together with a modified generation rescheduling scheme that brings the identified unstable cases back to a more economical and stable operating point. FSVM improves the traditional Support Vector Machine (SVM) by attaching a fuzzy membership to each training sample to indicate the degree to which the sample belongs to the different classes. The preventive control, based on economic generator rescheduling, avoids instability of the power system with minimum change in operating cost under disturbed conditions. Numerical results on the New England 39-bus test system show the effectiveness of the proposed method.
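
A minimal way to mimic the fuzzy-membership idea with off-the-shelf tools is to weight each training sample when fitting a standard SVM, as in the sketch below. The membership rule used here (distance to the class mean) and the toy data are assumptions for illustration and not necessarily the paper's membership function or stability features.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy stability data: rows are operating points, labels 1 = stable, 0 = unstable.
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(2, 1, (50, 4))])
y = np.array([1] * 50 + [0] * 50)

# Fuzzy membership: samples far from their class mean get lower weight,
# so borderline or noisy points influence the decision boundary less.
memberships = np.empty(len(y))
for label in np.unique(y):
    idx = y == label
    dist = np.linalg.norm(X[idx] - X[idx].mean(axis=0), axis=1)
    memberships[idx] = 1.0 - dist / (dist.max() + 1e-9)

clf = SVC(kernel="rbf", C=10.0)
clf.fit(X, y, sample_weight=memberships)   # sample weights act as fuzzy memberships
print("training accuracy:", clf.score(X, y))
```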

Keywords: Fuzzy Support Vector Machine (FSVM), Incremental Cost, Preventive Control, Transient stability

266 Network Reconfiguration for Load Balancing in Distribution System with Distributed Generation and Capacitor Placement

Authors: T. Lantharthong, N. Rugthaicharoencheep

Abstract:

This paper presents an efficient algorithm for the optimization of radial distribution systems by network reconfiguration to balance feeder loads and eliminate overload conditions. A system load-balancing index is used to determine the loading conditions of the system and the maximum system loading capacity; the optimal reconfiguration for load balancing is the one that minimizes this index. A method based on the Tabu search algorithm is employed to search for the optimal network reconfiguration. The basic idea behind the search is to move from a current solution to its neighborhood while effectively utilizing a memory to provide an efficient search for optimality; it requires low computational effort and is able to find good-quality configurations. Simulation results are presented for a radial 69-bus system with distributed generation and capacitor placement. The results show that the optimal on/off patterns of the switches can be identified to give the best network reconfiguration, balancing the feeder loads while respecting all the constraints.
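
The Tabu search move described above can be sketched as follows; the load-balancing index, the neighbourhood of single switch toggles and the tabu tenure are placeholders standing in for the paper's full formulation with radiality and voltage constraints.

```python
import random

def tabu_search(initial_switches, balance_index, iterations=200, tenure=7):
    """Generic Tabu search over open/closed switch states.

    `initial_switches` is a tuple of 0/1 states; `balance_index` scores a
    configuration (lower is better). Feasibility checks (radiality, voltage
    limits) would be enforced inside `balance_index` in a real implementation.
    """
    current = best = tuple(initial_switches)
    best_cost = balance_index(best)
    tabu = {}                                 # move index -> iteration until which it is tabu
    for it in range(iterations):
        # Neighbourhood: toggle one switch at a time.
        neighbours = []
        for i in range(len(current)):
            cand = list(current)
            cand[i] ^= 1
            neighbours.append((i, tuple(cand)))
        # Best non-tabu move; aspiration: accept a tabu move if it beats the best so far.
        for move, cand in sorted(neighbours, key=lambda m: balance_index(m[1])):
            cost = balance_index(cand)
            if tabu.get(move, -1) < it or cost < best_cost:
                current = cand
                tabu[move] = it + tenure
                if cost < best_cost:
                    best, best_cost = cand, cost
                break
    return best, best_cost

# Toy usage with a made-up index: prefer half the switches open.
toy_index = lambda s: abs(sum(s) - len(s) // 2) + random.random() * 0.01
print(tabu_search((1,) * 10, toy_index))
```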

Keywords: Network reconfiguration, Distributed generation, Capacitor placement, Load balancing, Optimization technique

265 A New Approach to Face Recognition Using Dual Dimension Reduction

Authors: M. Almas Anjum, M. Younus Javed, A. Basit

Abstract:

In this paper, a new approach to face recognition is presented that achieves dual dimension reduction, making the system computationally efficient, improving recognition results and outperforming the common DCT technique of face recognition. In pattern recognition, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results change with face image resolution and are optimal at a certain resolution level. In the proposed model, an image decimation algorithm is first applied to the face image for dimension reduction down to the resolution level that provides the best recognition results. The Discrete Cosine Transform (DCT) is then applied to the face image because of its computational speed and feature extraction potential. A subset of DCT coefficients from low to mid frequencies, which represents the face adequately and provides the best recognition results, is retained. A tradeoff between the decimation factor, the number of DCT coefficients retained and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase its robustness against variations in pose and illumination level. The new model has been tested on different databases, which include the ORL, Yale and EME color databases.
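
A bare-bones version of the dual reduction (decimation followed by retaining low-to-mid-frequency DCT coefficients) might look like the sketch below; the decimation factor, coefficient block size and dummy image size are illustrative choices, not the values tuned in the paper.

```python
import numpy as np
from scipy.fft import dctn

def face_features(img, decimation=2, block=16):
    """Decimate the image, take its 2-D DCT and keep a low/mid-frequency block."""
    small = img[::decimation, ::decimation].astype(float)    # dimension reduction 1
    coeffs = dctn(small, norm="ortho")                       # 2-D DCT
    feat = coeffs[:block, :block].copy()                     # dimension reduction 2
    feat[0, 0] = 0.0          # drop the DC term (overall illumination level)
    return feat.ravel()

face = np.random.default_rng(1).integers(0, 256, (112, 92))  # ORL-sized dummy face
print(face_features(face).shape)   # -> (256,)
```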

Keywords: Biometrics, DCT, Face Recognition, Illumination, Computation, Feature extraction.

264 Determination of Safety Distance Around Gas Pipelines Using Numerical Methods

Authors: Omid Adibi, Nategheh Najafpour, Bijan Farhanieh, Hossein Afshin

Abstract:

Energy transmission pipelines are among the most vital infrastructure of any country, and several strict laws have been enacted to enhance the safety of these lines and their vicinity. One of these laws concerns the safety distance around high-pressure gas pipelines. The safety distance is the minimum distance from the pipeline beyond which people and equipment are not exposed to serious damage. In the present study, safety distances around high-pressure gas transmission pipelines were determined using numerical methods. For this purpose, gas leakage from a cracked pipeline and the resulting jet fire were simulated as a continuously ignited, three-dimensional, unsteady, turbulent case. The numerical simulations were based on the finite volume method, turbulence was modelled with the k-ω SST model, and the combustion of the natural gas and air mixture was modelled using the eddy dissipation method. The results show that, because of the high pressure difference between the pipeline and the environment, the flow chokes at the crack and the velocity of the escaping gas reaches the speed of sound. Analysis of the incident radiation results shows that the safety distances around a 42-inch high-pressure natural gas pipeline based on the 5 and 15 kW/m² criteria are 205 and 272 meters, respectively.
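
The CFD study itself cannot be reproduced in a few lines, but the final step, reading off the distance at which incident radiation drops below a criterion, can be illustrated with a simple point-source radiation model as below. The heat release, radiative fraction and atmospheric transmissivity values are assumptions for illustration only, not the paper's simulation results, and the resulting distances will differ from those reported above.

```python
import numpy as np

def incident_radiation(distance_m, q_total_w, radiative_fraction=0.2, tau=0.8):
    """Point-source estimate of incident radiation (W/m^2) at a given distance.

    A textbook approximation used only to show how a safety distance follows
    from a radiation criterion; the paper obtains the radiation field from a
    full jet-fire CFD simulation instead.
    """
    return tau * radiative_fraction * q_total_w / (4.0 * np.pi * distance_m ** 2)

def safety_distance(criterion_w_m2, q_total_w, **kw):
    d = np.linspace(1.0, 1000.0, 100000)
    return d[np.argmax(incident_radiation(d, q_total_w, **kw) <= criterion_w_m2)]

q = 5.0e10  # assumed total heat release of a full-bore rupture jet fire, W
for crit in (15e3, 5e3):  # the 15 and 5 kW/m^2 criteria mentioned above
    print(f"{crit/1e3:.0f} kW/m2 -> {safety_distance(crit, q):.0f} m")
```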

Keywords: Gas pipelines, incident radiation, numerical simulation, safety distance.

263 Potential Climate Change Impacts on the Hydrological System of the Harvey River Catchment

Authors: Hashim Isam Jameel Al-Safi, P. Ranjan Sarukkalige

Abstract:

Climate change is likely to affect the Australian continent by changing rainfall trends, increasing temperature, and affecting the availability and quality of water. This study investigates the possible impacts of future climate change on the hydrological system of the Harvey River catchment in Western Australia using a conceptual modelling approach (the HBV model). Daily observations of rainfall and temperature and the long-term monthly mean potential evapotranspiration from six weather stations were available for the period 1961-2015. The observed streamflow at the Clifton Park gauging station for 33 years (1983-2015), together with the observed climate variables, was used to run, calibrate and validate the HBV model prior to the simulation process. The calibrated model was then forced with downscaled future climate signals from a multi-model ensemble of fifteen GCMs of the CMIP3 project under three emission scenarios (A2, A1B and B1) to simulate future runoff at the catchment outlet. Two periods were selected to represent future climate conditions: the middle (2046-2065) and the end (2080-2099) of the 21st century. A control run over the reference climate period (1981-2000) was used to represent the current climate. The modelling outcomes show an evident reduction in the mean annual streamflow during the middle of this century, particularly for the A1B scenario, relative to the control run. Toward the end of the century, all scenarios show relatively large reductions in the mean annual streamflow, especially the A1B scenario, compared with the control run. The decline in the mean annual streamflow ranges between 4-15% during the middle of the century and 9-42% by its end.

Keywords: Climate change impact, Harvey catchment, HBV model, hydrological modelling, GCMs, LARS-WG, Australia.

262 Fuzzy Mathematical Morphology approach in Image Processing

Authors: Yee Yee Htun, Dr. Khaing Khaing Aye

Abstract:

Morphological operators transform an original image into another image through interaction with a second image of a certain shape and size, known as the structuring element. Mathematical morphology provides a systematic approach to analysing the geometric characteristics of signals or images and has been applied widely to many applications such as edge detection, object segmentation and noise suppression. Fuzzy mathematical morphology aims to extend the binary morphological operators to grey-level images. In order to define the basic morphological operations of fuzzy erosion, dilation, opening and closing, a general method based upon fuzzy implication and inclusion grade operators is introduced. The fuzzy morphological operations extend the ordinary morphological operations by using fuzzy sets, where the union operation is replaced by a maximum operation and the intersection operation is replaced by a minimum operation. This work consists of two parts. In the first, fuzzy set theory, fuzzy mathematical morphology (which is based on fuzzy logic and fuzzy set theory), and fuzzy morphological operations and their properties are studied in detail. In the second, the application of fuzziness in mathematical morphology to practical work such as image processing is discussed with illustrative problems.
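
A minimal sketch of the max-min operations described above (union replaced by maximum, intersection by minimum) is given below for a grey-level image normalised to [0, 1]; the structuring-element values and the choice of the standard max-min dilation with its Kleene-Dienes dual erosion are illustrative assumptions, not necessarily the exact operators studied in the paper.

```python
import numpy as np

def fuzzy_dilation(img, se):
    """Fuzzy dilation: maximum over the SE support of the minimum of memberships."""
    h, w = se.shape
    ph, pw = h // 2, w // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + h, j:j + w]
            out[i, j] = np.max(np.minimum(window, se))   # intersection -> min, union -> max
    return out

def fuzzy_erosion(img, se):
    """Dual of the dilation above, using the Kleene-Dienes implication max(1 - se, x)."""
    h, w = se.shape
    ph, pw = h // 2, w // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + h, j:j + w]
            out[i, j] = np.min(np.maximum(window, 1.0 - se))
    return out

img = np.random.default_rng(0).random((32, 32))      # grey levels as memberships in [0, 1]
se = np.full((3, 3), 0.8)                            # fuzzy structuring element
opened = fuzzy_dilation(fuzzy_erosion(img, se), se)  # fuzzy opening = erosion then dilation
```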

Keywords: Binary Morphological, Fuzzy sets, Grayscale morphology, Image processing, Mathematical morphology.

261 Statistical Distributions of the Lapped Transform Coefficients for Images

Authors: Vijay Kumar Nath, Deepika Hazarika, Anil Mahanta

Abstract:

Discrete Cosine Transform (DCT) based transform coding is very popular in image, video and speech compression due to its good energy compaction and decorrelating properties. However, at low bit rates, the reconstructed images generally suffer from visually annoying blocking artifacts as a result of coarse quantization. The lapped transform was proposed as an alternative to the DCT with reduced blocking artifacts and increased coding gain. Lapped transforms are popular for their good performance, robustness against oversmoothing and the availability of fast implementation algorithms. However, no proper study has been reported in the literature regarding the statistical distributions of block Lapped Orthogonal Transform (LOT) and Lapped Biorthogonal Transform (LBT) coefficients. This study performs two goodness-of-fit tests, the Kolmogorov-Smirnov (KS) test and the χ²-test, to determine the distribution that best fits the LOT and LBT coefficients. The experimental results show that the distribution of a majority of the significant AC coefficients can be modeled by the Generalized Gaussian distribution. Knowledge of the statistical distribution of transform coefficients greatly helps in the design of optimal quantizers that may lead to minimum distortion and hence optimal coding efficiency.
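
The goodness-of-fit step can be reproduced in outline as below: fit a Generalized Gaussian to a set of AC coefficients and check the fit with the KS test. The synthetic coefficients are placeholders for real LOT/LBT subband data, and scipy's `gennorm` family is used here as the Generalized Gaussian model.

```python
import numpy as np
from scipy import stats

# Placeholder AC coefficients; in practice these come from one LOT/LBT subband.
rng = np.random.default_rng(42)
coeffs = rng.laplace(0.0, 4.0, 5000)   # heavy-tailed, roughly like transform AC terms

# Fit a Generalized Gaussian (scipy's gennorm): beta is the shape parameter,
# beta = 2 recovers the Gaussian and beta = 1 the Laplacian.
beta, loc, scale = stats.gennorm.fit(coeffs)

# Kolmogorov-Smirnov goodness-of-fit test against the fitted distribution.
ks_stat, p_value = stats.kstest(coeffs, "gennorm", args=(beta, loc, scale))
print(f"fitted beta = {beta:.2f}, KS statistic = {ks_stat:.4f}, p = {p_value:.3f}")
```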

Keywords: Lapped orthogonal transform, Lapped biorthogonal transform, Image compression, KS test.

260 Optimizing Materials Cost and Mechanical Properties of PVC Electrical Cable's Insulation by Using Mixture Experimental Design Approach

Authors: Safwan Altarazi, Raghad Hemeimat, Mousa Wakileh, Ra'ad Qsous, Aya Khreisat

Abstract:

With the spread of polyvinyl chloride (PVC) products into many applications, the challenges of investigating the raw material composition and of reducing cost have both become more and more important. Considerable research has been done on the effect of additives on PVC products, but most of it investigates the effect of only one or a few factors at a time; this isolated consideration of the input factors does not take into account the interactions between the different factors. This paper implements a mixture experimental design approach to find a cost-effective PVC composition for the production of electrical insulation cables in accordance with ASTM Designation D 6096. The analysis of the results shows that a minimum cost can be achieved with 20% virgin PVC, 18.75% recycled PVC, 43.75% CaCO3 with a particle size of 10 microns, 14% DOP plasticizer, and 3.5% CPW plasticizer. For maximum UTS, the compound should consist of 17.5% DOP, 62.5% virgin PVC, and 20.0% CaCO3 of particle size 5 microns. Finally, for the highest ductility, the compound should be made of 35% virgin PVC, 20% CaCO3 of particle size 5 microns, and 45.0% DOP plasticizer.

Keywords: ASTM 6096, mixture experimental-design approach, PVC electrical cable insulation, recycled PVC.

259 Incessant Collapse of Buildings in Nigeria: The Possible Role of the Use of Inappropriate Cement Grade/Strength Class

Authors: Kazeem K. Adewole, Joy-Felicia O. Oladejo, Wasiu O. Ajagbe

Abstract:

The use of low-quality concrete has been identified as one of the main causes of the incessant collapse of buildings in Nigeria. Emphasis has been placed on the use of poor-quality aggregates, poor workmanship and lean concrete mixes with a low cement content as the reasons for the low quality of concrete used for building construction in Nigeria. Surveys revealed that in the construction of most privately owned buildings, where concrete trial mixes and compressive strength quality assurance tests are not conducted, concrete is produced using the 1:2:4 mix ratio irrespective of the cement grade/strength class. In this paper, the possible role of the use of an inappropriate cement grade/strength class as a cause of the incessant collapse of buildings in Nigeria is investigated. The investigation revealed that the compressive strengths of concrete cubes produced with Portland-limestone cement grade 32.5 using 1:2:4 and 1:1.5:3 mix ratios are less than the 25 MPa and 30 MPa cube strengths generally recommended for building superstructures and foundations, respectively. Conversely, the compressive strengths of concrete cubes produced with Portland-limestone cement grade 42.5 using 1:2:4 and 1:1.5:3 mix ratios exceed the 25 MPa and 30 MPa generally recommended for building superstructures and foundations, respectively. Thus, it can be concluded that the use of an inappropriate cement grade (Portland-limestone cement grade 32.5), particularly for the construction of building foundations, is a potential cause of the incessant collapse of buildings in Nigeria. It is recommended that the Standards Organisation of Nigeria should create awareness among Nigerians, particularly home owners and roadside craftsmen, that Portland-limestone cement grade 32.5 should not be used for the construction of load-carrying members, particularly building foundations, in order to reduce the incidence of building collapse.

Keywords: Cement grades, Concrete strength class, Collapsed building, Concrete mix ratio, Portland-limestone cement.

258 Effective Defect Prevention Approach in Software Process for Achieving Better Quality Levels

Authors: Suma V., T. R. Gopalakrishnan Nair

Abstract:

Defect prevention is the most vital but habitually neglected facet of software quality assurance in any project. If practised at all stages of software development, it can reduce the time, overheads and resources required to engineer a high-quality product. The key challenge for an IT industry is to engineer a software product with minimum post-deployment defects. This work is an analysis based on data obtained for five selected projects from leading software companies of varying software production competence. The main aim of this paper is to provide information on the various methods and practices supporting defect detection and prevention that lead to successful software generation. The defect prevention techniques unearth 99% of defects. Inspection is found to be an essential technique for generating near-ideal software through enhanced methodologies of aided and unaided inspection schedules. On average, 13%-15% of the whole project effort spent on inspection and 25%-30% spent on testing are required to eliminate 99%-99.75% of defects. A comparison of the end results for the five selected projects across the companies is also presented, throwing light on how a company can position itself with an appropriate complementary ratio of inspection to testing.

Keywords: Defect Detection and Prevention, Inspections, Software Engineering, Software Process, Testing.

257 Impulse Response Shortening for Discrete Multitone Transceivers using Convex Optimization Approach

Authors: Ejaz Khan, Conor Heneghan

Abstract:

In this paper, we propose a new criterion for solving the problem of channel shortening in multi-carrier systems. In a discrete multitone receiver, a time-domain equalizer (TEQ) reduces intersymbol interference (ISI) by shortening the effective duration of the channel impulse response. The minimum mean square error (MMSE) method for TEQ design does not give satisfactory results. In [1], a new criterion was introduced for partially equalizing severe ISI channels to reduce the cyclic prefix overhead of the discrete multitone transceiver (DMT), assuming a fixed transmission bandwidth. Due to a specific constraint in that method (a unit-norm constraint on the target impulse response, TIR), the freedom to choose the optimum TIR vector is reduced; better results can be obtained by avoiding the unit-norm constraint. In this paper, we change the cost function proposed in [1] to that of maximizing a determinant subject to a linear matrix inequality (LMI) and a quadratic constraint, and solve the resulting optimization problem. The usefulness of the proposed method is shown with the help of simulations.
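
Maximizing a determinant under an LMI and a quadratic constraint is a standard convex (max-det) problem; a generic sketch with placeholder constraint data is shown below. It is not the paper's exact TEQ formulation, only an illustration of how such a problem is posed and solved with a modelling tool like cvxpy.

```python
import cvxpy as cp
import numpy as np

n = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)          # placeholder positive definite data matrix

X = cp.Variable((n, n), symmetric=True)
constraints = [
    X >> 0,                          # linear matrix inequality: X is positive semidefinite
    cp.trace(A @ X) <= 10.0,         # placeholder linear constraint
    cp.sum_squares(X) <= 25.0,       # quadratic constraint on the variable
]

# Maximizing the determinant is handled through its concave surrogate log_det.
problem = cp.Problem(cp.Maximize(cp.log_det(X)), constraints)
problem.solve(solver=cp.SCS)
print("optimal log-determinant:", problem.value)
print("optimal X:\n", X.value)
```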

Keywords: Equalizer, target impulse response, convex optimization, matrix inequality.

256 Passive Flow Control in Twin Air-Intakes

Authors: Akshoy R. Paul, Pritanshu Ranjan, Ravi R. Upadhyay, Anuj Jain

Abstract:

Aircraft propulsion systems often use Y-shaped subsonic diffusing ducts as twin air-intakes to supply ambient air to the engine compressor for thrust generation. Due to space constraints, the diffusers need to be curved, which causes severe flow non-uniformity at the engine face. The present study attempts to control the flow in a mildly curved Y-duct diffuser using trapezoidal-shaped vortex generators (VGs) attached either on both sidewalls or on the top and bottom walls of the diffuser at the inflexion plane. A commercial computational fluid dynamics (CFD) code is modified and used to simulate the effects of the VGs on the flow in the Y-duct diffuser. A few experiments are conducted for CFD code validation, while the rest of the cases are studied computationally. The best configuration is found with VG-2 arranged in a co-rotating sequence and attached to both sidewalls, which ensures the highest static pressure recovery, the lowest total pressure loss, minimum flow distortion and less flow separation in the Y-duct diffuser. Decreasing the VG height when attached to the top and bottom walls further improves the axial flow uniformity at the diffuser outlet by a great margin compared with the bare duct.

Keywords: Twin air-intake, Vortex generator (VG), Turbulence model, Pressure recovery, Distortion coefficient

255 Economic Evaluation of Bowland Shale Gas Wells Development in the UK

Authors: Elijah Acquah-Andoh

Abstract:

The UK has had its fair share of the shale gas revolutionary wave currently blowing across the global oil and gas industry. Although its exploitation is widely agreed to have been delayed, shale gas was looked upon favorably by the UK Parliament, which recognized it as a genuine energy source and granted licenses to industry to search for and extract the resource. Although this represents significant progress, there remains another test the UK fracking resource must pass for shale gas extraction to be feasible: it must be economically, and sustainably, extractable. Developing unconventional resources is much more expensive and risky, and for shale gas wells, producing in commercial volumes is conditional on drilling horizontal wells and hydraulic fracturing, techniques that increase CAPEX. Meanwhile, investment in shale gas development projects is sensitive to the gas price and to technical and geological risks. Using a Two-Factor Model, the economics of the Bowland shale wells were analyzed and the operational conditions under which fracking is profitable in the UK were characterized. We find that there is a great degree of flexibility in Opex spending; hence Opex does not pose much of a threat to the fracking industry in the UK. However, we find that Bowland shale gas wells fail to add value at a gas price of $8/MMBtu. A minimum gas price of $12/MMBtu, an Opex of no more than $2/Mcf and a Capex of no more than $14.95M are required to create value under the present petroleum tax regime in the UK fracking industry.
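
The sensitivity of well value to gas price, Opex and Capex can be illustrated with a simple deterministic discounted-cash-flow sketch like the one below. This is not the Two-Factor Model used in the paper; the production profile, decline rate, discount rate and tax rate are illustrative assumptions only.

```python
def well_npv(gas_price_usd_mmbtu, opex_usd_mcf, capex_usd, *,
             first_year_prod_mcf=1_200_000, decline=0.35, years=20,
             discount=0.10, tax_rate=0.40):
    """Deterministic NPV of a single shale gas well (illustrative assumptions only)."""
    npv = -capex_usd
    prod = first_year_prod_mcf
    for t in range(1, years + 1):
        # ~1.037 MMBtu per Mcf of natural gas (approximate conversion)
        revenue = prod * 1.037 * gas_price_usd_mmbtu
        cash_flow = (revenue - prod * opex_usd_mcf) * (1 - tax_rate)
        npv += cash_flow / (1 + discount) ** t
        prod *= (1 - decline)              # exponential decline, simplified
    return npv

# Sensitivity to gas price at $2/Mcf Opex and $14.95M Capex, as quoted above.
for price in (8, 10, 12):
    print(f"gas at ${price}/MMBtu -> NPV ${well_npv(price, 2.0, 14.95e6)/1e6:,.1f}M")
```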

Keywords: Capex, economical, investment, profitability, shale gas development, sustainable.

254 Apoptosis Pathway Targeted by Thymoquinone in MCF7 Breast Cancer Cell Line

Authors: M. Marjaneh, M. Y. Narazah, H. Shahrul

Abstract:

Array-based gene expression analysis is a powerful tool for profiling the expression of genes and for generating information on the therapeutic effects of new anti-cancer compounds. The apoptotic effect of thymoquinone was studied in the MCF7 breast cancer cell line using gene expression profiling with a cDNA microarray. The purity and yield of the RNA samples were determined using the RNeasy Plus Mini kit, and the Agilent RNA 6000 NanoLabChip kit evaluated the quantity of the RNA samples. AffinityScript RT oligo-dT promoter primer was used to generate cDNA strands, and T7 RNA polymerase was used to convert cDNA to cRNA. The cRNA samples and human universal reference RNA were labelled with Cy-3-CTP and Cy-5-CTP, respectively. Feature Extraction and GeneSpring software were used to analyse the data. The single-experiment analysis revealed the involvement of 64 pathways with up-regulated genes and 78 pathways with down-regulated genes. The MAPK and p38-MAPK pathways were inhibited due to the up-regulation of the PTPRR gene. The inhibition of p38-MAPK suggested up-regulation of the TGF-β pathway. Inhibition of p38-MAPK caused up-regulation of TP53 and down-regulation of Bcl2 genes, indicating involvement of the intrinsic apoptotic pathway. Down-regulation of the CARD16 gene, an adaptor molecule regulating CASP1, suggested necrosis-like programmed cell death and the involvement of caspases in apoptosis. Furthermore, down-regulation of the GPCR and EGF-EGFR signalling pathways suggested a reduction of ER. Involvement of the AhR pathway, which controls the cytochrome P450 and glucuronidation pathways, indicated metabolism of thymoquinone. The findings showed differential expression of several genes in apoptosis pathways with thymoquinone treatment in estrogen receptor-positive breast cancer cells.

Keywords: CARD16, CASP10, cDNA microarray, PTPRR, Thymoquinone.

253 Rheological Properties of Dough and Sensory Quality of Crackers with Dietary Fibers

Authors: Ljubica Dokić, Ivana Nikolić, Dragana Šoronja–Simović, Zita Šereš, Biljana Pajin, Nils Juul, Nikola Maravić

Abstract:

The possibility of applying dietary fibers in the production of crackers was investigated in this work, as well as their influence on the rheological and textural properties of the cracker dough and on the sensory properties of the resulting crackers. Three different dietary fibers - oat, potato and pea fibers - replaced 10% of the wheat flour. A long fermentation process and the baking test method were used for cracker production. Changes in the dough were followed by rheological determination of its viscoelastic properties and by textural measurements. The sensory quality of the crackers was described using quantitative descriptive analysis (QDA) by trained members of a descriptive panel, and an additional analysis of the cracker surface was performed with a videometer. Based on the rheological determinations, the viscoelastic properties of the dough were reduced by the addition of dietary fibers. Dough with 10% potato fiber could not be handled, so the recipe was modified by increasing the water content to 35%. The compliance of the fiber-containing doughs under constant stress decreased, owing to a more rigid and stiffer consistency compared with the control sample; their hardness increased and their extensibility decreased. The sensory properties of the final crackers were reduced compared with the control sample, with the dietary fibers mostly affecting hardness, structure and crispness. The crackers received low scores for flavor and taste because of the specific aroma of the fibers. The sample with 10% potato fiber and increased water content was the most adaptable to the applied stresses and to the production process; it was also close to the control sample without dietary fibers in both the sensory evaluation and the videometer results.

Keywords: Crackers, dietary fibers, rheology, sensory properties.

252 Non-Singular Gravitational Collapse of a Homogeneous Scalar Field in Deformed Phase Space

Authors: Amir Hadi Ziaie

Abstract:

In the present work, we revisit the collapse process of a spherically symmetric homogeneous scalar field (in an FRW background) minimally coupled to gravity, when phase-space deformations are taken into account. Such a deformation is introduced mathematically as a particular type of noncommutativity between the canonical momenta of the scale factor and of the scalar field. In the absence of such a deformation, the collapse culminates in a spacetime singularity. However, when the phase space is deformed, we find that the singularity is removed by a non-singular bounce, beyond which the collapsing cloud re-expands to infinity. More precisely, for negative values of the deformation parameter, we identify the appearance of a negative pressure, which decelerates the collapse and finally prevents the singularity from forming. While in the undeformed case the horizon curve monotonically decreases to finally cover the singularity, in the deformed case the horizon has a minimum value that depends on the deformation parameter and on the initial configuration of the collapse. Such a setting predicts a threshold mass for black hole formation in stellar collapse and manifests the role of noncommutative geometry in physics, especially in stellar collapse and supernova explosions.

Keywords: Gravitational collapse, non-commutative geometry, spacetime singularity, black hole physics.

251 A Speeded up Robust Scale-Invariant Feature Transform Currency Recognition Algorithm

Authors: Daliyah S. Aljutaili, Redna A. Almutlaq, Suha A. Alharbi, Dina M. Ibrahim

Abstract:

Currencies around the world look very different from each other; for instance, the size, color, and pattern of the paper differ. With the development of modern banking services, automatic methods for paper currency recognition have become important in many applications, such as vending machines. One of the phases of a currency recognition architecture is feature detection and description. Many algorithms are used for this phase, but they still have some disadvantages. This paper proposes a feature detection algorithm that merges the advantages of the current SIFT and SURF algorithms, which we call the Speeded-up Robust Scale-Invariant Feature Transform (SR-SIFT) algorithm. The proposed SR-SIFT algorithm overcomes the problems of both the SIFT and SURF algorithms: it aims to speed up SIFT feature detection while keeping it robust. Simulation results demonstrate that the proposed SR-SIFT algorithm decreases the average response time, especially for small and minimum numbers of best keypoints, and improves the distribution of the best keypoints over the surface of the currency note. Furthermore, the proposed algorithm increases the accuracy of the true best-point distribution inside the currency edge compared with the other two algorithms.
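
The SR-SIFT algorithm itself is not specified here, but the SIFT baseline it is compared against can be run with OpenCV as in the sketch below, which measures response time and keypoint count and then keeps the strongest responses; the image path is a placeholder and OpenCV 4.4 or later is assumed for `SIFT_create`.

```python
import time
import cv2

# Placeholder path to a scanned banknote image.
img = cv2.imread("banknote.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()            # baseline SIFT detector

start = time.perf_counter()
keypoints, descriptors = sift.detectAndCompute(img, None)
elapsed = time.perf_counter() - start

print(f"{len(keypoints)} keypoints detected in {elapsed * 1000:.1f} ms")
# Keeping only the strongest responses mimics the "best key points" comparison above.
best = sorted(keypoints, key=lambda kp: kp.response, reverse=True)[:50]
```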

Keywords: Currency recognition, feature detection and description, SIFT algorithm, SURF algorithm, speeded up and robust features.

250 Dynamic Routing to Multiple Destinations in IP Networks using Hybrid Genetic Algorithm (DRHGA)

Authors: K. Vijayalakshmi, S. Radhakrishnan

Abstract:

In this paper, we propose a novel dynamic least-cost multicast routing protocol using a hybrid genetic algorithm for IP networks. The protocol finds the multicast tree with minimum cost subject to delay, degree, and bandwidth constraints. The proposed protocol has the following features: (i) a heuristic local search function has been devised and embedded within the normal genetic operations to increase the speed and to obtain an optimized tree; (ii) it efficiently handles the dynamic situations that arise from changes in multicast group membership or node/link failure; (iii) two different crossover and mutation probabilities are used to maintain the diversity of solutions and achieve quick convergence. The simulation results show that the proposed protocol generates dynamic multicast trees with lower cost. The results also show that the proposed algorithm has a better convergence rate, a better dynamic request success rate and less execution time than other existing algorithms. The effects of the degree and delay constraints on the multicast tree have also been analyzed in terms of search success rate.

Keywords: Dynamic Group membership change, Hybrid Genetic Algorithm, Link / node failure, QoS Parameters.

249 Influence of Improved Roughage Quality and Period of Meal Termination on Digesta Load in the Digestive Organs of Goats

Authors: Rasheed A. Adebayo, Mehluli M. Moyo, Ignatius V. Nsahlai

Abstract:

Ruminants are known to relish roughage, but the effect of roughage quality on the digesta load in the rumen, omasum, abomasum and other distal organs of the digestive tract is still unknown. Reticulorumen fill is a strong indicator of the long-term control of intake in ruminants; as such, the measurement and prediction of digesta load in these compartments may be crucial to productivity in the ruminant industry. The current study aimed at determining the effect of (a) diet quality on the digesta load in the digestive organs of goats, and (b) the period of meal termination on the reticulorumen fill and the digesta load in other distal compartments of the digestive tract of goats. Goats were fed urea-treated hay (UTH), urea-sprayed hay (USH) and non-treated hay (NTH). At the end of an eight-week feeding trial, upon termination of a meal in the morning, afternoon or evening, all goats were slaughtered in random groups of three per day to measure the reticulorumen fill and the digesta loads in the other distal compartments of the digestive tract. Both diet quality and period affected (P < 0.05) the measure of reticulorumen fill. The reticulorumen fill in the evening was larger (P < 0.05) than in the afternoon, while the afternoon was similar (P > 0.05) to the morning. Diet quality also affected (P < 0.05) the wet omasal, wet abomasum, dry abomasum and dry caecum digesta loads, but did not affect (P > 0.05) the wet and dry digesta loads in the other compartments of the digestive tract. The period of measurement did not affect (P > 0.05) the wet omasal digesta load, or the wet and dry digesta loads in the other compartments, except for the wet abomasum digesta load (P < 0.05) and the dry caecum digesta load (P < 0.05). Both wet and dry reticulorumen fill were correlated (P < 0.05) with the omasum digesta load (r = 0.623 and r = 0.723, respectively). In conclusion, the reticulorumen fill of goats decreased as roughage quality improved, and the period of meal termination and measurement is a key factor in the quantity of digesta load.

Keywords: Digesta, goats, meal termination, reticulorumen fill.

248 Enhanced Multi-Intensity Analysis in Multi-Scenery Classification-Based Macro and Micro Elements

Authors: R. Bremananth

Abstract:

Several computationally challenging issues are encountered when classifying complex natural scenes. In this paper, we address the problems encountered in rotation-invariant, multi-intensity analysis for overlapping multi-scene classification. In the literature, various algorithms have been proposed for multi-intensity analysis, but they suffer from several restrictions when deployed for classifying overlapping scenes. In order to resolve this problem, we present a framework based on macro and micro basis functions. The algorithm achieves a minimum classification false-alarm rate when classifying overlapping multi-scene images. Furthermore, a quadrangle multi-intensity decay is invoked. Several parameters, such as rotation, classification, correlation, contrast, homogeneity, and energy, are utilized to analyze invariance for multi-scene classification. Benchmark datasets of complex natural scenes were collected to test the framework. The results show that the framework achieves a significant improvement in gray-level co-occurrence matrix features for overlapping scenes at diverse degrees of orientation.
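
The co-occurrence features named above (contrast, homogeneity, energy, correlation) can be computed per orientation with scikit-image as in the sketch below; the random patch, distances and angles are illustrative, and the function names assume scikit-image 0.19 or later.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

# Placeholder 8-bit patch; in practice this is one region of a natural scene.
patch = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)

# Co-occurrence matrices for several orientations, to study rotation behaviour.
angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
glcm = graycomatrix(patch, distances=[1], angles=angles, levels=256,
                    symmetric=True, normed=True)

for prop in ("contrast", "homogeneity", "energy", "correlation"):
    print(prop, np.round(graycoprops(glcm, prop).ravel(), 3))   # one value per angle
```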

Keywords: Automatic classification, contrast, homogeneity, invariant analysis, multi-scene analysis, overlapping.

247 Bipolar Square Wave Pulses for Liquid Food Sterilization using Cascaded H-Bridge Multilevel Inverter

Authors: Hanifah Jambari, Naziha A. Azli, M. Afendi M. Piah

Abstract:

This paper presents the generation of bipolar square wave pulses with characteristics suitable for liquid food sterilization using a Cascaded H-bridge Multilevel Inverter (CHMI). Bipolar square wave pulses have been reported to remain stable for a longer time during the sterilization process, with minimum heat emission and increased efficiency. The CHMI allows the system to produce bipolar square wave pulses and yields a high output voltage without using a transformer, while fulfilling the pulse requirements for effective liquid food sterilization. This in turn can reduce the power consumption and cost of the overall liquid food sterilization system. The simulation results show that pulses with a peak output voltage of 2.4 kV and a pulse width of between 1 µs and 1 ms at frequencies of 50 Hz and 100 Hz can be generated by a 7-level CHMI. Results from the experimental set-up based on a 5-level CHMI indicate the potential of the proposed circuit to produce bipolar square wave output pulses with peak values that depend on the DC source level supplied to the CHMI modules and pulse widths of between 12.5 µs and 1 ms at frequencies of 50 Hz and 100 Hz.
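
For reference, the target bipolar square pulse train can be described numerically as in the short sketch below; the sampling rate and the positive-then-negative pulse pattern are illustrative assumptions, and the waveform is ideal (no switching transients or inverter levels).

```python
import numpy as np

def bipolar_pulse_train(peak_v, pulse_width_s, frequency_hz, duration_s, fs=10e6):
    """Ideal bipolar square pulses: +peak for one pulse width, -peak for the next,
    repeated every period, zero otherwise."""
    t = np.arange(0.0, duration_s, 1.0 / fs)
    phase = t % (1.0 / frequency_hz)
    v = np.zeros_like(t)
    v[phase < pulse_width_s] = peak_v                                    # positive pulse
    v[(phase >= pulse_width_s) & (phase < 2 * pulse_width_s)] = -peak_v  # negative pulse
    return t, v

# e.g. 2.4 kV peak, 1 ms pulse width, 50 Hz repetition, matching the values quoted above
t, v = bipolar_pulse_train(2.4e3, 1e-3, 50.0, 0.1)
print(v.max(), v.min())
```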

Keywords: pulsed electric field, multilevel inverter, bipolar square wave, food sterilization

246 Power Flow Tracing Based Reactive Power Ancillary Service (AS) in Restructured Power Market

Authors: M. Susithra, R. Gnanadass

Abstract:

Ancillary services are support services that are essential for improving and enhancing the reliability and security of the electric power system. Reactive power ancillary service is one of the important ancillary services in a restructured electricity market; providing it requires determining the cost of supplying the service and finding how this cost changes with operating decisions. This paper presents a new formulation that can be used to minimize the Independent System Operator's (ISO's) total payment for the reactive power ancillary service. A modified power flow tracing algorithm estimates the reserve reactive power available for the ancillary service. In order to find the optimum reactive power dispatch, a Biogeography-Based Optimization (BPO) method is proposed. The Market Reactive Clearing Price (MRCP) is then estimated, which encourages generator companies (GENCOs) to participate in the ancillary service. Finally, the optimal weighting factor and the real-time utilization factor of reactive power give the minimum total payment for the ISO. The effectiveness of the proposed design is verified using the IEEE 30-bus system.

Keywords: Biogeography based optimization method, Power flow tracing method, Reactive generation capability curve, Reactive power ancillary service.

245 Activity Recognition by Smartphone Accelerometer Data Using Ensemble Learning Methods

Authors: Eu Tteum Ha, Kwang Ryel Ryu

Abstract:

As smartphones are equipped with various sensors, there have been many studies focused on using these sensors to create valuable applications. Human activity recognition is one such application, motivated by various welfare applications such as support for the elderly, measurement of calorie consumption, and analyses of lifestyle and exercise patterns. One of the challenges faced when using smartphone sensors for activity recognition is that the number of sensors should be minimized to save battery power. In this paper, we show that a fairly accurate classifier distinguishing ten different activities can be built using data from only a single sensor, the smartphone accelerometer. The approach we adopt to deal with this twelve-class problem uses various methods. The features used for classifying the activities include not only the magnitude of the acceleration vector at each time point, but also the maximum, the minimum, and the standard deviation of the vector magnitude within a time window. The experiments compared the performance of four kinds of basic multi-class classifiers and of four kinds of ensemble learning methods based on three kinds of basic multi-class classifiers. The results show that the method with the highest accuracy is ECOC based on random forest.
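
The window features listed above can be computed from raw three-axis accelerometer samples roughly as in the sketch below; the window length, the non-overlapping windowing and the inclusion of the per-window mean as a summary of the point-wise magnitudes are assumptions, not necessarily the paper's exact feature set.

```python
import numpy as np

def window_features(acc_xyz, window_len=128):
    """Per-window features from an (N, 3) accelerometer array:
    mean, max, min and standard deviation of the acceleration vector magnitude."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    n_windows = len(magnitude) // window_len
    windows = magnitude[: n_windows * window_len].reshape(n_windows, window_len)
    return np.column_stack([
        windows.mean(axis=1),
        windows.max(axis=1),
        windows.min(axis=1),
        windows.std(axis=1),
    ])

# Dummy signal standing in for ~50 Hz smartphone accelerometer data.
acc = np.random.default_rng(0).normal(0.0, 1.0, (5000, 3))
X = window_features(acc)        # feed these rows to the multi-class classifiers
print(X.shape)                  # -> (39, 4)
```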

Keywords: Ensemble learning, activity recognition, smartphone accelerometer.

244 Effect of Environmental Parameters on the Water Solubility of the Polycyclic Aromatic Hydrocarbons and Derivatives Using Taguchi Experimental Design Methodology

Authors: P. Pimsee, C. Sablayrolles, P. de Caro, J. Guyomarch, N. Lesage, M. Montréjaud-Vignoles

Abstract:

The MIGR’HYCAR research project was initiated to provide decision-support tools for risks connected with oil spill drift in continental waters. These tools are intended to serve in the decision-making process once oil spill pollution occurs and/or as reference tools to study scenarios of the potential impacts of pollution on a given site. This paper focuses on the distribution of polycyclic aromatic hydrocarbons (PAHs) and derivatives from oil spills in water as a function of environmental parameters. Eight petroleum oils covering a representative range of commercially available products were tested. 41 PAHs and derivatives, among them 16 EPA priority pollutants, were studied in dynamic tests at laboratory scale. The chemical profile of the water-soluble fraction differed from the parent oil profile because of the varying water solubility of the oil components. Semi-volatile compounds (naphthalenes) constitute the major part of the water-soluble fraction, and a large variation in its composition was highlighted depending on the oil type. Moreover, four environmental parameters (temperature, suspended solid quantity, salinity and oil:water surface ratio) were investigated with the Taguchi experimental design methodology. The results showed that the oils fall into three groups: the solubility of the domestic fuel and Jet A1 showed a high sensitivity to the parameters studied, meaning they must be taken into account; gasoline (SP95-E10) and diesel fuel showed a medium sensitivity; and the four other oils showed low sensitivity to the parameters studied. Finally, three parameters were found to be significant with respect to the water-soluble fraction.

Keywords: Monitoring, PAHs, SBSE, water soluble fraction, Taguchi experimental design.

243 Comparative Study of Evolutionary Model and Clustering Methods in Circuit Partitioning Pertaining to VLSI Design

Authors: K. A. Sumitra Devi, N. P. Banashree, Annamma Abraham

Abstract:

Partitioning is a critical area of VLSI CAD. In order to build complex digital logic circuits, it is often essential to sub-divide a multi-million transistor design into manageable pieces. This paper looks at various partitioning techniques in VLSI CAD targeted at different applications. We propose an evolutionary time-series model and a statistical glitch prediction system using a neural network with global feature selection based on clustering methods for partitioning a circuit. For the evolutionary time-series model, we make use of genetic, memetic and neuro-memetic techniques, while our clustering work focuses on the K-means and EM methods. A comparative study is provided for all techniques in solving the circuit partitioning problem in VLSI design. The performance of all approaches is compared using benchmark data provided by the MCNC standard cell placement benchmark netlists. Analysis of the experimental results shows that the neuro-memetic model performs better than the other models in recognizing sub-circuits with a minimum number of interconnections between them.

Keywords: VLSI, circuit partitioning, memetic algorithm, genetic algorithm.

242 Cyber Security Situational Awareness among Students: A Case Study in Malaysia

Authors: Yunos Zahri, Ab Hamid R. Susanty, Ahmad Mustaffa

Abstract:

This paper explores the need for a national baseline study to understand the level of cyber security situational awareness among primary and secondary school students in Malaysia. An online survey method was deployed to administer the data collection exercise. The target groups were divided into three categories: Group 1 (primary school, aged 7-9 years old), Group 2 (primary school, aged 10-12 years old), and Group 3 (secondary school, aged 13-17 years old). A different questionnaire set was designed for each group. The survey topics/areas included Internet and digital citizenship knowledge. Respondents were randomly selected from rural and urban areas throughout all 14 states in Malaysia. A total of 9,158 respondents participated in the survey, with most states meeting the minimum sample size requirement to represent the country’s demographics. The findings and recommendations from this baseline study are fundamental to developing the teaching modules required for children to understand the security risks and threats associated with the Internet throughout their years in school. Early exposure and education will help ensure healthy cyber habits among millennials in Malaysia.

Keywords: Cyber security awareness, cyber security education, cyber security, students.
