Search results for: wireless performance analysis
30042 Optimizing the Performance of Thermoelectric for Cooling Computer Chips Using Different Types of Electrical Pulses
Authors: Saleh Alshehri
Abstract:
Thermoelectric technology is currently used in many industrial applications for cooling, heating and generating electricity. This research focuses on using thermoelectric modules to cool high-speed computer chips under different operating conditions. A previously developed and validated three-dimensional model for optimizing and assessing the performance of cascaded and non-cascaded thermoelectric modules is used in this study to investigate the possibility of decreasing the hotspot temperature of a computer chip. Additionally, a test assembly is built and tested under steady-state and transient conditions. The optimum thermoelectric current obtained at steady state is used to conduct a number of pulsed (i.e., transient) tests with different pulse shapes to cool the computer chip hotspots. The steady-state tests showed that at a hotspot heat rate of 15.58 W (5.97 W/cm2), applying a thermoelectric current of 4.5 A decreased the hotspot temperature at open-circuit condition (89.3 °C) by 50.1 °C. The maximum and minimum hotspot temperatures were affected by the ON and OFF durations of the electrical current pulse: a longer OFF period produced the maximum hotspot temperature, while a longer ON period produced the minimum.
Keywords: thermoelectric generator, TEG, thermoelectric cooler, TEC, chip hotspots, electronic cooling
Procedia PDF Downloads 147
30041 A Development of Portable Intrinsically Safe Explosion-Proof Type of Dual Gas Detector
Authors: Sangguk Ahn, Youngyu Kim, Jaheon Gu, Gyoutae Park
Abstract:
In this paper, we developed a dual gas leak detector for Hydrocarbon (HC) and Carbon Monoxide (CO) gases. Accommodating two kinds of gas sensors requires a compact structural design, careful layout of the sensing circuits (measuring, amplifying and filtering), and robust, systematic and modular firmware. Among these concerns, measuring accuracy and initial response time are of vital importance. To manufacture a distinguished gas leak detector, we applied an intrinsically safe explosion-proof structure to the lithium-ion battery, main circuits, motor-driven pump, color LCD interface and sensing circuits. On the software side, we improved measuring accuracy using numerical methods such as Lagrange and Neville interpolation. Performance tests were conducted using standard methane at seven different concentrations, compared against three other products. Through distribution to the field of gas safety, we aim to raise risk prevention and the efficiency of gas safety management. Acknowledgment: This study was supported by the Small and Medium Business Administration under the research theme ‘Commercialized Development of a portable intrinsically safe explosion-proof type dual gas leak detector’ (task number S2456036).
Keywords: gas leak, dual gas detector, intrinsically safe, explosion proof
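The Neville interpolation mentioned in the abstract can be sketched as follows. This is a generic implementation of Neville's algorithm, not the authors' calibration code; the sample points in the test are illustrative only.

```python
def neville(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs[i], ys[i]) at x
    using Neville's recursive scheme of nested linear interpolations."""
    n = len(xs)
    p = list(ys)  # p[i] starts as the degree-0 interpolant at xs[i]
    for level in range(1, n):
        for i in range(n - level):
            # Combine two overlapping lower-degree interpolants.
            p[i] = ((x - xs[i + level]) * p[i] + (xs[i] - x) * p[i + 1]) \
                   / (xs[i] - xs[i + level])
    return p[0]
```

In a gas detector, a routine like this could map raw sensor readings to concentration values between calibration points.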
Procedia PDF Downloads 231
30040 Coupling Random Demand and Route Selection in the Transportation Network Design Problem
Authors: Shabnam Najafi, Metin Turkay
Abstract:
The network design problem (NDP) is used to determine the set of optimal values for certain pre-specified decision variables, such as capacity expansion of nodes and links, by optimizing various system performance measures including safety, congestion, and accessibility. The designed transportation network should improve the objective functions defined for the system while accounting for the route choice behaviors of network users. NDP studies have mostly investigated random demand and route selection constraints separately due to computational challenges. In this work, we consider both random demand and route selection constraints simultaneously. This work presents a nonlinear stochastic model for the land use and road network design problem to address the development of different functional zones in urban areas by considering both a cost function and air pollution. The model minimizes cost and air pollution simultaneously, with random demand and a stochastic route selection constraint, aiming to optimize network performance via road capacity expansion. The Bureau of Public Roads (BPR) link impedance function is used to determine the travel time on each link. We consider a city with origin and destination nodes, each of which can be residential, employment, or both, and a set of existing paths between each origin-destination (O-D) pair. The case of an increasing employed population is analyzed to determine road capacity and origin zones simultaneously. The analysis minimizes the travel and expansion cost of routes and origin zones on one hand and CO emission on the other, at the same time. In this work, demand between O-D pairs is random, and the network flow pattern is subject to stochastic user equilibrium, specifically a logit route choice model. Treating both demand and route choice as random makes the model more applicable to the design of urban network programs.
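The Bureau of Public Roads (BPR) link impedance function mentioned above takes the standard form t = t0[1 + α(v/c)^β]. A minimal sketch using the commonly cited default coefficients α = 0.15 and β = 4 (the paper does not state its coefficients, so these are assumptions):

```python
def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4.0):
    """Bureau of Public Roads link travel time: free-flow time t0 scaled
    up as link volume approaches capacity."""
    return t0 * (1.0 + alpha * (volume / capacity) ** beta)
```

At zero flow the link takes its free-flow time; at volume equal to capacity, travel time grows by the factor (1 + α).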
The epsilon-constraint method is one way to solve both linear and nonlinear multi-objective problems, and it is the method used in this work. The problem is solved by keeping the first objective (the cost function) as the objective function and treating the second objective as a constraint that must be less than an epsilon, where epsilon is an upper bound on the emission function. The value of epsilon is varied from the worst to the best value of the emission function to generate the family of solutions representing the Pareto set. A numerical example with two origin zones, two destination zones and seven links is solved in GAMS, and the set of Pareto points is obtained. There are 15 efficient solutions; across these solutions, as the cost function value increases, the emission function value decreases, and vice versa.
Keywords: epsilon-constraint, multi-objective, network design, stochastic
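The epsilon-constraint loop described above can be sketched on a toy discrete problem. The candidate (cost, emission) pairs below are hypothetical; a real NDP would solve a nonlinear program at each epsilon value rather than enumerate a finite design set.

```python
def epsilon_constraint_pareto(candidates):
    """For each epsilon taken from the observed emission values (worst to
    best), minimize cost subject to emission <= epsilon, and collect the
    resulting (cost, emission) Pareto points."""
    pareto = set()
    for eps in sorted({e for _, e in candidates}, reverse=True):
        feasible = [p for p in candidates if p[1] <= eps]
        pareto.add(min(feasible))  # min cost; ties broken by lower emission
    return sorted(pareto)

# Hypothetical (cost, emission) pairs for five network designs.
designs = [(100, 9.0), (120, 6.5), (150, 5.0), (200, 3.0), (210, 2.9)]
```

Sorting the resulting set by cost shows the expected trade-off: as cost rises, emission falls.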
Procedia PDF Downloads 651
30039 Well Inventory Data Entry: Utilization of Developed Technologies to Progress the Integrated Asset Plan
Authors: Danah Al-Selahi, Sulaiman Al-Ghunaim, Bashayer Sadiq, Fatma Al-Otaibi, Ali Ameen
Abstract:
In light of recent changes affecting the Oil & Gas Industry, optimization measures have become imperative for all companies globally, including Kuwait Oil Company (KOC). To keep abreast of the dynamic market, a detailed Integrated Asset Plan (IAP) was developed to drive optimization across the organization, facilitated through the in-house developed software “Well Inventory Data Entry” (WIDE). This comprehensive and integrated approach enabled centralization of all planned asset components for better well planning, performance enhancement, and continuous improvement through performance tracking and midterm forecasting. This was previously hard to achieve, as various legacy methods were used. This paper briefly describes the methods successfully adopted to meet the company’s objective. IAPs were initially designed using computerized spreadsheets. However, as the data captured became more complex and the number of stakeholders requiring and updating this information grew, the need to automate the conventional spreadsheets became apparent. WIDE, already in use elsewhere in the company (namely, in the Workover Optimization project), was utilized to meet the dynamic requirements of the IAP cycle. With the growth of extensive features to enhance the planning process, the tool evolved into a centralized data-hub for all asset-groups and technical support functions to analyze and draw inferences from, leading WIDE to become the reference two-year operational plan for the entire company. To achieve WIDE’s goal of operational efficiency, asset-groups continuously add their parameters in a series of predefined workflows, creating a structured process that allows risk factors to be flagged and helps mitigate them. The tool assigns responsibilities to all stakeholders in a way that enables continuous updates of daily performance measures and operational use.
The reliable availability of WIDE, combined with its user-friendliness and easy accessibility, created a platform of cross-functionality amongst all asset-groups and technical support groups for updating their respective planning parameters. The home-grown entity was implemented across the entire company and tailored to feed into the internal processes of several stakeholders. Furthermore, the implementation of change management and root cause analysis techniques captured the dysfunctionality of previous plans, which in turn improved the already existing planning mechanisms within the IAP. The detailed elucidation of the two-year plan flagged any upcoming risks and shortfalls foreseen in the plan. All results were translated into a series of developments that propelled the tool’s capabilities beyond planning and into operations (such as asset production forecasts, setting KPIs, and estimating operational needs). This process exemplifies the ability and reach of applying advanced development techniques to seamlessly integrate the planning parameters of various assets and technical support groups. These techniques enhance the integration of planning data workflows, ultimately laying the foundation for greater accuracy and reliability. As such, benchmarks establishing a set of standard goals are created to ensure constant improvement in the efficiency of the entire planning and operational structure.
Keywords: automation, integration, value, communication
Procedia PDF Downloads 150
30038 Budget and the Performance of Public Enterprises: A Study of Selected Public Enterprises in Nasarawa State Nigeria (2009-2013)
Authors: Dalhatu, Musa Yusha’u, Shuaibu Sidi Safiyanu, Haliru Musa Hussaini
Abstract:
This study examined the budget and performance of public enterprises in Nasarawa State, Nigeria over the period 2009-2013. The study utilized secondary data on budget allocation and revenue generation obtained from four selected parastatals for the period under review. The simple correlation coefficient was used to analyze the extent of the relationship between budget allocation and revenue generation of the parastatals. Findings revealed varying results: there was a weak positive correlation (0.21) between expenditure and revenue of the Nasarawa Investment and Property Development Company (NIPDC). However, the study further revealed strong or weak negative relationships between revenue and expenditure for the following parastatals over the period under review: Nasarawa State Water Board, -0.27 (weak); Nasarawa State Broadcasting Service, -0.52 (strong); and Nasarawa State College of Agriculture, -0.36 (weak). The study therefore recommends that government increase its investment in NIPDC to enhance efficiency and profitability, and that government strengthen fiscal responsibility, accountability and transparency in public parastatals.
Keywords: budget, public enterprises, revenue, enterprise
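The simple correlation coefficient used in the study is the standard Pearson product-moment formula; a minimal sketch follows. The budget and revenue figures are hypothetical, not the parastatals' actual data.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

budget  = [4.0, 5.2, 5.9, 6.4, 7.1]   # hypothetical allocations, 2009-2013
revenue = [1.1, 1.0, 1.4, 1.3, 1.6]   # hypothetical revenue generated
```

A value near +1 indicates that higher allocations accompanied higher revenue, near -1 the opposite, and near 0 no linear relationship.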
Procedia PDF Downloads 265
30037 Coordinated Voltage Control in a Radial Distribution System
Authors: Shivarudraswamy, Anubhav Shrivastava, Lakshya Bhat
Abstract:
Distributed generation has become a major area of interest in recent years. Distributed generation can serve a large number of loads on a power line and hence offers better efficiency than conventional methods. However, there are certain drawbacks associated with it, voltage rise being the major one. This paper addresses voltage control at the buses of an IEEE 30-bus system by regulating reactive power. For the analysis, suitable locations for placing distributed generators (DG) are identified through load flow analysis, by observing where the voltage profile dips. MATLAB programming is used to regulate the voltage at all buses to within +/-5% of the base value even after the introduction of DGs. Three methods for voltage regulation are discussed, and a sensitivity-based analysis is then carried out to determine the priority among them.
Keywords: distributed generators, distributed system, reactive power, voltage control
Procedia PDF Downloads 502
30036 An Analysis of Discourse Markers Awareness in Writing Undergraduate Thesis of English Education Student in Sebelas Maret University
Authors: Oktanika Wahyu Nurjanah, Anggun Fitriana Dewi
Abstract:
An undergraduate thesis is a piece of academic writing that should fulfill several characteristics, one of which is coherence. The coherence of a text depends in part on the use of discourse markers; in other words, discourse markers play an essential role in writing. Therefore, the researchers aim to assess awareness of discourse marker usage in the writing of an undergraduate thesis by an English Education student at Sebelas Maret University. This research uses a qualitative case study in order to obtain a deep analysis. The sample is an undergraduate thesis of an English Education student at Sebelas Maret University, chosen based on several criteria. Guided by the literature, the researchers grouped the discourse markers by function and carried out the analysis accordingly. The analysis found that awareness of discourse marker usage is moderate. Finally, the researchers suggest that undergraduate students familiarize themselves with discourse markers, especially those who intend to write a thesis.
Keywords: discourse markers, English education, thesis writing, undergraduate student
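Grouping discourse markers by function, as described above, can be illustrated with a small counting script. The marker inventory below is a hypothetical fragment for illustration, not the taxonomy the researchers used.

```python
import re

# Hypothetical, partial inventory of markers grouped by function.
MARKERS = {
    "contrastive": ["however", "nevertheless", "on the other hand"],
    "elaborative": ["in other words", "for example", "furthermore"],
    "inferential": ["therefore", "thus", "as a result"],
}

def count_markers(text):
    """Count occurrences of each functional category of discourse marker."""
    text = text.lower()
    counts = {}
    for category, markers in MARKERS.items():
        counts[category] = sum(
            len(re.findall(r"\b" + re.escape(m) + r"\b", text))
            for m in markers)
    return counts
```

Applied to a thesis chapter, the per-category counts give a rough picture of which marker functions the writer relies on.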
Procedia PDF Downloads 363
30035 The Role of Reading Self-Efficacy and Perception of Difficulty in English Reading among Chinese ESL Learners
Authors: Kevin Chan, Kevin K. H. Chung, Patcy P. S. Yeung, H. L. Ip, Bill T. C. Chung, Karen M. K. Chung
Abstract:
Purpose: Recent evidence shows that reading self-efficacy and students' perceived difficulty in reading are significantly associated with word reading and reading fluency. However, little is known about these relationships among students learning to read English as a second language, particularly Chinese students. This study examined the contributions of reading self-efficacy, perception of difficulty in reading, and cognitive-linguistic skills to performance on English word reading and reading fluency in Chinese students. Method: A sample of 122 second- and third-grade students in Hong Kong, China, participated in this study. Students completed measures of reading self-efficacy and perception of difficulty in reading. They were assessed on their English cognitive-linguistic and reading skills: rapid automatized naming, nonword reading, phonological awareness, word reading, and one-minute word reading. Results: Path analysis indicated that when students' grades were controlled, reading self-efficacy was a significant correlate of word reading and reading fluency, whereas perception of difficulty in reading negatively predicted word reading. Conclusion: These findings underscore the importance of taking students' reading self-efficacy, perception of difficulty in reading, and cognitive-linguistic skills into consideration when designing reading interventions and instruction for students learning English as a second language.
Keywords: self-efficacy, perception of difficulty in reading, English as a second language, word reading
Procedia PDF Downloads 192
30034 A Semantic and Concise Structure to Represent Human Actions
Authors: Tobias Strübing, Fatemeh Ziaeetabar
Abstract:
Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, a semantic framework is needed. For this purpose, the Semantic Event Chain (SEC) method has been presented, based on the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates information about static (e.g., top, bottom) and dynamic spatial relations (e.g., moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a large matrix with thirty rows and a massive set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used for manipulation actions, which involve at most two hands. Here, we would like to extend this approach to a whole-body action descriptor and build a conjoint activity representation structure. For this purpose, we perform a statistical analysis to modify the current eSEC by summarizing it while preserving its features, and introduce a new version called the Enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, and 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row statistically, to see whether a particular row could be removed while keeping all manipulations distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the distinguishability of the predefined manipulation actions.
Therefore, by performing the above analyses, we arrived at the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on interactions between humans and robots. It also creates a comprehensive platform for integration with body-limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis
Procedia PDF Downloads 129
30033 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control
Authors: Sangwon Han, Chengquan Jin
Abstract:
Time-cost trade-off (TCTO) analysis is one of the most significant parts of construction project management. Despite its significance, current TCTO analysis, based on the Critical Path Method (CPM), does not consider resource constraints, and accordingly sometimes generates schedule plans that are impractical and/or infeasible in terms of resource availability. Therefore, resource constraints need to be considered in TCTO analysis. In this research, a genetic algorithm (GA) based optimization model is created to find the optimal schedule. This model is used to compare four distinct scenarios (i.e., 1) the initial CPM schedule, 2) TCTO without resource constraints, 3) resource allocation after TCTO, and 4) TCTO with resource constraints) in terms of duration, cost, and resource utilization. The comparison identifies that TCTO with resource constraints generates the optimal schedule with respect to duration, cost, and resources. This verifies the need to consider resource constraints in TCTO analysis. It is expected that the proposed model will produce more feasible and optimal schedules.
Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability
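A GA-based TCTO search of the kind described can be sketched on a toy problem. This version assumes activities in series, each with a normal and a crash option, and a hard deadline enforced by a penalty; this is a simplification of the paper's model, and the activity data are hypothetical.

```python
import random

# (normal_days, normal_cost, crash_days, crash_cost) per activity, in series.
ACTIVITIES = [(8, 100, 5, 160), (6, 80, 4, 130), (10, 120, 7, 200),
              (5, 60, 3, 110), (7, 90, 5, 140)]
DEADLINE = 26

def evaluate(chromosome):
    """Total cost with a heavy penalty if the schedule misses the deadline."""
    days = sum(a[2] if g else a[0] for g, a in zip(chromosome, ACTIVITIES))
    cost = sum(a[3] if g else a[1] for g, a in zip(chromosome, ACTIVITIES))
    penalty = 1000 * max(0, days - DEADLINE)
    return cost + penalty, days

def ga_tcto(pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    n = len(ACTIVITIES)
    # Seed the population with the all-crash schedule so a feasible
    # individual is guaranteed from the start.
    pop = [[1] * n] + [[rng.randint(0, 1) for _ in range(n)]
                       for _ in range(pop_size - 1)]
    for _ in range(generations):
        pop.sort(key=lambda c: evaluate(c)[0])
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # bit-flip mutation
                i = rng.randrange(n)
                child[i] = 1 - child[i]
            children.append(child)
        pop = survivors + children
    best = min(pop, key=lambda c: evaluate(c)[0])
    return best, evaluate(best)
```

Because the best individual always survives, the returned schedule is at worst the all-crash plan, and typically cheaper while still meeting the deadline.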
Procedia PDF Downloads 191
30032 High Frequency Rotary Transformer Used in Synchronous Motor/Generator of Flywheel Energy Storage System
Authors: J. Lu, H. Li, F. Cole
Abstract:
This paper proposes a high-frequency rotary transformer (HFRT) for a separately excited synchronous machine (SESM) used in a flywheel energy storage system. The SESM can eliminate or reduce rare earth permanent magnet (REPM) usage and provide better performance in renewable energy systems. However, the major drawback of such a SESM is the need for brushes and slip rings to supply the field current, which increases maintenance cost and reduces operational reliability. To overcome these problems, an HFRT integrated with SiC semiconductor devices can replace the brushes and slip rings in the SESM. The proposed HFRT features a high-frequency magnetic ferrite for both the stationary part, as the transformer primary, and the rotating part, as the transformer secondary, as well as an air gap allowing safe operation at high rotational speeds. Hence, this rotary transformer enables the adoption of a wound rotor synchronous machine (WRSM). The HFRT, working at over 100 kHz operating frequency, exhibits excellent power efficiency and a significant size reduction. Experimental validations supporting the theoretical findings are provided.
Keywords: brushes and slip rings, flywheel energy storage, high frequency rotary transformer, separately excited synchronous machine
Procedia PDF Downloads 51
30031 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources
Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha
Abstract:
Training a language model for a minority language is a challenging task. The lack of corpora available to train and fine-tune state-of-the-art language models remains a challenge in Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits this task. In this paper, we present the following contributions: (1) we introduce and use a translation pair set of Tagalog and English (TL-EN) in pre-training a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking pre-trained models for the semantic textual similarity binary task (STSB) on both TL-EN and STS dataset pairs; and (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, the models pre-trained on translation pairs and STS pairs perform well on the STSB task. Also, reducing a model to a smaller dimension has no negative effect on performance but rather notably increases the similarity scores. Moreover, models pre-trained on a similar dataset have a substantial effect on the model's performance scores.
Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models
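Semantic textual similarity between two sentence embeddings is typically scored with cosine similarity; a minimal sketch follows. The vectors in the test are hypothetical stand-ins for a transformer's sentence encodings, not outputs of the authors' models.

```python
from math import sqrt

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors: 1 for parallel
    vectors, 0 for orthogonal ones."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sqrt(sum(a * a for a in u))
    nv = sqrt(sum(b * b for b in v))
    return dot / (nu * nv)
```

For a binary STS task, the score is usually thresholded to decide whether a sentence pair counts as similar.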
Procedia PDF Downloads 216
30030 Evaluation of Different Liquid Scintillation Counting Methods for 222Rn Determination in Waters
Authors: Jovana Nikolov, Natasa Todorovic, Ivana Stojkovic
Abstract:
Monitoring of 222Rn in drinking or surface waters, as well as in groundwater, has been performed in connection with geological, hydrogeological and hydrological surveys and health hazard studies. Liquid scintillation counting (LSC) is often the preferred analytical method for 222Rn measurements in waters because it allows automatic analysis of multiple samples. The LSC method involves mixing water samples with an organic scintillation cocktail, which triggers radon diffusion from the aqueous into the organic phase, for which it has a much greater affinity, thereby eliminating the possibility of radon emanation. Two direct LSC methods that assume different sample compositions are presented, optimized and evaluated in this study. The one-phase method mixes a 10 ml sample directly with 10 ml of an emulsifying cocktail (Ultima Gold AB scintillation cocktail is used). The two-phase method involves water-immiscible cocktails (in this study, High Efficiency Mineral Oil Scintillator, Opti-Fluor O and Ultima Gold F are used). Calibration samples were prepared with an aqueous 226Ra standard in 20 ml glass vials and counted on the ultra-low background spectrometer Quantulus 1220TM, equipped with a PSA (Pulse Shape Analysis) circuit that discriminates alpha and beta spectra. Since the calibration procedure is carried out with a 226Ra standard, which has both alpha- and beta-emitting progenies, the PSA discriminator is clearly vital for reliable and precise spectrum separation. Consequently, the calibration procedure investigated the influence of the PSA discriminator level on the 222Rn detection efficiency, using the 226Ra calibration standard over a wide range of activity concentrations. Evaluation of the presented methods was based on the obtained detection efficiencies and the achieved Minimal Detectable Activity (MDA).
Comparison of the presented methods, their accuracy and precision, as well as the performance of the different scintillation cocktails, was based on measurements of 226Ra-spiked water samples of known activity and of environmental samples.
Keywords: 222Rn in water, Quantulus 1220TM, scintillation cocktail, PSA parameter
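The Minimal Detectable Activity used to evaluate the methods above is commonly computed with the Currie detection limit; a sketch under that assumption (the abstract does not state which MDA formula the authors used, and the unit conversion below assumes counts already corrected for decay and blank):

```python
from math import sqrt

def currie_mda(background_counts, efficiency, count_time_s, volume_l):
    """Minimum detectable activity (Bq/L) from the Currie detection limit
    L_D = 2.71 + 4.65 * sqrt(B), where B is the background counts
    accumulated over the counting time."""
    l_d = 2.71 + 4.65 * sqrt(background_counts)
    return l_d / (efficiency * count_time_s * volume_l)
```

As expected, a higher detection efficiency or a longer counting time lowers the MDA.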
Procedia PDF Downloads 206
30029 Meteosat Second Generation Image Compression Based on the Radon Transform and Linear Predictive Coding: Comparison and Performance
Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane
Abstract:
Image compression is used to reduce the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite acquires 12 image files every 15 minutes, which results in large database sizes. The transform selected for image compression should help reduce the data representing the images. The Radon transform retrieves the Radon points that represent the sums of the pixels along a given angle for each direction. Linear predictive coding (LPC) with filtering provides a good decorrelation of the Radon points, using a predictor constituted by the Symmetric Nearest Neighbor (SNN) filter coefficients, which introduces losses during decompression. Finally, Run Length Coding (RLC) gives a high and fixed compression ratio regardless of the input image. In this paper, a novel image compression method for MSG images based on the Radon transform and linear predictive coding (LPC) is proposed; it provides a good compromise between compression and reconstruction quality. Our method is compared with three others, two based on the DCT and one on DWT bi-orthogonal filtering, to show the strength of the Radon transform in its resistance to quantization noise and to evaluate the performance of our method. Evaluation criteria such as PSNR and the compression ratio show the efficiency of our compression method.
Keywords: image compression, radon transform, linear predictive coding (LPC), run length coding (RLC), meteosat second generation (MSG)
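The Run Length Coding stage can be illustrated with a minimal encoder/decoder pair. This is generic RLC, not the paper's exact bitstream format; the sample sequence is hypothetical.

```python
def rle_encode(data):
    """Collapse runs of repeated symbols into (symbol, count) pairs."""
    if not data:
        return []
    runs = []
    current, count = data[0], 1
    for symbol in data[1:]:
        if symbol == current:
            count += 1
        else:
            runs.append((current, count))
            current, count = symbol, 1
    runs.append((current, count))
    return runs

def rle_decode(runs):
    """Expand (symbol, count) pairs back into the original sequence."""
    return [s for s, n in runs for _ in range(n)]
```

RLC pays off exactly when the upstream stages (here, the Radon transform and LPC prediction residuals) produce long runs of identical values.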
Procedia PDF Downloads 425
30028 Artificial Bee Colony Optimization for SNR Maximization through Relay Selection in Underlay Cognitive Radio Networks
Authors: Babar Sultan, Kiran Sultan, Waseem Khan, Ijaz Mansoor Qureshi
Abstract:
In this paper, a novel idea for performance enhancement of the secondary network in Underlay Cognitive Radio Networks (CRNs) is proposed. In Underlay CRNs, primary users (PUs) impose strict interference constraints on the secondary users (SUs). The proposed scheme is based on Artificial Bee Colony (ABC) optimization for relay selection and power allocation, addressing this primary challenge of Underlay CRNs. ABC is a simple, population-based optimization algorithm that attains a globally optimal solution by combining local search (employed and onlooker bees) with global search (scout bees). The proposed two-phase relay selection and power allocation algorithm aims to maximize the signal-to-noise ratio (SNR) at the destination while operating in underlay mode. The proposed algorithm has low computational complexity, and its performance is verified through simulation results for different numbers of potential relays, different interference threshold levels and different transmit power thresholds for the selected relays.
Keywords: artificial bee colony, underlay spectrum sharing, cognitive radio networks, amplify-and-forward
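The employed/onlooker/scout structure of ABC described above can be sketched on a generic continuous objective. This is a textbook-style ABC minimizer on a toy function, not the paper's SNR model or its two-phase relay-selection algorithm.

```python
import random

def abc_minimize(f, dim, bounds, colony=10, limit=10, iters=100, seed=3):
    """Minimal Artificial Bee Colony: employed and onlooker bees perform
    local search around food sources; scout bees replace exhausted ones."""
    rng = random.Random(seed)
    lo, hi = bounds

    def rand_source():
        return [rng.uniform(lo, hi) for _ in range(dim)]

    def neighbor(i):
        """Perturb one dimension of source i toward/away from a random source."""
        k, j = rng.randrange(colony), rng.randrange(dim)
        cand = sources[i][:]
        cand[j] += rng.uniform(-1, 1) * (sources[i][j] - sources[k][j])
        cand[j] = min(hi, max(lo, cand[j]))
        return cand

    sources = [rand_source() for _ in range(colony)]
    fitness = [f(s) for s in sources]
    trials = [0] * colony
    best = min(zip(fitness, sources))
    for _ in range(iters):
        # Employed bees: one local move per food source, greedy acceptance.
        for i in range(colony):
            cand = neighbor(i)
            fc = f(cand)
            if fc < fitness[i]:
                sources[i], fitness[i], trials[i] = cand, fc, 0
            else:
                trials[i] += 1
        # Onlooker bees: concentrate extra moves on the better half.
        for i in sorted(range(colony), key=lambda i: fitness[i])[:colony // 2]:
            cand = neighbor(i)
            fc = f(cand)
            if fc < fitness[i]:
                sources[i], fitness[i], trials[i] = cand, fc, 0
        # Scout bees: abandon sources that stopped improving.
        for i in range(colony):
            if trials[i] > limit:
                sources[i], trials[i] = rand_source(), 0
                fitness[i] = f(sources[i])
        best = min(best, min(zip(fitness, sources)))
    return best  # (best objective value, best source)
```

For SNR maximization, as in the paper, one would minimize the negative SNR over the relay/power search space instead of the sphere function used in the test.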
Procedia PDF Downloads 584
30027 Integrating Data Envelopment Analysis and Variance Inflation Factor to Measure the Efficiency of Decision Making Units
Authors: Mostafa Kazemi, Zahra N. Farkhani
Abstract:
This paper proposes an integrated Data Envelopment Analysis (DEA) and Variance Inflation Factor (VIF) model for measuring the technical efficiency of decision making units. The model is validated on a data set of 69 dairy-product sales representatives. The analysis is done in two stages: in the first stage, the VIF technique is used to distinguish independent effective factors of resellers, and in the second stage DEA is used to measure efficiency under both constant and variable returns to scale. DEA is further used to examine the effect of environmental factors on efficiency. The results indicate an average managerial efficiency of 83% across the sales representatives. In addition, technical and scale efficiency were 96% and 80%, respectively. 38% of the sales representatives have a technical efficiency of 100%, and 72% are quite efficient in terms of managerial efficiency. High levels of relative efficiency indicate a good condition for sales representative efficiency.
Keywords: data envelopment analysis (DEA), relative efficiency, sales representatives' dairy products, variance inflation factor (VIF)
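The first-stage VIF screening can be sketched directly: for each factor, regress it on the remaining factors and compute VIF = 1/(1 - R²); values well above 1 flag collinear factors to drop. A minimal sketch on synthetic data (not the resellers' actual factors):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n_samples x n_features):
    regress column j on the other columns (with intercept) and report
    1 / (1 - R_j^2)."""
    n, p = X.shape
    vifs = []
    for j in range(p):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])   # add intercept term
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        ss_res = float(resid @ resid)
        ss_tot = float(((y - y.mean()) ** 2).sum())
        r2 = 1.0 - ss_res / ss_tot
        vifs.append(1.0 / (1.0 - r2) if r2 < 1.0 else float("inf"))
    return vifs
```

Independent factors give VIFs near 1; a factor that is nearly a linear combination of the others gets a very large VIF and would be removed before the DEA stage.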
Procedia PDF Downloads 574
30026 Dynamics Characterizations of Dielectric Electro-Active Polymer Pull Actuator for Vibration Control
Authors: A. M. Wahab, E. Rustighi
Abstract:
Elastomeric dielectric materials have recently become a new alternative for actuator technology. The ability of dielectric elastomers, placed between two electrodes, to withstand large strains when the electrodes are charged has attracted the attention of many researchers studying this material for actuator technology. Thus, in the past few years, Danfoss Ventures A/S has established its own dielectric electro-active polymer (DEAP), called PolyPower. The main objective of this work was to investigate the dynamic characteristics, for vibration control, of a PolyPower actuator folded in a ‘pull’ configuration. A range of experiments was carried out on the folded actuator, including passive (without electrical load) and active (with electrical load) testing. For both categories, static and dynamic tests were done to determine the behavior of the folded DEAP actuator. Voltage-strain experiments show that the folded DEAP actuator is a nonlinear system. It is also shown that the voltage supplied has no effect on the natural frequency. Finally, varying the AC voltage amplitude and frequency reveals the parameters that influence the performance of the folded DEAP actuator. As a result, actuator performance is dominated by the frequency dependence of the elastic response and is less influenced by the dielectric properties.
Keywords: dielectric electro-active polymer, pull actuator, static, dynamic, electromechanical
Procedia PDF Downloads 254
30025 Effect of Supplementing Different Sources and Levels of Phytase Enzyme to Diets on Productive Performance for Broiler Chickens
Authors: Sunbul Jassim Hamodi, Muna Khalid Khudayer, Firas Muzahem Hussein
Abstract:
The experiment was conducted to study the effect of supplementing different sources of phytase enzyme (bacterial, fungal, enzyme mixture) at levels of 250, 500 and 750 FTY/kg feed, compared with a control, on the performance of 1,050 broiler chicks (Ross 308) from 1 day old (initial weight 39.78 g) until 42 days. The study involved 10 treatments with three replicates per treatment (35 chicks/replicate), as follows: T1: control diet (no addition); T2, T3, T4: bacterial phytase at 250, 500 and 750 FTY/kg feed; T5, T6, T7: fungal phytase at 250, 500 and 750 FTY/kg feed; T8, T9, T10: enzyme mixture at 250, 500 and 750 U/kg feed. The results revealed that supplementing 750 U of the enzyme mixture significantly (p < 0.05) increased body weight at 6 weeks compared with bacterial phytase at 250 and 750 FTY/kg feed and fungal phytase at 750 FTY/kg feed. Supplementation also improved cumulative weight gain in the 500 FTY bacterial, 250 and 500 FTY fungal, and 250, 500 and 750 U enzyme-mixture treatments compared with fungal phytase at 750 FTY/kg feed. Cumulative feed consumption for fungal phytase at 500 FTY/kg feed and the enzyme mixture at 250 U/kg feed increased significantly compared with the control group and fungal phytase at 750 FTY/kg feed during weeks 1-6. Cumulative feed conversion improved significantly for the enzyme mixture at 500 U/kg feed compared with the worst feed conversion ratio, recorded for bacterial phytase at 250 FTY/kg feed.
No significant differences were found between treatments in internal organ relative weights, carcass cuts, dressing percentage, or production index. Mortality increased in the 750 FTY fungal phytase/kg feed treatment compared with the other treatments.
Keywords: phytase, phytic acid, broiler, productive performance
Procedia PDF Downloads 303
30024 The Ultimate Scaling Limit of Monolayer Material Field-Effect-Transistors
Authors: Y. Lu, L. Liu, J. Guo
Abstract:
Monolayer graphene and dichalcogenide semiconductor materials attract extensive research interest for potential nanoelectronics applications. The ultimate scaling limit of double-gate MoS2 field-effect transistors (FETs) with a monolayer thin body is examined and compared with ultra-thin-body Si FETs by using self-consistent quantum transport simulation in the presence of phonon scattering. Modelling of phonon scattering, quantum mechanical effects, and self-consistent electrostatics allows us to accurately assess the performance potential of monolayer MoS2 FETs. The results reveal that monolayer MoS2 FETs show 52% smaller drain-induced barrier lowering (DIBL) and 13% smaller subthreshold swing (SS) than 3 nm-thick-body Si FETs at a channel length of 10 nm with the same gating. With a requirement of SS < 100 mV/dec, the scaling limit of monolayer MoS2 FETs is assessed to be 5 nm, compared with 8 nm for the ultra-thin-body Si counterparts, owing to the monolayer thin body and the higher effective mass, which reduces direct source-to-drain tunnelling. Compared against the ITRS 2023 target for high-performance logic devices, double-gate monolayer MoS2 FETs can fulfil the ITRS requirements.
Keywords: nanotransistors, monolayer 2D materials, quantum transport, scaling limit
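The two figures of merit compared here, SS and DIBL, are straightforward to extract from transfer curves. A minimal sketch, using idealized illustrative I-V data rather than the paper's simulation output, might look like:

```python
import numpy as np

def subthreshold_swing(vg, id_, v_lo, v_hi):
    """Subthreshold swing in mV/dec: gate-voltage change per decade of drain current."""
    i_lo = np.interp(v_lo, vg, id_)
    i_hi = np.interp(v_hi, vg, id_)
    return 1e3 * (v_hi - v_lo) / (np.log10(i_hi) - np.log10(i_lo))

def dibl(vth_low_vd, vth_high_vd, vd_low, vd_high):
    """DIBL in mV/V: threshold-voltage shift per volt of added drain bias."""
    return 1e3 * (vth_low_vd - vth_high_vd) / (vd_high - vd_low)

# Illustrative ideal-like subthreshold curve: Id = I0 * 10^(Vg / 80 mV)
vg = np.linspace(0.0, 0.4, 41)
id_ = 1e-12 * 10.0 ** (vg / 0.080)

ss = subthreshold_swing(vg, id_, 0.1, 0.3)  # ~80 mV/dec by construction
d = dibl(vth_low_vd=0.35, vth_high_vd=0.30, vd_low=0.05, vd_high=0.55)  # 100 mV/V
```

The threshold voltages and bias points passed to `dibl` are hypothetical numbers chosen only to show the arithmetic.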
Procedia PDF Downloads 239
30023 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study
Authors: Ana Serafimovic, Karthik Devarajan
Abstract:
Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method aims to minimize the symmetric version of the Kullback-Leibler divergence, known as intrinsic information, and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma, and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods that also aim to minimize symmetric divergence measures.
Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence
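The multiplicative-update structure that such algorithms generalize is the classical Lee-Seung scheme. A minimal sketch for the one-sided KL divergence (a baseline illustration, not the authors' symmetric-KL algorithm) is:

```python
import numpy as np

def nmf_kl(V, rank, n_iter=500, eps=1e-10, seed=0):
    """Lee-Seung multiplicative updates minimizing the (one-sided) KL divergence.
    Shown as a baseline; the paper's method targets the symmetric KL divergence."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H

# Factor a small non-negative matrix with an exact non-negative rank-2 structure
V = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0],
              [3.0, 5.0, 7.0]])
W, H = nmf_kl(V, rank=2)
```

Because the updates are multiplicative, W and H stay non-negative throughout, which is what permits the additive, parts-based interpretation described above.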
Procedia PDF Downloads 248
30022 Case Study of Human Factors and Ergonomics in the Design and Use of Harness-Embedded Costumes in the Entertainment Industry
Authors: Marielle Hanley, Brandon Takahashi, Gerry Hanley, Gabriella Hancock
Abstract:
Safety harnesses and their protocols are very common within the construction industry, and the Occupational Safety and Health Administration has provided extensive guidelines, with protocols constantly updated to ensure the highest level of safety on construction sites. There is also extensive research on harnesses that are meant to keep people in place in moving vehicles, such as seatbelts. Though this research is comprehensive in these areas, the findings and recommendations are not generally applicable to other industry sectors where harnesses are used, such as the entertainment industry. The focus of this case study is on the design and use of harnesses worn by theme park employees in elaborate costumes during parades and performances. The key factors of posture, kinesthetic factors, and harness engineering interact in significantly different ways when the user is performing repetitive choreography with 20 to 40 lbs. of apparatus connected to harnesses that must be hidden from the audience's view. Human factors and ergonomic analysis takes into account the required performer behaviors, the physical and mental preparation and posture of the performer, the design of the harness-embedded costume, and the environmental conditions during the performance (e.g., wind) that can determine the physical stresses placed on the harness and performer. The uniqueness and expense of elaborate costumes frequently result in only one or two costumes being created for a production, and a variety of different performers need to fit into the same costume. Consequently, the harnesses should be adjustable if they are to minimize the physical and cognitive loads on the performer, but in practice they are frequently one-size-fits-all. The complexity of human and technology interactions produces a range of detrimental outcomes, from muscle strains to nerve damage, mental and physical fatigue, and reduced motivation to perform at peak levels.
Based on observations conducted over four years for this case study, a number of recommendations are made to institutionalize human factors and ergonomic analyses that can significantly improve the safety, reliability, and quality of performances with harness-embedded costumes in the entertainment industry. Human factors and ergonomic analyses can be integrated into the engineering design of the performance costumes with embedded harnesses, the conditioning and training of the performers using the costumes, the choreography of the performances within the staged setting, and the maintenance of the harness-embedded costumes. By applying human factors and ergonomic methodologies in the entertainment industry, industry management and support staff can significantly reduce the risk of injury, improve the longevity of unique performers, increase the longevity of the harness-embedded costumes, and produce the desired entertainment value for audiences.
Keywords: ergonomics in entertainment industry, harness-embedded costumes, performer safety, injury prevention
Procedia PDF Downloads 96
30021 Effects of Small Amount of Poly(D-Lactic Acid) on the Properties of Poly(L-Lactic Acid)/Microcrystalline Cellulose/Poly(D-Lactic Acid) Blends
Authors: Md. Hafezur Rahaman, Md. Sagor Hosen, Md. Abdul Gafur, Rasel Habib
Abstract:
This research is a systematic study of the effects of poly(D-lactic acid) (PDLA) on the properties of poly(L-lactic acid) (PLLA)/microcrystalline cellulose (MCC)/PDLA blends via stereocomplex crystallization. Blends were prepared with a constant percentage (3%) of MCC and different percentages of PDLA by the solution casting method. The blends were characterized by Fourier transform infrared spectroscopy (FTIR) to confirm blend compatibility, wide-angle X-ray scattering (WAXS) and scanning electron microscopy (SEM) for morphology analysis, and thermogravimetric analysis (TGA) and differential thermal analysis (DTA) for thermal property measurements. FTIR analysis confirmed that no new characteristic absorption peaks appeared in the spectrum; instead, peak shifts due to hydrogen bonding indicate compatibility of the blend components. The development of three new peaks in the XRD patterns strongly indicates the formation of stereocomplex crystallites in the PLLA structure with the addition of PDLA. TGA and DTA results indicate that PDLA can improve the heat resistance of the PLLA/MCC blends by increasing their degradation temperature; comparison of DTA peaks likewise confirms the improved thermal properties. SEM images show improved surface morphology.
Keywords: microcrystalline cellulose, poly(l-lactic acid), stereocomplex crystallization, thermal stability
Procedia PDF Downloads 142
30020 Identify the Renewable Energy Potential through Sustainability Indicators and Multicriteria Analysis
Authors: Camila Lima, Murilo Andrade Valle, Patrícia Teixeira Leite Asano
Abstract:
The growth in demand for electricity caused by human development, together with the depletion and environmental impacts associated with traditional sources of electricity generation, has made new energy sources increasingly encouraged and necessary for companies in the electricity sector. Based on this scenario, this paper assesses the negative environmental impacts associated with thermoelectric power plants in Brazil, pointing out the importance of using renewable energy sources to reduce environmental harm. The article identifies an alternative energy source, wind energy, for the municipalities of São Paulo, represented in georeferenced maps built with GIS, using sustainability indicators and multicriteria analysis as the basis of the decision-making process.
Keywords: GIS (geographic information systems), multicriteria analysis, sustainability, wind energy
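The multicriteria step can be illustrated with a simple additive-weighting sketch. The site names, criteria, and weights below are hypothetical, not taken from the study:

```python
# Hypothetical criteria for ranking candidate municipalities by wind-energy
# potential. Scores are pre-normalized to [0, 1] (higher is better for all
# criteria); the weights are illustrative only.
sites = {
    "Site A": {"wind_speed": 0.9, "grid_access": 0.6, "env_impact": 0.8},
    "Site B": {"wind_speed": 0.7, "grid_access": 0.9, "env_impact": 0.5},
    "Site C": {"wind_speed": 0.4, "grid_access": 0.8, "env_impact": 0.9},
}
weights = {"wind_speed": 0.5, "grid_access": 0.3, "env_impact": 0.2}

def weighted_score(criteria, weights):
    """Simple additive weighting: weighted sum of normalized criterion scores."""
    return sum(weights[k] * criteria[k] for k in weights)

ranking = sorted(sites, key=lambda s: weighted_score(sites[s], weights), reverse=True)
```

In a GIS workflow the same scoring would be applied cell by cell over georeferenced layers rather than to a handful of named sites.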
Procedia PDF Downloads 369
30019 The Non-Linear Analysis of Brain Response to Visual Stimuli
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli, but each of these methods has weaknesses for EEG analysis. For instance, the Fourier transform and wavelet transform are linear signal analysis methods and thus poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to visual stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to visual stimuli but also provide very good recommendations for clinical purposes.
Keywords: visual stimuli, brain response, EEG signal, fractal dimension, Hurst exponent, Jeffrey's measure
Procedia PDF Downloads 565
30018 Intrusion Detection in Computer Networks Using a Hybrid Model of Firefly and Differential Evolution Algorithms
Authors: Mohammad Besharatloo
Abstract:
Intrusion detection is an important research topic in network security because of the growing use of computer network services. Intrusion detection aims to detect unauthorized use or abuse of networks and systems by intruders; the intrusion detection system is therefore an efficient tool to control user access through predefined regulations. Since the data used in intrusion detection systems are high-dimensional, a proper representation is required to reveal their basic structure, so it is necessary to eliminate redundant features to create the best representative subset. In the proposed method, a hybrid model of the differential evolution and firefly algorithms is employed to choose the best subset of features. In addition, a decision tree and a support vector machine (SVM) are adopted to evaluate the quality of the selected features. First, the sorted population is divided into two sub-populations, and the two optimization algorithms are applied to these sub-populations, respectively. The sub-populations are then merged to create the next generation's population. The performance of the proposed method is evaluated on the KDD Cup 99 dataset. The simulation results show that the proposed method outperforms the other methods in this context.
Keywords: intrusion detection system, differential evolution, firefly algorithm, support vector machine, decision tree
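The feature-selection mechanics can be sketched with plain differential evolution alone on synthetic data. This toy uses a simple centroid-separation fitness rather than the paper's hybrid DE-firefly model or its SVM/decision-tree evaluation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for intrusion data: 2 informative features out of 6,
# the rest pure noise. Class labels are random 0/1.
n = 200
y = rng.integers(0, 2, n)
X = rng.standard_normal((n, 6))
X[:, 0] += 2.0 * y  # informative feature
X[:, 1] -= 2.0 * y  # informative feature

def fitness(mask):
    """Centroid separation over the selected features, minus a small
    per-feature penalty to discourage carrying noise dimensions."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    sep = float(np.linalg.norm(Xs[y == 1].mean(axis=0) - Xs[y == 0].mean(axis=0)))
    return sep - 0.1 * int(mask.sum())

def de_select(pop_size=20, n_gen=40, f=0.8, cr=0.9):
    """DE/rand/1/bin over [0, 1]^d; a feature is selected when its gene > 0.5."""
    d = X.shape[1]
    pop = rng.random((pop_size, d))
    fit = np.array([fitness(ind > 0.5) for ind in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            trial = np.where(rng.random(d) < cr,
                             np.clip(a + f * (b - c), 0.0, 1.0), pop[i])
            tf = fitness(trial > 0.5)
            if tf >= fit[i]:
                pop[i], fit[i] = trial, tf
    return pop[fit.argmax()] > 0.5

selected = de_select()  # boolean mask over the 6 features
```

In the paper's setting the fitness of a candidate subset would instead come from classifier performance (SVM or decision tree) on KDD Cup 99 features.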
Procedia PDF Downloads 97
30017 Static and Dynamical Analysis on Clutch Discs on Different Material and Geometries
Authors: Jairo Aparecido Martins, Estaner Claro Romão
Abstract:
This paper presents the static and cyclic stresses, in combination with fatigue analysis, resulting from loads applied to the friction discs commonly used in industrial clutches. The material chosen to simulate the friction discs under load is aluminum. The numerical simulation was performed with the software COMSOL Multiphysics. The results obtained for static loads showed sufficient stiffness for both geometries and the material utilized. From the fatigue standpoint, on the other hand, failure is clearly verified, which demonstrates the importance of both approaches, especially the dynamical analysis. The results and conclusions are based on the stresses on the disc, the counted stress cycles, and the fatigue usage factor.
Keywords: aluminum, industrial clutch, static and dynamic loading, numerical simulation
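The fatigue usage factor mentioned is conventionally computed with the Palmgren-Miner linear damage rule. The stress blocks and allowable cycle counts below are illustrative values only, not the paper's COMSOL results:

```python
# Palmgren-Miner rule: usage factor D = sum(n_i / N_i) over counted stress
# blocks, where n_i is the applied cycle count at a stress range and N_i the
# allowable cycles to failure at that range (from the material's S-N curve).
load_blocks = [
    # (applied cycles n_i, allowable cycles N_i) -- hypothetical figures
    (2.0e5, 1.0e6),
    (5.0e4, 2.0e5),
    (1.0e4, 5.0e4),
]

usage_factor = sum(n / N for n, N in load_blocks)
# D >= 1.0 indicates predicted fatigue failure under the assumed load spectrum.
fails = usage_factor >= 1.0
```

This is why a design can pass a static stiffness check yet still fail in fatigue: the usage factor accumulates over cycles rather than depending on a single peak stress.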
Procedia PDF Downloads 193
30016 Hydrology and Hydraulics Analysis of Beko Abo Dam and Appurtenant Structure Design, Ethiopia
Authors: Azazhu Wassie
Abstract:
This study evaluated the maximum design flood for the design of the appurtenant structures, using the available climatological and hydrological data for the study area. The maximum design flood was determined by flood frequency analysis. Using this method, the peak discharge is 32,583.67 m³/s; because the dam site is not at the gauged station, the estimate was transferred to the dam site, giving a peak discharge of 38,115 m³/s. The study was conducted in June 2023. The dam is built across a river to create a reservoir on its upstream side for impounding water. The water stored in the reservoir is used for various purposes, such as irrigation, hydropower, navigation, and fishing. The total average volume of annual runoff is estimated at 115.1 billion m³. The total potential of the land for irrigation development can exceed 3 million ha.
Keywords: dam design, flow duration curve, peak flood, rainfall, reservoir capacity, risk and reliability
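Flood frequency analysis of this kind is often based on the Gumbel (EV1) distribution fitted by the method of moments. The sketch below uses a short hypothetical annual peak-flow series, not the Beko Abo data:

```python
import math

# Hypothetical annual peak-flow series (m^3/s)
peaks = [8200, 10500, 9100, 12300, 7800, 11000, 9600, 13400, 8900, 10100]
n = len(peaks)
mean = sum(peaks) / n
std = math.sqrt(sum((q - mean) ** 2 for q in peaks) / (n - 1))

# Gumbel (EV1) parameters by the method of moments
alpha = math.sqrt(6) / math.pi * std  # scale parameter
u = mean - 0.5772 * alpha             # location (0.5772 = Euler-Mascheroni constant)

def design_flood(return_period):
    """Gumbel quantile: Q_T = u - alpha * ln(-ln(1 - 1/T))."""
    return u - alpha * math.log(-math.log(1 - 1 / return_period))

q100 = design_flood(100)  # 100-year design flood for the illustrative series
```

A transfer to an ungauged dam site, as described in the abstract, would then scale this gauged-site quantile (e.g., by a drainage-area ratio method).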
Procedia PDF Downloads 33
30015 Structure Conduct and Performance of Rice Milling Industry in Sri Lanka
Authors: W. A. Nalaka Wijesooriya
Abstract:
The increasing paddy production, the stabilization of domestic rice consumption, and the increasing dynamism of rice processing and domestic markets call for a rethinking of the general direction of the rice milling industry in Sri Lanka. The main purpose of the study was to explore levels of concentration in the rice milling industry in Polonnaruwa and Hambanthota, which are the country's major rice milling hubs. Concentration indices reveal that the rice milling industry operates as a weak oligopsony in Polonnaruwa and is highly competitive in Hambanthota. By actual quantity of paddy milled per day, 47% of mills process less than 8 Mt/day, 34% process 8-20 Mt/day, and the rest (19%) more than 20 Mt/day. In Hambanthota, nearly 50% of the mills fall in the 8-20 Mt/day range. Lack of milling experience, poor knowledge of milling technology, lack of capital, and difficulty finding an output market are the major entry barriers to the industry. The major problems faced by all rice millers are the lack of a uniform electricity supply and low-quality paddy. Many of the millers emphasized that the rice ceiling price is a constraint on producing quality rice. More than 80% of the millers in Polonnaruwa, the major parboiled rice producing area, have mechanical dryers, and nearly 22% of millers have modern machinery such as color sorters and water jet polishers. The major paddy purchasing channel of large-scale millers in Polonnaruwa is through brokers; in Hambanthota, the major channel is millers purchasing directly from paddy farmers. Millers in both districts sell rice mainly in Colombo and its suburbs. Huge variation can be observed in the amount of pledge loans (for paddy storage), and there is a strong relationship among storage capacity, credit affordability, and the scale of operation of rice millers. The interannual price fluctuation ranged from 30% to 35%.
Analysis of market margins using a series of secondary data shows that the farmers' share of the rice consumer price is stable or slightly increasing in both districts; in Hambanthota a greater share goes to the farmer. Only four mills have obtained Good Manufacturing Practices (GMP) certification from the Sri Lanka Standards Institution, and all of them are small-quantity rice exporters. Priority should be given to small and medium scale millers in the distribution of the PMB's stored paddy during the off season. The industry needs a proper rice grading system, and it is recommended to introduce a ceiling price based on rice graded according to the standards. Both husk and rice bran are underutilized; encouraging investment to establish a rice oil manufacturing plant in the Polonnaruwa area is highly recommended. The current taxation procedure needs to be restructured in order to ensure the sustainability of the industry.
Keywords: conduct, performance, structure (SCP), rice millers
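The concentration indices underlying this kind of structure-conduct-performance analysis can be sketched as follows; the capacity figures are hypothetical, not the surveyed mills':

```python
# Hypothetical daily milling capacities (Mt/day) for mills in one district
capacities = [25, 18, 12, 10, 8, 8, 6, 5, 4, 4]
total = sum(capacities)
shares = sorted((c / total for c in capacities), reverse=True)

# Four-firm concentration ratio: combined market share of the four largest mills
cr4 = sum(shares[:4])

# Herfindahl-Hirschman Index on percentage shares (0-10000 scale);
# below roughly 1500 is conventionally read as an unconcentrated market
hhi = sum((100 * s) ** 2 for s in shares)
```

A low CR4/HHI on the buying side would support the "highly competitive" reading for a district, while higher values would indicate oligopsony power over paddy farmers.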
Procedia PDF Downloads 332
30014 Study of Mobile Game Addiction Using Electroencephalography Data Analysis
Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez
Abstract:
Use of mobile phones has increased considerably over the past decade; currently, it is one of the main means of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones came to be used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems among many people, and several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine any distinction between mobile game addicted and non-addicted players using electroencephalography (EEG), based upon psycho-physiological indicators. The mobile players were asked to play a mobile game, and EEG signals were recorded by BIOPAC equipment with AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/sec (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions; the parietal lobe is involved in perception, understanding logic, and arithmetic; and the occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software of BIOPAC Systems. From the survey based on the CGS manual study (2010), it was concluded that five participants out of fifteen were in the addicted category; this was used as prior information for grouping the addicted and non-addicted by physiological analysis.
Statistical analysis showed that, by applying a clustering technique to the theta frequency range of the occipital area, the authors were able to categorize the addicted and non-addicted players.
Keywords: mobile game, addiction, psycho-physiology, EEG analysis
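The theta-band grouping can be sketched as band-power extraction followed by a simple two-cluster split. The signals below are synthetic stand-ins, not the study's recordings; only the 200 samples/s rate and 60 s window follow the abstract:

```python
import numpy as np

FS = 200  # sampling rate reported in the study (samples/s)

def theta_power(signal, fs=FS):
    """Power in the 4-8 Hz (theta) band from the periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= 4) & (freqs <= 8)
    return psd[band].sum()

def two_means_1d(values, n_iter=50):
    """Minimal 1-D k-means with k=2; returns True for the high-power cluster."""
    v = np.asarray(values, dtype=float)
    lo, hi = v.min(), v.max()          # initialize centers at the extremes
    labels = v > (lo + hi) / 2
    for _ in range(n_iter):
        labels = np.abs(v - hi) < np.abs(v - lo)
        lo, hi = v[~labels].mean(), v[labels].mean()
    return labels

# Synthetic 60 s windows: three with a strong 6 Hz component, two without
rng = np.random.default_rng(2)
t = np.arange(0, 60, 1 / FS)
windows = [a * np.sin(2 * np.pi * 6 * t) + rng.standard_normal(len(t))
           for a in (2, 2, 2, 0, 0)]
powers = [theta_power(w) for w in windows]
high_theta = two_means_1d(powers)
```

Real EEG would of course need artifact rejection and per-subject normalization before any such split is meaningful.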
Procedia PDF Downloads 169
30013 Analysis of the Extreme Hydrometeorological Events in the Theoretical Hydraulic Potential and Streamflow Forecast
Authors: Sara Patricia Ibarra-Zavaleta, Rabindranarth Romero-Lopez, Rosario Langrave, Annie Poulin, Gerald Corzo, Mathias Glaus, Ricardo Vega-Azamar, Norma Angelica Oropeza
Abstract:
The progressive change in climatic conditions worldwide has increased the frequency and severity of extreme hydrometeorological events (EHE). Mexico is an example: it has been affected by EHEs, suffering economic, social, and environmental losses. The objective of this research was to apply a Canadian distributed hydrological model (DHM) to tropical conditions and to evaluate its capacity to predict flows in a basin in the central Gulf of Mexico. In addition, the DHM (once calibrated and validated) was used to calculate the theoretical hydraulic power and to assess streamflow prediction performance in the presence of an EHE. The goodness-of-fit indicators between observed and simulated flows are satisfactory for calibration (NSE = 0.83, RSR = 0.021, BIAS = -4.3) and for temporal validation, which was assessed at two points: point one (NSE = 0.78, RSR = 0.113, BIAS = 0.054) and point two (NSE = 0.825, RSR = 0.103, BIAS = 0.063). The DHM showed its applicability in tropical environments and its ability to characterize the rainfall-runoff relationship in the study area. This work can serve as a tool for identifying vulnerabilities before floods and for the rational and sustainable management of water resources.
Keywords: HYDROTEL, hydraulic power, extreme hydrometeorological events, streamflow
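The goodness-of-fit indicators reported (NSE, RSR, and a bias measure) have standard formulas. The sketch below uses illustrative flow data, and the percent-bias definition shown may differ from the BIAS convention used in the paper:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; <= 0 is no better than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rsr(obs, sim):
    """RMSE standardized by the standard deviation of observations; 0 is perfect."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.sqrt(np.sum((obs - sim) ** 2)) / np.sqrt(np.sum((obs - obs.mean()) ** 2))

def pbias(obs, sim):
    """Percent bias: positive means the model underestimates on average."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100 * np.sum(obs - sim) / np.sum(obs)

# Illustrative daily flows (m^3/s), not the study's data
obs = [12.0, 15.0, 30.0, 55.0, 41.0, 22.0, 16.0]
sim = [13.0, 14.5, 27.0, 52.0, 44.0, 23.0, 15.0]
nse_val, rsr_val, pbias_val = nse(obs, sim), rsr(obs, sim), pbias(obs, sim)
```

Note that NSE and RSR are directly related (NSE = 1 - RSR²), so the two indicators always move together.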
Procedia PDF Downloads 345