Search results for: time series generation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22473


19983 Dependence of the Electro-Stimulation of Saccharomyces cerevisiae by Pulsed Electric Field at the Yeast Growth Phase

Authors: Jessy Mattar, Mohamad Turk, Maurice Nonus, Nikolai Lebovka, Henri El Zakhem, Eugene Vorobiev

Abstract:

The effects of electro-stimulation of S. cerevisiae cells in colloidal suspension by Pulsed Electric Fields (PEF) with electric field strength E = 20–2000 V.cm-1 and effective PEF treatment time tPEF = 10^−5–1 s were investigated. The experimental procedure included variations in the preliminary fermentation time and electro-stimulation by PEF treatment; plate counting was performed. At relatively high electric fields (E ≥ 1000 V.cm-1) and moderate PEF treatment times (tPEF > 100 µs), conductivity measurements showed extraction of ionic components from the yeast, which can be attributed to electroporation of the cell membranes. Cell counting revealed a dependence of colony size on the preliminary fermentation time tf and the power consumption W; however, no dependence was observed when the initial yeast concentration in the treated suspensions was varied.

Keywords: intensification, yeast, fermentation, electroporation, biotechnology

Procedia PDF Downloads 472
19982 Management and Evaluation of Developing Medical Device Software in Compliance with Regulations

Authors: Arash Sepehri bonab

Abstract:

One of the areas of significant development in medical devices has been software: as an integral component of a medical device, as a standalone device, and more recently, as applications on mobile devices. The risk related to a malfunction of standalone software used within healthcare is in itself not a criterion for whether or not it qualifies as a medical device. It is therefore essential to clarify criteria for the qualification of standalone software as a medical device. The number of software products and medical apps is continuously increasing, as is their use in health institutions (e.g., in clinics and doctors' surgeries) for diagnosis and treatment. Within the last decade, the use of information technology in healthcare has taken a growing role. Indeed, the adoption of an increasing number of computer devices has brought several benefits to the process of patient care and has allowed easier access to social and health care resources. At the same time, this trend gave rise to new challenges related to the use of these new technologies. Software used in healthcare can be classified as a medical device depending on how it is used and on its functional characteristics; if so classified, it must satisfy specific regulations. The aim of this work is to present a software development framework that enables the production of safe, high-quality medical device software, and to highlight the correspondence between each software development stage and the appropriate standard and/or regulation.

Keywords: medical devices, regulation, software, development, healthcare

Procedia PDF Downloads 112
19981 A Study on Aquatic Bycatch Mortality Estimation Due to Prawn Seed Collection and Alteration of Collection Method through Sustainable Practices in Selected Areas of Sundarban Biosphere Reserve (SBR), India

Authors: Samrat Paul, Satyajit Pahari, Krishnendu Basak, Amitava Roy

Abstract:

Fishing is a pivotal livelihood activity, especially in developing countries, and has been an important occupation for human society since the era of human settlement began. In simple terms, the non-target catch of any species during fishing is 'bycatch,' and fishing bycatch is neither a new fishery management issue nor a new problem. Sundarban is one of the world's largest mangrove areas, extending over 10,200 sq. km across India and Bangladesh. The resources of this largest mangrove biome are used commercially by the local inhabitants, especially forest fringe villagers (FFVs), to support their livelihoods. In Sundarban, over-fishing, especially the collection of post-larvae of wild Penaeus monodon, is a major concern: during the collection of P. monodon, other aquatic species are destroyed through bycatch mortality, which alters productivity and may negatively impact the biodiversity of the entire ecosystem. Wild prawn seed collection gear, such as small-mesh nets, poses a serious threat to aquatic stocks, since the catch is not limited to prawn seed larvae. As prawn seed collection is inexpensive, requires little monetary investment, and is lucrative, people readily take it up as a source of income. Wildlife Trust of India's (WTI) intervention in selected forest fringe villages of the Sundarban Tiger Reserve (STR) aimed to estimate and reduce the mortality of aquatic bycatch by involving local communities in a newly developed release method and by reducing the time they spend on prawn seed collection (PSC) through Alternate Income Generation (AIG). Bycatch samples were collected for taxonomic identification from March to October 2019, preserved in 70% ethyl alcohol, and identified morphologically by experts at the Zoological Survey of India (ZSI), Kolkata.
Around 74 different aquatic species were recorded: 11 species of molluscs, 41 species of fish (of which 31 were identified), and 22 species of crustaceans (of which 18 were identified). Around 13 species belonging to different orders and families could not be identified morphologically because they were collected at the juvenile stage. The study reveals that for every single prawn seed collected, eight individuals of associated fauna are lost. Zero bycatch mortality is not practical; rather, collectors should focus on bycatch reduction by avoiding capture, allowing escape, and reducing mortality, and should change their fishing method by increasing net mesh size to avoid non-target captures. However, as the prawns are small (generally 1–1.5 inches in length), increasing mesh size would make collection economically unprofitable for collectors. In this case, returning bycatch to the water is considered one of the best ways to reduce bycatch mortality and is a more sustainable practice.

Keywords: bycatch mortality, biodiversity, mangrove biome resource, sustainable practice, Alternate Income Generation (AIG)

Procedia PDF Downloads 156
19980 Parallel Vector Processing Using Multi-Level Orbital Data

Authors: Nagi Mekhiel

Abstract:

Many applications use vector operations by applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, which limits the performance gain of vector processing. We present a memory system that makes all of its contents available to processors in time, so that processors need not access the memory: each location is made available to all processors at a specific time. The data move in different orbits and become available to processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications, interleaving it with lower-level vector operations.
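The orbit mechanism sketched above can be illustrated with a toy simulation (a hypothetical model for intuition only, not the authors' hardware design): a vector operation is applied to the whole stream at the first orbit level, and elements are then promoted one at a time to the upper orbit, where a second operation handles the serial portion of the workload.

```python
# Toy model of the multi-level orbital memory idea (illustrative only):
# level 0 applies a vector operation to the whole data stream, then feeds
# elements one at a time to level 1, where a serial operation is applied.

def orbital_pipeline(stream, vector_op, serial_op):
    # Level-0 orbit: a single instruction applied to all elements (SIMD-like).
    level0 = [vector_op(x) for x in stream]
    # Promotion: elements move to the upper orbit one at a time,
    # where a processor applies the serial portion of the workload.
    level1 = []
    for x in level0:          # one element per "orbit revolution"
        level1.append(serial_op(x))
    return level1

result = orbital_pipeline([1, 2, 3, 4], lambda x: x * 2, lambda x: x + 1)
print(result)  # [3, 5, 7, 9]
```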

Keywords: memory organization, parallel processors, serial code, vector processing

Procedia PDF Downloads 273
19979 Generation of Knowledge with Self-Learning Methods for Ophthalmic Data

Authors: Klaus Peter Scherer, Daniel Knöll, Constantin Rieder

Abstract:

Problem and Purpose: Intelligent systems are available and helpful to support human decision processes, especially when complex surgical eye interventions must be performed. Normally, such a decision support system consists of a knowledge-based module, which is responsible for the real assistance power, provided by explanation and logical reasoning processes. The interview-based acquisition and generation of the complex knowledge itself is crucial, because there are different correlations between the complex parameters. In this project, (semi-)automated self-learning methods are therefore researched and developed to enhance the quality of such a decision support system. Methods: For ophthalmic data sets of real hospital patients, advanced data mining procedures are very helpful. In particular, subgroup analysis methods are developed, extended, and used to analyze and discover the correlations and conditional dependencies between the structured patient data. After finding causal dependencies, a ranking must be performed for the generation of rule-based representations. For this, anonymized patient data are transformed into a special machine-readable format. The imported data serve as input for conditional probability algorithms that calculate the parameter distributions with respect to a given goal parameter. Results: In the field of knowledge discovery, advanced methods and applications were used to produce operation- and patient-related correlations. New knowledge was generated by finding causal relations between the operational equipment, the medical instances, and the patient-specific history through a dependency ranking process. After transformation into association rules, logic-based representations were available for the clinical experts to evaluate the new knowledge. The structured data sets cover about 80 parameters as characteristic features per patient. 
For different extended patient groups (100, 300, 500), both single-target and multi-target values were set for the subgroup analysis, so the newly generated hypotheses could be interpreted with regard to their dependence on patient number. Conclusions: The aim and advantage of such a semi-automated self-learning process are the extension of the knowledge base by finding new parameter correlations. The discovered knowledge is transformed into association rules and serves as the rule-based representation of the knowledge in the knowledge base. Moreover, more than one goal parameter of interest can be considered by the semi-automated learning process. With ranking procedures, the strongest premises and also conjunctively associated conditions can be found to conclude the goal parameter of interest. In this way, knowledge hidden in structured tables or lists can be extracted as a rule-based representation. This is a real assistance power for communication with the clinical experts.
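The conditional-probability ranking described above can be sketched in a few lines (illustrative only; the records, feature names, and the single-premise restriction are assumptions, not the project's actual data or algorithm):

```python
from collections import Counter

# Hypothetical records: each patient is a set of categorical features.
records = [
    {"instrument_A", "history_X", "outcome_good"},
    {"instrument_A", "history_Y", "outcome_good"},
    {"instrument_B", "history_X", "outcome_poor"},
    {"instrument_A", "history_X", "outcome_good"},
]

def rank_rules(records, goal, min_support=2):
    """Rank single-premise rules 'feature -> goal' by conditional probability."""
    n_with = Counter()   # how often each feature occurs
    n_goal = Counter()   # how often it co-occurs with the goal parameter
    for r in records:
        for f in r - {goal}:
            n_with[f] += 1
            if goal in r:
                n_goal[f] += 1
    rules = [(f, n_goal[f] / n_with[f]) for f in n_with if n_with[f] >= min_support]
    return sorted(rules, key=lambda t: t[1], reverse=True)

print(rank_rules(records, "outcome_good"))
```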

Keywords: expert system, knowledge-based support, ophthalmic decision support, self-learning methods

Procedia PDF Downloads 254
19978 Numerical Solutions of the Generalized Burgers-Fisher Equation by a Modified Variational Iteration Method

Authors: M. O. Olayiwola

Abstract:

Numerical solutions of the generalized Burgers-Fisher equation are obtained using a Modified Variational Iteration Method (MVIM) with minimal computational effort. The computed results have been compared with results from other techniques. The present method is seen to be a very reliable alternative to some existing techniques for such nonlinear problems.
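For reference, a sketch of the setting as commonly given in the literature (the abstract itself does not state the equation or parameters, so the symbols below are the standard ones): the generalized Burgers-Fisher equation and the VIM correction functional with Lagrange multiplier λ = −1.

```latex
% Generalized Burgers-Fisher equation (standard form):
\[
\frac{\partial u}{\partial t}
  + \alpha\, u^{\delta}\,\frac{\partial u}{\partial x}
  - \frac{\partial^{2} u}{\partial x^{2}}
  = \beta\, u\left(1 - u^{\delta}\right)
\]
% VIM correction functional with Lagrange multiplier \lambda(s) = -1:
\[
u_{n+1}(x,t) = u_{n}(x,t)
  - \int_{0}^{t}\!\left[
      \frac{\partial u_{n}}{\partial s}
      + \alpha\, u_{n}^{\delta}\,\frac{\partial u_{n}}{\partial x}
      - \frac{\partial^{2} u_{n}}{\partial x^{2}}
      - \beta\, u_{n}\left(1 - u_{n}^{\delta}\right)
    \right] ds
\]
```

Each iterate refines the previous one by subtracting the integrated residual of the equation; the modification in MVIM variants typically concerns how the nonlinear terms or the initial approximation are expanded.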

Keywords: Burgers-Fisher equation, modified variational iteration method, Lagrange multiplier, Taylor series, partial differential equation

Procedia PDF Downloads 434
19977 Assessment of Multi-Domain Energy Systems Modelling Methods

Authors: M. Stewart, Ameer Al-Khaykan, J. M. Counsell

Abstract:

Emissions are a consequence of electricity generation. As a major option for low-carbon generation, local energy systems featuring Combined Heat and Power with solar PV (CHPV) have significant potential to increase energy performance, increase resilience, and offer greater control of local energy prices, while complementing the UK's emissions standards and targets. Recent advances in dynamic modelling and simulation of buildings and clusters of buildings using the IDEAS framework have successfully validated a novel multi-vector (simultaneous control of both heat and electricity) approach to integrating the wide range of primary and secondary plant typical of local energy system designs, including CHP, solar PV, gas boilers, absorption chillers, and thermal energy storage, together with the associated electrical and hot water networks, all operating under a single unified control strategy. Results from this work indicate through simulation that integrated control of thermal storage can play a pivotal role in optimizing system performance well beyond present expectations. Environmental impact analysis and reporting of all energy systems, including CHPV local energy systems (LES), presently employ a static annual-average carbon emissions intensity for grid-supplied electricity. This paper focuses on establishing and validating CHPV environmental performance against conventional emissions values and assessment benchmarks, analyzing emissions performance with and without an active thermal store in a notional group of non-domestic buildings. The results of this analysis are presented and discussed in the context of performance validation, quantifying the reduced environmental impact of CHPV systems with active energy storage in comparison with conventional LES designs.

Keywords: CHPV, thermal storage, control, dynamic simulation

Procedia PDF Downloads 247
19976 Comprehensive Longitudinal Multi-omic Profiling in Weight Gain and Insulin Resistance

Authors: Christine Y. Yeh, Brian D. Piening, Sarah M. Totten, Kimberly Kukurba, Wenyu Zhou, Kevin P. F. Contrepois, Gucci J. Gu, Sharon Pitteri, Michael Snyder

Abstract:

Three million deaths worldwide are attributed to obesity. However, the biomolecular mechanisms that describe the link between adiposity and subsequent disease states are poorly understood. Insulin resistance characterizes approximately half of obese individuals and is a major cause of obesity-mediated diseases such as Type II diabetes, hypertension and other cardiovascular diseases. This study makes use of longitudinal quantitative and high-throughput multi-omics (genomics, epigenomics, transcriptomics, glycoproteomics etc.) methodologies on blood samples to develop multigenic and multi-analyte signatures associated with weight gain and insulin resistance. Participants of this study underwent a 30-day period of weight gain via excessive caloric intake followed by a 60-day period of restricted dieting and return to baseline weight. Blood samples were taken at three different time points per patient: baseline, peak-weight and post weight loss. Patients were characterized as either insulin resistant (IR) or insulin sensitive (IS) before having their samples processed via longitudinal multi-omic technologies. This comparative study revealed a wealth of biomolecular changes associated with weight gain after using methods in machine learning, clustering, network analysis etc. Pathways of interest included those involved in lipid remodeling, acute inflammatory response and glucose metabolism. Some of these biomolecules returned to baseline levels as the patient returned to normal weight whilst some remained elevated. IR patients exhibited key differences in inflammatory response regulation in comparison to IS patients at all time points. These signatures suggest differential metabolism and inflammatory pathways between IR and IS patients. Biomolecular differences associated with weight gain and insulin resistance were identified on various levels: in gene expression, epigenetic change, transcriptional regulation and glycosylation. 
This study not only contributed new biology that could be of use in preventing or predicting obesity-mediated diseases, but also matured novel biomedical informatics technologies to produce and process data across many comprehensive omics levels.

Keywords: insulin resistance, multi-omics, next generation sequencing, proteogenomics, type II diabetes

Procedia PDF Downloads 432
19975 A Priority Based Imbalanced Time Minimization Assignment Problem: An Iterative Approach

Authors: Ekta Jain, Kalpana Dahiya, Vanita Verma

Abstract:

This paper discusses a priority-based imbalanced time minimization assignment problem dealing with the allocation of n jobs to m < n persons, in which the project is carried out in two stages, viz. Stage-I and Stage-II. Stage-I consists of n1 (< m) primary jobs and Stage-II consists of the remaining (n-n1) secondary jobs, which are commenced only after the primary jobs are finished. Each job is to be allocated to exactly one person, and each person has to do at least one job. It is assumed that the nature of the Stage-I jobs is such that one person can do exactly one primary job, whereas a person can do more than one secondary job in Stage-II. In a particular stage, all persons start doing the jobs simultaneously, but if a person is doing more than one job, he does them one after the other in any order. The aim of the proposed study is to find the feasible assignment which minimizes the total time for the two-stage execution of the project. For this, an iterative algorithm is proposed, which at each iteration solves a constrained imbalanced time minimization assignment problem to generate a pair of Stage-I and Stage-II times. An algorithm for solving this constrained problem is developed in the current paper. Later, an alternate-combinations-based method to solve the priority-based imbalanced problem is also discussed, and a comparative study is carried out. Numerical illustrations are provided in support of the theory.
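The two-stage objective can be made concrete with a tiny brute-force sketch (hypothetical data; the paper's iterative algorithm is far more efficient, and this sketch omits the constraint that every person does at least one job):

```python
from itertools import permutations, product

# Tiny illustrative instance (invented numbers, not from the paper):
# t1[i][j] = time for person j to do primary job i   (Stage-I, one job per person)
# t2[k][j] = time for person j to do secondary job k (Stage-II, jobs may share a person)
t1 = [[3, 5, 2],   # primary job 0
      [4, 2, 6]]   # primary job 1
t2 = [[2, 4, 3],   # secondary job 0
      [5, 1, 2]]   # secondary job 1
m = 3  # persons

def two_stage_time(t1, t2, m):
    """Brute-force the minimum total time: Stage-I makespan + Stage-II makespan.
    Stage-II starts only after all primary jobs finish; a person doing several
    secondary jobs does them one after the other."""
    best = float("inf")
    n1, n2 = len(t1), len(t2)
    for primary in permutations(range(m), n1):          # distinct persons for Stage-I
        stage1 = max(t1[i][primary[i]] for i in range(n1))
        for secondary in product(range(m), repeat=n2):  # any person per Stage-II job
            load = [0] * m
            for k, j in enumerate(secondary):
                load[j] += t2[k][j]                     # sequential jobs accumulate
            best = min(best, stage1 + max(load))
    return best

print(two_stage_time(t1, t2, m))  # 4
```

Here the optimum assigns the primary jobs to the fastest distinct persons (Stage-I makespan 2) and spreads the secondary jobs so the busiest person needs only 2 more time units.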

Keywords: assignment, imbalanced, priority, time minimization

Procedia PDF Downloads 238
19974 Naked Machismo: Uncovered Masculinity in an Israeli Home Design Campaign

Authors: Gilad Padva, Sigal Barak Brandes

Abstract:

This research centers on an unexpected Israeli advertising campaign for Elemento, a local furniture company, which eroticizes male nudity. The campaign comprises a series of printed ads depicting naked male models in effeminate positions, published in Haaretz, a small-scale yet highly prestigious daily newspaper typically read by urban middle-upper-class left-wing Israelis. Apparently, this campaign embodies an alternative masculinity that challenges the prevalent machismo in Israeli society and advertising. Although some of the ads focus on young men in effeminate positions, they never expose their genitals and anuses, and their bodies are never permeable. The 2010s Elemento male models seemingly contrast with conventional representations of manhood in contemporary mainstream advertising. They display a somewhat inactive, passive, and self-indulgent masculinity that involves 'conspicuous leisure'. In the process of commodity fetishism, the advertised furniture is emptied of the original meaning of its production and then filled with new meanings in ways that both mystify the product and turn it into a fetish object. Yet, our research critically reconsiders this sensational campaign as a sophisticated patriarchal parody that does not subvert but rather reconfirms and even fetishizes patriarchal premises; it parodies effeminacy rather than the prevalent (Israeli) machismo. Following Pierre Bourdieu's politics of cultural taste, our research reconsiders and criticizes the male models' domesticated masculinity in a fantasized, cosmopolitan, hedonistic habitus. Notwithstanding, we suggest that the Elemento campaign, despite its conformity, does question some Israeli and global axioms about gender roles, corporeal ideologies, idealized bodies, and domesticated phalluses and anuses. 
Although the naked truth is uncovered by this campaign, it does erect a vibrant discussion of contemporary masculinities and their exploitation in current mass consumption.

Keywords: male body, campaign, advertising, gender studies, men's studies, Israeli culture, masculinity, parody, effeminacy

Procedia PDF Downloads 216
19973 Analysis of Spatiotemporal Efficiency and Fairness of Railway Passenger Transport Network Based on Space Syntax: Taking Yangtze River Delta as an Example

Authors: Lin Dong, Fei Shi

Abstract:

Based on the railway network and the principles of space syntax, the study attempts to reconstruct the spatial relationships of the passenger network connections from space and time perspectives. Using travel time data for the main stations in the Yangtze River Delta urban agglomeration obtained from the Internet, topological graphs of the railway network under different time sections are constructed. With a comprehensive index composed of connection and integration, the accessibility and network operation efficiency of the railway network in different time periods are calculated, while the fairness of the network is analyzed with fairness indicators constructed from integration and location entropy, from the perspectives of horizontal and vertical fairness respectively. From the analysis of the efficiency and fairness of the railway passenger transport network, the study finds: (1) There is strong regularity in regional system accessibility change; (2) The problems of efficiency and fairness differ across time periods; (3) The improvement of efficiency leads to a decline of horizontal fairness to a certain extent, while from the perspective of vertical fairness, the supply-demand situation has changed smoothly with time; (4) The network connection efficiency of the Shanghai, Jiangsu, and Zhejiang regions is higher than that of western regions such as Anqing and Chizhou; (5) The marginalization of Nantong, Yancheng, Yangzhou, and Taizhou is obvious. The study explores the application of space syntax theory in regional traffic analysis, in order to provide a reference for the development of urban agglomeration transportation networks.
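The integration measure used above can be sketched for a toy station graph (hypothetical stations and links; the formula shown is the standard space-syntax inverse relative asymmetry, not the paper's exact comprehensive index):

```python
from collections import deque

# Hypothetical mini-network: stations as nodes, rail links as edges.
edges = {
    "Shanghai": ["Suzhou", "Hangzhou"],
    "Suzhou": ["Shanghai", "Nanjing"],
    "Nanjing": ["Suzhou"],
    "Hangzhou": ["Shanghai"],
}

def mean_depth(graph, start):
    """BFS depths from one node; mean depth over all other nodes."""
    depth = {start: 0}
    q = deque([start])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in depth:
                depth[v] = depth[u] + 1
                q.append(v)
    n = len(graph)
    return sum(depth.values()) / (n - 1)

def integration(graph, node):
    """Space-syntax style integration: inverse relative asymmetry,
    RA = 2 (MD - 1) / (n - 2). Higher values = better integrated."""
    n = len(graph)
    ra = 2 * (mean_depth(graph, node) - 1) / (n - 2)
    return 1 / ra

for s in edges:
    print(s, round(integration(edges, s), 2))
```

On this toy graph, the hub stations (Shanghai, Suzhou) score higher than the peripheral ones (Nanjing, Hangzhou), mirroring the core-periphery pattern the study reports.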

Keywords: spatial syntax, the Yangtze River Delta, railway passenger time, efficiency and fairness

Procedia PDF Downloads 139
19972 Synthesis of Modified Cellulose for the Capture of Uranyl Ions from Aqueous Solutions

Authors: Claudia Vergara, Oscar Valdes, Jaime Tapia, Leonardo Santos

Abstract:

Poly(amidoamine) dendrimers (PAMAM) are a class of materials introduced by D. Tomalia. Modification of the PAMAM dendrimer with various functional groups has attracted attention owing to interesting new properties and new applications in many fields, such as chemistry, physics, biology, and medicine. In the last few years, however, the use of dendrimers in environmental applications has increased due to pollution concerns. In this contribution, we report the synthesis of three new PAMAM derivatives modified with the amino acid asparagine and supported on cellulose: PG0-Asn (PAMAM-asparagine), PG0-Asn-Trt (with a trityl group), and PG0-Asn-Boc-Trt (with a tert-butyloxycarbonyl group). The functionalization of the generation-0 PAMAM dendrimer was carried out by an amidation reaction using an EDC/HOBt protocol. In a second step, the functionalized dendrimer was covalently attached to the cellulose surface and used to study the capture of uranyl ions from aqueous solution by fluorescence spectroscopy. The structure and purity of the desired products were confirmed by conventional techniques such as FT-IR, MALDI, elemental analysis, and ESI-MS. Batch experiments were carried out to determine the affinity of uranyl ions for the dendrimer in aqueous solution. First, the optimal conditions for uranyl capture were obtained: the optimum pH for removal was 6, the contact time was 4 hours, the initial uranyl concentration was 100 ppm, and the amount of adsorbent was 2.5 mg. PAMAM significantly increased the capture of uranyl ions with respect to cellulose as the starting substrate, reaching 94.8% capture (PG0), followed by 91.2% for PG0-Asn-Trt, 70.3% for PG0-Asn, and 24.2% for PG0-Asn-Boc-Trt. These results show that the PAMAM dendrimer is a good option for removing uranyl ions from aqueous solutions.
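The capture percentages follow the standard batch removal-efficiency formula; a minimal sketch (the 5.2 ppm equilibrium concentration below is a hypothetical value chosen only to reproduce the reported 94.8% for PG0, not a measured datum from the paper):

```python
# Removal efficiency from a batch adsorption experiment (standard formula):
# percent capture = (C0 - Ce) / C0 * 100,
# where C0 is the initial and Ce the equilibrium concentration.
def percent_capture(c0_ppm, ce_ppm):
    return (c0_ppm - ce_ppm) / c0_ppm * 100

# Example: initial 100 ppm uranyl, 5.2 ppm left in solution after 4 h contact.
print(round(percent_capture(100.0, 5.2), 1))  # 94.8
```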

Keywords: asparagine, cellulose, PAMAM dendrimer, uranyl ions

Procedia PDF Downloads 142
19971 Principal Creative Leadership for Teacher Learning and School Culture

Authors: Yashi Ye

Abstract:

Principals play vital roles in shaping school culture and promoting teachers' professional learning by exerting their leadership. In the changing times of the 21st century, the creative leadership of school leaders is increasingly important in cultivating teachers' professional learning communities and eventually improving student performance on every continent. This study examines under what conditions and how principal creative leadership contributes to teachers' professional learning and school culture. Data collected from 632 teachers in 30 primary and middle schools in the cities of Chengdu and Chongqing in mainland China are analyzed using structural equation modeling and bootstrapping tests. A moderated mediation model of principal creative leadership effects is used, with school culture as the mediator and power distance orientation as the moderator. The results indicate that principal creative leadership has significant direct and indirect effects on teachers' professional learning. A positive correlation between principal creative leadership, teachers' professional learning, and school culture is observed. Further model testing found that teachers' power distance orientation moderated the effect of principal creative leadership on school culture: when teachers perceived higher power distance in teacher-principal relations, the effects of principal creative leadership were stronger than for those who perceived low power distance. The results point to a "culture change" among the younger generation of teachers in China, and further implications for understanding the cultural context in the field of educational leadership are discussed.

Keywords: power distance orientation, principal creative leadership, school culture, teacher professional learning

Procedia PDF Downloads 145
19970 Enhancing the Stability of the Vietnamese Power System: From Theory to Practice

Authors: Edwin Lerch, Dirk Audring, Cuong Nguyen Mau, Duc Ninh Nguyen, The Cuong Nguyen, The Van Nguyen

Abstract:

The National Load Dispatch Centre of Electricity Vietnam (EVNNLDC) and Siemens PTI investigated the stability of the 500/220 kV transmission system of Vietnam. The general scope of the investigation is to improve the stability of the Vietnamese power system and to give the EVNNLDC staff the capability to decide how to deal with the stability challenges expected in the future, which are related to the very fast growth of the system. Rapid system growth leads to a very high demand for power transmission from North to South; this was investigated through stability studies of the power system interconnected with neighboring countries. These investigations were performed in close cooperation and coordination with the EVNNLDC project team. This important project includes data collection, measurement, model validation, and investigation of the relevant stability phenomena, as well as training of the EVNNLDC staff. Generally, the power system of Vietnam has good voltage and dynamic stability. The main problems are related to the longitudinal structure of the system, with most power generation, especially hydro power, in the North and Center and the load centers in the South of Vietnam. Faults on the transmission system from North to South risk the stability of the entire system, due to the high power transfer from North to South and the high loading of the 500 kV backbone. An additional problem is the weak connection to the Cambodian power system, which gives rise to an inter-area oscillation mode. Therefore, strengthening the power transfer capability with new 500 kV lines or an HVDC connection, and balancing power generation across the country, would solve many challenges. Other countermeasures, such as wide-area load shedding, PSS tuning, and correct SVC placement, will improve and stabilize the power system as well. The primary frequency reserve should also be increased.

Keywords: dynamic power transmission system studies, blackout prevention, power system interconnection, stability

Procedia PDF Downloads 368
19969 Text Mining of Veterinary Forums for Epidemiological Surveillance Supplementation

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Web scraping and text mining are popular computer science methods deployed by public health researchers to augment traditional epidemiological surveillance. However, within veterinary disease surveillance, such techniques are still in the early stages of development and have not yet been fully utilised. This study presents an exploration into the utility of incorporating internet-based data to better understand the smallholder farming communities within Scotland, using online text extraction and subsequent mining of this data. Web scraping of the livestock fora was conducted in conjunction with text mining of the data in search of common themes, words, and topics found within the text. Results from bi-grams and topic modelling uncover four main topics of interest within the data, pertaining to aspects of livestock husbandry: feeding, breeding, slaughter, and disposal. These topics were found in both the poultry and pig sub-forums. Topic modelling appears to be a useful method of unsupervised classification for this form of data, as it produced clusters relating to biosecurity and animal welfare. Internet data can be a very effective tool in aiding traditional veterinary surveillance methods, but human validation of the data remains crucial. This opens avenues of research via the incorporation of other dynamic social media data, namely Twitter and Facebook/Meta, in addition to time series analysis to highlight temporal patterns.
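The bi-gram step can be sketched in a few lines of plain Python (the posts below are invented placeholders, not the study's scraped forum data):

```python
from collections import Counter

# Hypothetical forum posts (illustrative, not the study's scraped data).
posts = [
    "advice on feeding pigs in winter",
    "feeding pigs kitchen scraps legally",
    "best breeding stock for smallholders",
]

def top_bigrams(texts, k=3):
    """Count adjacent word pairs across all texts (simple n-gram mining)."""
    counts = Counter()
    for t in texts:
        words = t.lower().split()
        counts.update(zip(words, words[1:]))  # consecutive word pairs
    return counts.most_common(k)

print(top_bigrams(posts, 2))
```

Recurring pairs like ("feeding", "pigs") surface husbandry themes directly; topic modelling then groups such terms into broader clusters.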

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, smallholding, social media, web scraping, sentiment analysis, geolocation, text mining, NLP

Procedia PDF Downloads 105
19968 An Analysis of the Philippines' Legal Transition from Open Dumpsites to Solid Waste Management Facilities

Authors: Mary Elenor Adagio, John Roben Ambas, Ramilyn Bertolano, Julie Ann Garcia

Abstract:

Ecological solid waste management has long been a concern in both national and international spheres. The exponential growth of waste generation is not properly matched by cost-effective waste management systems. As a result, governments and their communities inevitably resort to the old practice of opening dumpsites to serve as giant garbage bins. However, due to the environmental and public health problems these unmanaged dumpsites cause, countries like the Philippines have mandated the closure of these dumpsites and their conversion into, or replacement by, sanitary landfills. This study aims to determine how the transition from open dumpsites to solid waste management facilities improves the implementation of the government's Solid Waste Management Framework pursuant to Republic Act 9003. To test the hypothesis that the mandatory closure of dumpsites improves waste management in local government units, a review of related literature covering analysis reports, news, and case studies was conducted. The results suggest that the transition of dumpsites to sanitary landfills would not only prevent environmental risks caused by pollution but also reduce public health problems. Although this transition can be effective, the data also show that, lacking funding and resources, many local government units still find it difficult to deliver their solid waste management plans and to adapt to the transition to sanitary landfills.

Keywords: solid waste management, environmental law, solid waste management facilities, open dumpsites

Procedia PDF Downloads 166
19967 Scheduling Residential Daily Energy Consumption Using Bi-criteria Optimization Methods

Authors: Li-hsing Shih, Tzu-hsun Yen

Abstract:

Because of the long-term commitment to net zero carbon emissions, utility companies are including more renewable energy supply, which generates electricity subject to time and weather restrictions. This leads to time-of-use electricity pricing that reflects the actual cost of energy supply. From an end-user point of view, better residential energy management is needed to incorporate time-of-use prices and assist end users in scheduling their daily use of electricity. This study uses bi-criteria optimization methods to schedule daily energy consumption by minimizing the electricity cost and maximizing the comfort of end users. Unlike most previous research, this study schedules users' activities rather than household appliances to obtain better measures of users' comfort/satisfaction. The relation between each activity and the use of different appliances can be defined by users. The comfort level is highest when the time and duration of an activity completely meet the user's expectation, and it decreases when they do not. A questionnaire survey was conducted to collect data for establishing regression models that describe users' comfort levels when the execution time and duration of activities differ from user expectations. Six regression models representing the comfort levels for six types of activities were established from the survey responses. A computer program was developed to evaluate the electricity cost and the comfort level of each feasible schedule and then find the non-dominated schedules. The epsilon-constraint method is used to select the optimal schedule from the non-dominated schedules. A hypothetical case is presented to demonstrate the effectiveness of the proposed approach and the computer program. Using the program, users can obtain the optimal schedule of daily energy consumption by inputting the intended time and duration of activities and the given time-of-use electricity prices.
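The bi-criteria idea described above can be sketched in a few lines: enumerate candidate start times, score each by cost and comfort, keep the non-dominated schedules, then apply epsilon-constraint selection. The prices, the linear comfort penalty, and the epsilon threshold below are all invented for illustration, not taken from the study:

```python
# Hypothetical time-of-use prices per start hour (currency per kWh).
price = {17: 0.30, 18: 0.30, 19: 0.12, 20: 0.12, 21: 0.12}
LOAD_KWH = 2.0          # assumed energy used by the activity's appliances
PREFERRED_START = 18    # assumed user-preferred start hour

def cost(start):
    return price[start] * LOAD_KWH

def comfort(start):
    # Comfort drops linearly away from the preferred hour; the study's
    # surveyed regression models play this role in the real program.
    return max(0.0, 1.0 - 0.2 * abs(start - PREFERRED_START))

candidates = [(cost(s), comfort(s), s) for s in price]

# Non-dominated: no other schedule is at least as cheap AND as comfortable.
pareto = [c for c in candidates
          if not any(o[0] <= c[0] and o[1] >= c[1] and o != c
                     for o in candidates)]

# Epsilon-constraint: maximise comfort subject to cost <= epsilon.
EPSILON = 0.5
best = max((c for c in pareto if c[0] <= EPSILON), key=lambda c: c[1])
```

With these numbers the Pareto set keeps the cheapest comfortable slot and the most comfortable slot, and the budget constraint picks between them.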

Keywords: bi-criteria optimization, energy consumption, time-of-use price, scheduling

Procedia PDF Downloads 65
19966 Comparison of Due Date Assignment Rules in a Dynamic Job Shop

Authors: Mumtaz Ipek, Burak Erkayman

Abstract:

The due date is an input for scheduling problems; at the same time, it can serve as a decision variable in real-time scheduling applications. Correct determination of due dates increases shop-floor performance and the number of jobs completed on time. This subject has been discussed widely in the literature, and rules for due date determination have been developed through analytical study. When a job arrives at the shop floor, a due date is assigned for its delivery. Various due date determination methods are used in the literature. In this study, six different due date assignment methods are implemented for a hypothetical dynamic job shop, and their performances are compared.
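For context, some classic due date assignment rules from the scheduling literature can be sketched as below. The abstract does not name which six rules it compares, and the coefficients here are arbitrary illustrative values:

```python
def twk(arrival, proc_times, k=3.0):
    """Total-work-content rule: due date = arrival + k * total processing time."""
    return arrival + k * sum(proc_times)

def slk(arrival, proc_times, slack=10.0):
    """Slack rule: due date = arrival + total processing time + fixed slack."""
    return arrival + sum(proc_times) + slack

def nop(arrival, proc_times, k=5.0):
    """Number-of-operations rule: due date = arrival + k * number of operations."""
    return arrival + k * len(proc_times)

def con(arrival, proc_times, allowance=30.0):
    """Constant-allowance rule: every job receives the same flow allowance."""
    return arrival + allowance

# A job arriving at time 0 with three operations of 4, 6, and 2 time units:
job = (0.0, [4.0, 6.0, 2.0])
due_dates = {rule.__name__: rule(*job) for rule in (twk, slk, nop, con)}
```

In a dynamic job shop study, each rule would be evaluated by tardiness-related performance measures over a stream of arriving jobs.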

Keywords: scheduling, dynamic job shop, due date assignment, management engineering

Procedia PDF Downloads 557
19965 The Analysis of Defects Prediction in Injection Molding

Authors: Mehdi Moayyedian, Kazem Abhary, Romeo Marian

Abstract:

This paper presents an evaluation of a plastic defect in injection molding before it occurs in the process, known as the short shot defect. The aim of this paper is the evaluation of the different parameters that affect the possibility of the short shot defect. The analysis of short shot possibility is conducted via SolidWorks Plastics and the Taguchi method to determine the most significant parameters. The Finite Element Method (FEM) is employed to analyze two circular flat polypropylene plates of 1 mm thickness. Filling time, part cooling time, pressure holding time, and melt temperature are chosen as process parameters, and gate type as the geometric parameter. A methodology is presented herein to predict the possibility of short-shot occurrence. The analysis determined that melt temperature is the most influential parameter affecting the possibility of the short shot defect, with a contribution of 74.25%, and filling time with a contribution of 22%, followed by gate type with a contribution of 3.69%. It was also determined that the optimum levels of each parameter leading to a reduction in the possibility of the short shot are gate type at level 1, filling time at level 3, and melt temperature at level 3. Finally, the most significant parameters affecting the possibility of the short shot were determined to be melt temperature, filling time, and gate type.
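Percentage contributions such as 74.25% typically come from a Taguchi/ANOVA-style sum-of-squares calculation over factor level means. A minimal sketch follows; the level means and array size are invented, not the paper's measurements:

```python
def factor_ss(level_means, runs_per_level, grand_mean):
    """Between-level sum of squares for one factor in a Taguchi analysis."""
    return runs_per_level * sum((m - grand_mean) ** 2 for m in level_means)

# Hypothetical S/N-ratio level means from an L9 array (3 runs per level).
grand = 10.0
ss = {
    "melt temperature": factor_ss([8.0, 10.0, 12.0], 3, grand),
    "filling time": factor_ss([9.5, 10.0, 10.5], 3, grand),
}

# Percent contribution of each factor to the total variation.
total = sum(ss.values())
contribution = {f: 100.0 * v / total for f, v in ss.items()}
```

A full analysis would include all factors plus an error term, but the contribution percentages are computed the same way.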

Keywords: injection molding, plastic defects, short shot, Taguchi method

Procedia PDF Downloads 220
19964 Lithium Ion Supported on TiO2 Mixed Metal Oxides as a Heterogeneous Catalyst for Biodiesel Production from Canola Oil

Authors: Mariam Alsharifi, Hussein Znad, Ming Ang

Abstract:

Considering environmental issues and the shortage of conventional fossil fuel sources, biodiesel has emerged as a promising way to shift away from fossil-based fuel toward sustainable and renewable energy. It is synthesized by transesterification of vegetable oils or animal fats with an alcohol (methanol or ethanol) in the presence of a catalyst. This study focuses on synthesizing a highly efficient Li/TiO2 heterogeneous catalyst for biodiesel production from canola oil. In this work, lithium was immobilized onto TiO2 by a simple impregnation method. The catalyst was evaluated by a transesterification reaction in a batch reactor under moderate reaction conditions. To study the effect of Li concentration, a series of LiNO3 concentrations (20, 30, 40 wt.%) at different calcination temperatures (450, 600, 750 ºC) were evaluated. The Li/TiO2 catalysts were characterized by several spectroscopic and analytical techniques, including XRD, FT-IR, BET, TG-DSC, and FESEM. The optimum values of impregnated lithium nitrate on TiO2 and calcination temperature are 30 wt.% and 600 ºC, respectively, giving a high conversion of 98%. The XRD study revealed that the insertion of Li improved the catalyst efficiency without any alteration of the TiO2 structure. The best performance of the catalyst was achieved using a methanol-to-oil ratio of 24:1 and 5 wt.% catalyst loading at a 65 ºC reaction temperature for 3 hours of reaction time. Moreover, the experimental kinetic data were compatible with the pseudo-first-order model, and the activation energy was 39.366 kJ/mol. The synthesized Li/TiO2 catalyst was also applied to transesterify used cooking oil and exhibited a 91.73% conversion. The prepared catalyst has shown high catalytic activity to produce biodiesel from fresh and used oil under mild reaction conditions.
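The pseudo-first-order kinetics and the Arrhenius-type estimate of activation energy mentioned above can be sketched as follows; the rate constants, times, and temperatures are hypothetical, chosen only to illustrate the calculation, not the paper's data:

```python
import math

def rate_constant(conversion, t_minutes):
    """Pseudo-first-order transesterification: -ln(1 - X) = k * t."""
    return -math.log(1.0 - conversion) / t_minutes

def activation_energy(k1, T1, k2, T2):
    """Two-point Arrhenius estimate: ln(k2/k1) = (Ea/R) * (1/T1 - 1/T2).

    Temperatures in kelvin; returns Ea in J/mol.
    """
    R = 8.314  # gas constant, J/(mol*K)
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical rate constants at two reaction temperatures (K).
k_328, k_338 = 0.010, 0.015
Ea = activation_energy(k_328, 328.0, k_338, 338.0)  # roughly tens of kJ/mol
```

In practice k would be fitted by regressing -ln(1 - X) against time over several samples, and Ea from an Arrhenius plot over more than two temperatures.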

Keywords: biodiesel, canola oil, environment, heterogeneous catalyst, impregnation method, renewable energy, transesterification

Procedia PDF Downloads 180
19963 Titanium-Aluminium Oxide Coating on Aluminized Steel

Authors: Fuyan Sun, Guang Wang, Xueyuan Nie

Abstract:

In this study, a plasma electrolytic oxidation (PEO) process was used to form a titanium-aluminium oxide coating on aluminized steel. The present work mainly studied the effects of PEO treatment time on the properties of the coating. A potentiodynamic polarization corrosion test was employed to investigate the corrosion resistance of the coating. The friction coefficient and wear resistance of the coating were studied using a pin-on-disc test. The thermal transfer behaviours of uncoated and PEO-coated aluminized steels were also studied. Treatment time significantly influenced the properties of the titanium-aluminium oxide coating: samples with a longer treatment time performed better in corrosion and wear protection. This paper demonstrates that different treatment times can alter the surface behaviour of the coating material.

Keywords: titanium-aluminum oxide, plasma electrolytic oxidation, corrosion, wear, thermal property

Procedia PDF Downloads 361
19962 A Comparative Study of GTC and PSP Algorithms for Mining Sequential Patterns Embedded in Database with Time Constraints

Authors: Safa Adi

Abstract:

This paper considers the problem of mining sequential patterns embedded in a database while handling the time constraints as defined in the GSP algorithm (a level-wise algorithm). We compare two previous approaches, GTC and PSP, which retain the general principles of GSP. Furthermore, this paper discusses the PG-hybrid algorithm, which combines PSP and GTC. The results show that PSP and GTC are more efficient than GSP; in addition, the GTC algorithm performs better than PSP. The PG-hybrid algorithm uses the PSP algorithm for the first two passes over the database and the GTC approach for the following scans. Experiments show that the hybrid approach is very efficient for short, frequent sequences.
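The GSP-style time constraints that GTC and PSP must handle can be illustrated with a minimal gap check between consecutive elements of a candidate occurrence; the timestamps and bounds below are invented:

```python
def satisfies_gaps(timestamps, min_gap, max_gap):
    """Check GSP-style min-gap/max-gap constraints between consecutive
    elements of a candidate sequence occurrence.

    timestamps: ascending transaction times of the matched elements.
    """
    return all(min_gap <= later - earlier <= max_gap
               for earlier, later in zip(timestamps, timestamps[1:]))

# A customer's matched transactions at days 1, 3, and 9, with
# min_gap=1 and max_gap=5: the 3 -> 9 gap of 6 days violates max_gap.
ok = satisfies_gaps([1, 3, 9], 1, 5)
```

The algorithms compared in the paper differ mainly in where this kind of check is pushed during candidate generation and database scanning.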

Keywords: database, GTC algorithm, PSP algorithm, sequential patterns, time constraints

Procedia PDF Downloads 393
19961 Processes and Application of Casting Simulation and Its Software’s

Authors: Surinder Pal, Ajay Gupta, Johny Khajuria

Abstract:

Casting simulation helps visualize mold filling and casting solidification; predict related defects like cold shuts, shrinkage porosity, and hard spots; and optimize the casting design to achieve the desired quality with high yield. Flow and solidification of molten metals are, however, very complex phenomena that are difficult to simulate correctly by conventional computational techniques, especially when the part geometry is intricate and the required inputs (like thermo-physical properties and heat transfer coefficients) are not available. Simulation software is based on the process of modeling a real phenomenon with a set of mathematical formulas. It is, essentially, a program that allows the user to observe an operation through simulation without actually performing that operation. Simulation software is used widely to design equipment so that the final product will be as close to design specifications as possible without expensive in-process modification. Simulation software with real-time response is often used in gaming, but it also has important industrial applications. When the penalty for improper operation is costly, as for airplane pilots, nuclear power plant operators, or chemical plant operators, a mockup of the actual control panel is connected to a real-time simulation of the physical response, giving valuable training experience without fear of a disastrous outcome. Each casting simulation package has its own strengths; Magma cast, for example, is best suited to crack simulation. The latest-generation software AutoCAST, developed at IIT Bombay, provides a host of functions to support method engineers, including part thickness visualization, core design, multi-cavity mold design with common gating and feeding, application of various feed aids (feeder sleeves, chills, padding, etc.), simulation of mold filling and casting solidification, automatic optimization of feeders and gating driven by the desired quality level, and what-if cost analysis. IIT Bombay has developed a set of applications for the foundry industry to improve casting yield and quality. Casting simulation is a fast and efficient advanced tool, the result of more than 20 years of collaboration with major industrial partners and academic institutions around the world. In this paper, the process of casting simulation is studied.

Keywords: casting simulation software, simulation techniques, casting simulation, processes

Procedia PDF Downloads 478
19960 The Processing of Context-Dependent and Context-Independent Scalar Implicatures

Authors: Liu Jia’nan

Abstract:

The default accounts hold that there exists a kind of scalar implicature that can be processed without context and enjoys a psychological privilege over scalar implicatures that depend on context. In contrast, Relevance Theorists regard context as a must, because all scalar implicatures have to meet the need for relevance in discourse. However, Katsos' experimental results showed that although adults quantitatively rejected under-informative utterances with lexical scales (context-independent) and ad hoc scales (context-dependent) at almost the same rate, they still regarded violations of utterances with lexical scales as much more severe than those with ad hoc scales. Neither the default account nor Relevance Theory can fully explain this result. Thus, two questions arise from it: (1) Is it possible that this strange discrepancy is due to factors other than the generation of scalar implicature? (2) Are ad hoc scales truly formed under the possible influence of mental context? Do participants generate scalar implicatures with ad hoc scales, or do they merely compare semantic differences among target objects in the under-informative utterance? Our Experiment 1 addresses question (1) by replicating Katsos' Experiment 1. Test materials will be shown as pictures in PowerPoint, and each procedure will be conducted under the guidance of a tester in a quiet room. Our Experiment 2 is intended to answer question (2). The pictorial test material will be transformed into literal words in DMDX, and the target sentence will be shown word by word to participants in the soundproof room of our lab. Reading times of the target parts, i.e., the words carrying scalar implicatures, will be recorded. We presume that in the lexical-scale group, a standardized pragmatic mental context will help generate the scalar implicature once the scalar word occurs, leading participants to expect the upcoming words to be informative. Thus, if the new input after the scalar word is under-informative, more time will be spent on the extra semantic processing. In the ad hoc-scale group, however, the scalar implicature can hardly be generated without the support of a fixed mental context for the scale. Thus, whether the new input is informative or not will not matter, and the reading times of the target parts will be the same in informative and under-informative utterances. The mind may be a dynamic system in which many factors co-occur. If Katsos' experimental result is reliable, it may shed light on the interplay of default accounts and contextual factors in scalar implicature processing. Based on our experiments, we may be able to assume that no single dominant processing paradigm is plausible. Furthermore, in the processing of scalar implicature, the semantic interpretation and the pragmatic interpretation may be made in a dynamic interplay in the mind. For the lexical scale, the pragmatic reading may prevail over the semantic reading because of its greater exposure in daily language use, which may also lead the possible default or standardized paradigm to override the role of context. However, the objects in an ad hoc scale are not usually treated as scale members in mental context, and thus lexical-semantic associations of the objects may prevent their pragmatic reading from generating the scalar implicature. Only when sufficient contextual factors are highlighted can the pragmatic reading gain privilege and generate the scalar implicature.

Keywords: scalar implicature, ad hoc scale, dynamic interplay, default account, Mandarin Chinese processing

Procedia PDF Downloads 328
19959 Two-Stage Anaerobic Digester for Biogas Production from Sewage Sludge: A Case Study in One of Kuwait’s Wastewater Treatment Plant

Authors: Abdullah Almatouq, Abdulla Abusam, Hussain Hussain, Mishari Khajah, Hussain Abdullah, Rashed Al-Yaseen, Mariam Al-Jumaa, Farah Al-Ajeel, Mohammad Aljassam

Abstract:

Due to the high demand for energy from unsustainable resources in Kuwait, the Kuwaiti government has recently focused on sustainable energy resources such as solar and wind energy. In addition, sludge, which is generated as a by-product of the physical, chemical, and biological processes of wastewater treatment, can be used as a substrate to generate energy through anaerobic digestion. Kuwait's wastewater treatment plants produce more than 1.7 million m3 of sludge per year, and this volume accumulates in the treatment plants without any treatment. Therefore, a pilot-scale (3 m3) two-stage anaerobic digester was constructed in one of the largest treatment plants in Kuwait. The reactor was operated in batch mode, and the hydraulic retention time varied between 14 and 27 days. The main aim of this study is to evaluate the technical feasibility of a two-stage anaerobic digester for sludge treatability and energy generation in Kuwait. The anaerobic digester achieved a total biogas production of 37 m3, and the highest daily biogas production was 0.4 m3/day. The methane content ranged between 50% and 66%, and the other gases were as follows: 20% CO2, 13% H2S, and 1% O2. The generated biogas was used on-site for cooking and lighting. In some batches, a low C/N ratio was noticed, which kept the CH4 concentration between 50% and 55%. In conclusion, the anaerobic digester is an environmentally friendly technology that can be applied in Kuwait, and the obtained results support scaling up the process in all the treatment plants.

Keywords: wastewater, methane, biogas production potential, anaerobic digestion

Procedia PDF Downloads 121
19958 Ultra-Sensitive and Real Time Detection of ZnO NW Using QCM

Authors: Juneseok You, Kuewhan Jang, Chanho Park, Jaeyeong Choi, Hyunjun Park, Sehyun Shin, Changsoo Han, Sungsoo Na

Abstract:

Nanomaterials can have toxic effects on human beings and ecological systems. Sensors have been developed to detect toxic materials, and standards for toxic materials have been established. Zinc oxide nanowire (ZnO NW) is a known toxic material: when it ionizes in the cell body, cell components are overexposed to Zn ions, which causes critical damage or death. In this paper, we detected ZnO NW in water using a QCM (quartz crystal microbalance) and ssDNA (single-stranded DNA). We achieved a response time of 30 minutes for real-time detection and a limit of detection (LOD) of 100 pg/mL.
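Mass binding on a QCM is commonly interpreted through the Sauerbrey relation, which this small sketch applies. The sensitivity constant of 17.7 ng/(cm^2*Hz) assumes a 5 MHz AT-cut quartz crystal and is a textbook value, not a figure from this paper:

```python
def sauerbrey_mass(delta_f_hz, c_ng_per_cm2_hz=17.7):
    """Sauerbrey relation for a QCM: added areal mass from the
    resonance-frequency shift (valid for thin, rigid films).

    C ~ 17.7 ng/(cm^2*Hz) for a 5 MHz AT-cut crystal; a negative
    frequency shift corresponds to mass loading.
    """
    return -c_ng_per_cm2_hz * delta_f_hz

# A -10 Hz shift corresponds to roughly 177 ng/cm^2 of bound mass.
mass = sauerbrey_mass(-10.0)
```

Real sensing experiments like the one above calibrate against known analyte loadings rather than relying on the Sauerbrey constant alone, especially in liquid.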

Keywords: zinc oxide nanowire, QCM, ssDNA, toxic material, biosensor

Procedia PDF Downloads 435
19957 Effect of Interaction between Colchicine Concentrations and Treatment Time Duration on the Percentage of Chromosome Polyploidy of Crepis capillaris (with and without 2B Chromosome) in vitro Culture

Authors: Payman A. A. Zibari, Mosleh M. S. Duhoky

Abstract:

These experiments were conducted at the Tissue Culture Laboratory, Faculty of Agriculture, University of Duhok, during the period from January 2011 to May 2013. The objective of this study was to examine the effects of the interaction between colchicine concentration and treatment time duration on chromosome polyploidy of Crepis capillaris (with and without the 2B chromosome) during fifteen passages until the regeneration of plants from the callus. The data showed that a high percentage of chromosome polyploidy can be obtained with a high colchicine concentration and a long treatment duration.

Keywords: polyploidy, Crepis capillaris, colchicine, B chromosome

Procedia PDF Downloads 196
19956 Microwave Imaging by Application of Information Theory Criteria in MUSIC Algorithm

Authors: Majid Pourahmadi

Abstract:

The performance of the time-reversal MUSIC algorithm degrades dramatically in the presence of strong noise and multiple scattering (i.e., when scatterers are close to each other), due to errors in determining the number of scatterers. The present paper provides a new approach to alleviate this problem using an information-theoretic criterion referred to as the minimum description length (MDL). The merits of the novel approach are confirmed by numerical examples. The results indicate that time-reversal MUSIC yields accurate estimates of the target locations despite considerable noise and multiple scattering in the received signals.
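Determining the number of scatterers with MDL is usually done with the Wax-Kailath criterion applied to the eigenvalues of the sample covariance matrix; a minimal sketch follows, with invented eigenvalues rather than data from the paper:

```python
import math

def mdl_num_sources(eigvals, n_snapshots):
    """Wax-Kailath MDL estimate of the number of sources (scatterers)
    from sample-covariance eigenvalues."""
    lam = sorted(eigvals, reverse=True)
    p = len(lam)
    best_k, best_score = 0, float("inf")
    for k in range(p):  # candidate number of sources
        tail = lam[k:]                                   # assumed noise eigenvalues
        m = len(tail)
        geo = math.exp(sum(math.log(x) for x in tail) / m)  # geometric mean
        arith = sum(tail) / m                               # arithmetic mean
        # Log-likelihood term plus the MDL complexity penalty.
        score = (-n_snapshots * m * math.log(geo / arith)
                 + 0.5 * k * (2 * p - k) * math.log(n_snapshots))
        if score < best_score:
            best_score, best_k = score, k
    return best_k

# Two strong eigenvalues over a flat noise floor -> two scatterers.
k_hat = mdl_num_sources([10.0, 8.0, 1.0, 1.0, 1.0, 1.0], 200)
```

The estimated count would then set the signal-subspace dimension used by time-reversal MUSIC.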

Keywords: microwave imaging, time reversal, MUSIC algorithm, minimum description length (MDL)

Procedia PDF Downloads 343
19955 Starchy Wastewater as Raw Material for Biohydrogen Production by Dark Fermentation: A Review

Authors: Tami A. Ulhiza, Noor I. M. Puad, Azlin S. Azmi, Mohd. I. A. Malek

Abstract:

The high chemical oxygen demand (COD) of starchy waste can be harmful to the environment. In common practice, starch processing wastewater is discharged to the river without proper treatment. However, starchy waste still contains complex sugars and organic acids, and with the right pretreatment method, these complex sugars can be hydrolyzed into more readily digestible sugars that can be converted into more valuable products. At the same time, the global demand for energy keeps growing, and the continued use of fossil fuel as the main source of energy can lead to energy scarcity. Hydrogen is a renewable form of energy that can serve as an alternative in the future. Moreover, hydrogen is clean and carries the highest energy content among fuels. Biohydrogen produced from waste has significant advantages over chemical production methods. One of the major problems in biohydrogen production is the raw material cost. Carbohydrate-rich starchy wastes, such as tapioca, maize, wheat, potato, and sago wastes, are promising candidates for use as substrates in producing biohydrogen. The utilization of these wastes for biohydrogen production can provide cheap energy generation with simultaneous waste treatment. Therefore, this paper reviews the various sources of starchy waste that have been widely used to synthesize biohydrogen. The scope includes the source of the waste, the hydrogen yield performance, the pretreatment method, and the type of culture that is suitable for starchy waste.

Keywords: biohydrogen, dark fermentation, renewable energy, starchy waste

Procedia PDF Downloads 226
19954 Power Generation and Treatment potential of Microbial Fuel Cell (MFC) from Landfill Leachate

Authors: Beenish Saba, Ann D. Christy

Abstract:

Modern-day municipal solid waste landfills are operated and controlled to protect the environment from contaminants during the biological stabilization and degradation of solid waste. They are equipped with liners, caps, and gas and leachate collection systems. Landfill gas is passively or actively collected and can be used as biofuel after necessary purification, but leachate treatment is the more difficult challenge. Leachate, if not recirculated in a bioreactor landfill system, is typically transported to a local wastewater treatment plant. These plants are designed for sewage treatment and often charge additional fees for higher-strength wastewaters such as leachate, if they accept them at all. Different biological, chemical, physical, and integrated techniques can be used to treat leachate. Treating leachate with simultaneous power production using microbial fuel cell (MFC) technology is a recent innovation, with reported applications still in their earliest phase. High chemical oxygen demand (COD), ionic strength, and salt concentration are some of the characteristics that make leachate an excellent substrate for power production in MFCs. Electrode materials, microbial communities, carbon co-substrates, and temperature conditions are some of the factors that can be optimized to achieve simultaneous power production and treatment. The advantage of the MFC is its dual functionality, but low power production and high costs are the hurdles to its commercialization and more widespread application. Studies so far suggest that landfill leachate MFCs can produce 1.8 mW/m2 with 79% COD removal, while amendment with food leachate or domestic wastewater can increase performance up to 18 W/m3 with 90% COD removal. The coulombic efficiency is reported to vary between 2% and 60%. However, efforts toward biofilm optimization, studies of efficient electron transport systems, and the use of genetic tools can increase the efficiency of the MFC and will determine its future potential in treating landfill leachate.
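The coulombic efficiency quoted above is conventionally computed as the charge actually harvested relative to the charge equivalent of the COD removed. A simplified constant-current sketch, with invented operating values rather than figures from the cited studies, is:

```python
F = 96485.0  # Faraday constant, C per mol of electrons

def coulombic_efficiency(current_a, hours, volume_l, delta_cod_g_per_l):
    """CE = 8 * Q / (F * V * dCOD) for a constant current I over time t.

    The factor 8 is grams of O2 per mol of electrons (32 g/mol / 4 e-),
    converting the COD removed into its charge equivalent.
    """
    charge = current_a * hours * 3600.0  # coulombs harvested, Q = I * t
    return 8.0 * charge / (F * volume_l * delta_cod_g_per_l)

# Hypothetical batch: 5 mA for 24 h, 0.1 L anode volume, 1 g/L COD removed.
ce = coulombic_efficiency(0.005, 24.0, 0.1, 1.0)  # fraction, not percent
```

With a varying current, the charge term becomes the integral of I over time, typically approximated from logged current readings.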

Keywords: microbial fuel cell, landfill leachate, power generation, MFC

Procedia PDF Downloads 321