Search results for: dielectric methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15162

14022 Preparation and Characterization of Nanometric Ni-Zn Ferrite via Different Methods

Authors: Ebtesam. E. Ateia, L. M. Salah, A. H. El-Bassuony

Abstract:

The aim of the present study was to develop a nanosized material with enhanced structural properties suitable for many applications. Nanostructured ferrite of composition Ni0.5 Zn0.5 Cr0.1 Fe1.9 O4 was prepared by sol-gel, co-precipitation, citrate-gel, flash and oxalate precursor methods. Structural and microstructural analyses of the investigated samples were carried out. It was observed that the lattice parameter of the cubic spinel was constant and that the positions of both the tetrahedral and octahedral bands were fixed. The values of the lattice parameter played a significant role in determining the stoichiometric cation distribution of the composition. The average crystallite sizes of the investigated samples ranged from 16.4 to 69 nm. A comparison of the average crystallite sizes of the investigated samples indicated that the co-precipitation method was the most effective in producing samples with small crystallite size.

Keywords: chemical preparation, ferrite, grain size, nanocomposites, sol-gel

Procedia PDF Downloads 320
14021 Cyclic Etching Process Using Inductively Coupled Plasma for Polycrystalline Diamond on AlGaN/GaN Heterostructure

Authors: Haolun Sun, Ping Wang, Mei Wu, Meng Zhang, Bin Hou, Ling Yang, Xiaohua Ma, Yue Hao

Abstract:

Gallium nitride (GaN) is an attractive material for next-generation power devices, but the performance of GaN-based high electron mobility transistors (HEMTs) is often limited by the self-heating effect. In response to this problem, integrating devices with polycrystalline diamond (PCD) has been demonstrated to be an efficient way to alleviate the self-heating of GaN-based HEMTs. Among the heat-spreading schemes, capping the epitaxial layer with PCD before the HEMT process is one of the most effective. The mainstream method of fabricating PCD-capped HEMTs is to deposit the diamond heat-spreading layer on the AlGaN surface, which is covered by a thin nucleation dielectric/passivation layer. To achieve patterned etching of the diamond heat spreader and device preparation, we selected SiN, deposited by plasma-enhanced chemical vapor deposition (PECVD), as the hard mask for diamond etching. The conventional diamond etching method first uses F-based etching to remove the SiN from the window region, followed by O₂/Ar plasma etching of the diamond. However, scanning electron microscopy (SEM) and focused ion beam (FIB) microscopy show numerous diamond pillars on the etched diamond surface. We found that these were caused by the high roughness of the diamond surface and the overlap between diamond grains, which made the etching of the SiN hard mask insufficient and left micro-masks on the diamond surface. A cyclic etching method was therefore proposed to solve the problem of the residual SiN left behind by the F-based etching. In the first step, F-based etching removes the SiN hard mask in the specified region; then O₂/Ar plasma etches the diamond in the corresponding region. These two etching steps constitute one cycle.
After the first cycle, further cycles are used to clear the pillars: F-based etching removes the residual SiN, and O₂/Ar plasma then etches the diamond. Whether another cycle is needed depends on whether SiN micro-masks remain. Using this method, we eventually achieved self-terminated etching of the diamond and a smooth post-etch surface. These results demonstrate that the cyclic etching method can be successfully applied to the integrated preparation of polycrystalline diamond thin films and GaN HEMTs.

Keywords: AlGaN/GaN heterojunction, O₂/Ar plasma, cyclic etching, polycrystalline diamond

Procedia PDF Downloads 101
14020 Finite-Sum Optimization: Adaptivity to Smoothness and Loopless Variance Reduction

Authors: Bastien Batardière, Joon Kwon

Abstract:

For finite-sum optimization, variance-reduced (VR) gradient methods compute at each iteration the gradient of a single function (or of a mini-batch), yet achieve faster convergence than SGD thanks to a carefully crafted lower-variance stochastic gradient estimator that reuses past gradients. Another important line of research of the past decade in continuous optimization is adaptive algorithms such as AdaGrad, which dynamically adjust the (possibly coordinate-wise) learning rate to past gradients and thereby adapt to the geometry of the objective function. Variants such as RMSprop and Adam demonstrate outstanding practical performance that has contributed to the success of deep learning. In this work, we present AdaLVR, which combines the AdaGrad algorithm with loopless variance-reduced gradient estimators such as SAGA or L-SVRG and benefits from a straightforward construction and a streamlined analysis. We show that AdaLVR inherits both the good convergence properties of VR methods and the adaptive nature of AdaGrad: in the case of L-smooth convex functions, we establish a gradient complexity of O(n + (L + √(nL))/ε) without prior knowledge of L. Numerical experiments demonstrate the superiority of AdaLVR over state-of-the-art methods. Moreover, we empirically show that RMSprop and Adam combined with variance-reduced gradient estimators achieve even faster convergence.
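The loopless construction lends itself to a compact sketch. The following is not the authors' AdaLVR code, only a minimal, assumed combination of an L-SVRG gradient estimator with AdaGrad coordinate-wise step sizes on a toy least-squares problem (all constants are illustrative):

```python
import numpy as np

# Toy least-squares problem: f(x) = (1/2n) * sum_i (a_i . x - b_i)^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
b = A @ rng.normal(size=d)

def grad_i(x, i):                    # gradient of the i-th component loss
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
w = x.copy()                         # L-SVRG reference point
g_ref = full_grad(w)
G = np.zeros(d)                      # AdaGrad accumulator
eta, p = 0.5, 1.0 / n                # step size, reference-update probability

for _ in range(3000):
    i = rng.integers(n)
    # loopless SVRG estimator: unbiased, low variance when x is near w
    g = grad_i(x, i) - grad_i(w, i) + g_ref
    G += g * g                       # coordinate-wise accumulation
    x -= eta * g / (np.sqrt(G) + 1e-8)
    if rng.random() < p:             # coin flip replaces SVRG's inner loop
        w, g_ref = x.copy(), full_grad(x)
```

The coin flip with probability p = 1/n replaces SVRG's periodic inner loop, which is what makes the estimator "loopless".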

Keywords: convex optimization, variance reduction, adaptive algorithms, loopless

Procedia PDF Downloads 49
14019 Application the Queuing Theory in the Warehouse Optimization

Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova

Abstract:

The aim of store management optimization is not only designing the store itself, including its equipment, technology and operation. It must also consider synchronizing the technological, transport, storage and service operations throughout the whole logistic chain, so that a natural flow of material from provider to consumer is achieved by the shortest possible route, in the shortest possible time, in the requested quality and quantity, and at minimum cost. The paper deals with the application of queuing theory to the optimization of warehouse processes. The first part gives general information on the problems of warehousing and the use of mathematical methods for optimizing logistics chains. The second part presents a model of a warehouse within queuing theory. The conclusion of the paper includes two examples of using queuing theory in practice.
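As a minimal illustration of the modeling involved (the numbers are invented, not taken from the paper), the standard M/M/1 formulas give the expected congestion at a single service point such as a loading dock:

```python
# M/M/1 queue at a hypothetical loading dock (illustrative numbers):
# trucks arrive as a Poisson process (lam per hour) and are served at
# an exponential rate (mu per hour); stability requires lam < mu.
lam, mu = 8.0, 10.0

rho = lam / mu                 # dock utilization
L = rho / (1 - rho)            # mean number of trucks in the system
W = 1 / (mu - lam)             # mean time in the system (hours)
Lq = rho ** 2 / (1 - rho)      # mean number waiting in the queue
Wq = rho / (mu - lam)          # mean waiting time before service (hours)
```

With 80% utilization, a truck already spends half an hour in the system on average; small increases in the arrival rate push these figures up sharply, which is the kind of trade-off a warehouse queuing model quantifies.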

Keywords: queuing theory, logistics system, mathematical methods, warehouse optimization

Procedia PDF Downloads 574
14018 Performance Evaluation of Contemporary Classifiers for Automatic Detection of Epileptic EEG

Authors: K. E. Ch. Vidyasagar, M. Moghavvemi, T. S. S. T. Prabhat

Abstract:

Epilepsy is a global problem, and with seizures eluding even the smartest of diagnoses, automatic detection using the electroencephalogram (EEG) would have a huge impact on diagnosis of the disorder. Among the multitude of methods for automatic epilepsy detection, the best should be identified on the basis of classification accuracy. This paper compares classifiers on that basis: quadratic discriminant analysis (QDA), classification and regression tree (CART), support vector machine (SVM), naive Bayes classifier (NBC), linear discriminant analysis (LDA), K-nearest neighbor (KNN) and artificial neural networks (ANN). Results show that ANN is the most accurate of all the above classifiers, with 97.7% accuracy, 97.25% specificity and 98.28% sensitivity. It is followed closely by SVM, with a 1% variation in result. These results should help researchers choose the best classifier for the detection of epilepsy.
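As a sketch of the simplest classifier in this list, a k-nearest-neighbor majority vote over extracted EEG features can be written in a few lines (the 2-D points below are invented toy features, not the paper's data):

```python
import math
from collections import Counter

def knn_predict(train, labels, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    nearest = sorted(range(len(train)), key=lambda i: math.dist(train[i], x))[:k]
    return Counter(labels[i] for i in nearest).most_common(1)[0][0]

# toy 2-D "EEG feature" points: non-seizure (0) vs seizure (1) clusters
train = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.25),
         (0.9, 0.8), (0.8, 0.9), (0.85, 0.95)]
labels = [0, 0, 0, 1, 1, 1]
```

Here `knn_predict(train, labels, (0.2, 0.2))` returns 0, while a point near the upper cluster returns 1.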

Keywords: classification, seizure, KNN, SVM, LDA, ANN, epilepsy

Procedia PDF Downloads 498
14017 Screening and Optimization of Pretreatments for Rice Straw and Their Utilization for Bioethanol Production Using Developed Yeast Strain

Authors: Ganesh Dattatraya Saratale, Min Kyu Oh

Abstract:

Rice straw is one of the most abundant lignocellulosic waste materials; its annual production is about 731 Mt worldwide. This study addresses the effective utilization of this waste biomass for biofuel production. We present a comparative assessment of numerous pretreatment strategies for rice straw, comprising the major physical, chemical and physicochemical methods. Among the methods employed, alkaline pretreatment combined with sodium chlorite/acetic acid delignification proved the most efficient, with a significant improvement in the enzymatic digestibility of rice straw. A cellulase dose of 20 filter paper units (FPU) released a maximum of 63.21 g/L of reducing sugar from rice straw, with a 94.45% hydrolysis yield and a 64.64% glucose yield. The effects of the different pretreatment methods on biomass structure and complexity were investigated by FTIR, XRD and SEM analytical techniques. Finally, the enzymatic hydrolysate of rice straw was used for ethanol production with the developed Saccharomyces cerevisiae SR8 strain. The developed yeast strain enabled efficient fermentation of xylose and glucose and achieved higher ethanol production. Bioethanol production from lignocellulosic waste biomass is thus a generic, applicable methodology, with great implications for using 'green raw materials' to produce the 'green products' much needed today.

Keywords: rice straw, pretreatment, enzymatic hydrolysis, FPU, Saccharomyces cerevisiae SR8, ethanol fermentation

Procedia PDF Downloads 518
14016 Environmental Exposure Assessment among Refuellers at Brussels South Charleroi Airport

Authors: Mostosi C., Stéphenne J., Kempeneers E.

Abstract:

Introduction: Refuellers from Brussels South Charleroi Airport (BSCA) expressed concerns about the risks involved in handling JET-A1 fuel. The HSE Manager of BSCA, in collaboration with the occupational physician and the industrial hygiene unit of the External Service of Occupational Medicine, decided to assess the toxicological exposure of these workers. Materials and methods: Two measurement methods were used. The first was to assay three types of metabolites in urine to highlight exposure to the xylenes, toluene, and benzene in aircraft fuels. Of the 32 refuellers in the department, 26 participated in the sampling, and 23 samples were usable. The second method targeted the assessment of environmental exposure to certain potentially hazardous substances that refuellers are likely to breathe in work areas at the airport. Two ambient air measurement campaigns were carried out, using static systems on the one hand and, on the other, individual sensors worn by the refuellers at the level of the respiratory tract. Volatile organic compounds and diesel particles were analyzed. Results: Despite the fears that motivated these analyses, the overall results showed low levels of exposure, far below the existing limit values, both in air quality and in urinary measurements. Conclusion: These results are comparable to a study carried out in several French airports. The staff could be reassured, and the medical surveillance was adjusted by the occupational physician. With aviation development at BSCA, equipment and methods are evolving, and exposure will have to be reassessed.

Keywords: refuelling, airport, exposure, fuel, occupational health, air quality

Procedia PDF Downloads 68
14015 Novel Aminoglycosides to Target Resistant Pathogens

Authors: Nihar Ranjan, Derrick Watkins, Dev P. Arya

Abstract:

Current methods for studying the antibiotic activity of ribosome-targeted antibiotics depend on cell-based bacterial inhibition assays or various forms of ribosomal binding assays. These assays are typically independent of each other, and little direct correlation between ribosomal binding and bacterial inhibition is established by the complementary assay. We have developed novel high-throughput-capable assays for ribosome-targeted drug discovery. One such assay examines a compound's ability to bind a model ribosomal RNA A-site. We have also coupled this assay to other functional orthogonal assays. Such analysis can provide a valuable understanding of the relationship between two complementary drug screening methods and could be used as a standard analysis to correlate the affinity of a compound for its target with the effect the compound has on a cell.

Keywords: bacterial resistance, aminoglycosides, screening, drugs

Procedia PDF Downloads 351
14014 Hybrid Approach for the Min-Interference Frequency Assignment

Authors: F. Debbat, F. T. Bendimerad

Abstract:

Efficient frequency assignment for radio communications becomes more and more crucial as new information technologies and their applications develop. It consists in defining an assignment of frequencies to radio links to be established between base stations and mobile transmitters. Separation of the assigned frequencies is necessary to avoid interference; however, unnecessary separation causes an excess requirement for spectrum, the cost of which may be very high. The problem is NP-hard and cannot be solved by conventional optimization algorithms, so metaheuristic methods must be used. This paper proposes a hybrid approach based on simulated annealing (SA) and tabu search (TS) to solve the problem. Computational results obtained on a number of standard problem instances testify to the effectiveness of the proposed approach.
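A minimal sketch of such a hybrid, on an invented toy instance rather than a real network, keeps an SA acceptance rule while a short tabu list forbids undoing recent moves:

```python
import math
import random

random.seed(1)
# Hypothetical toy instance: 6 radio links; listed pairs interfere and
# must not share a frequency. 3 frequencies are available.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 2), (1, 3)]
n_links, n_freqs = 6, 3

def cost(assign):                    # number of violated separation pairs
    return sum(assign[a] == assign[b] for a, b in edges)

assign = [random.randrange(n_freqs) for _ in range(n_links)]
best_cost = cost(assign)
T, alpha = 2.0, 0.995                # SA temperature and cooling rate
tabu = []                            # short memory of reversed moves (TS)

for _ in range(4000):
    link, freq = random.randrange(n_links), random.randrange(n_freqs)
    if (link, freq) in tabu:         # tabu: do not undo a recent move
        continue
    cand = assign[:]
    cand[link] = freq
    delta = cost(cand) - cost(assign)
    if delta <= 0 or random.random() < math.exp(-delta / T):
        tabu = (tabu + [(link, assign[link])])[-8:]
        assign = cand
        best_cost = min(best_cost, cost(assign))
    T *= alpha
```

On this instance a zero-cost (interference-free) assignment exists, and the hybrid search finds it quickly; on realistic instances the tabu memory mainly prevents cycling once the temperature is low.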

Keywords: cellular mobile communication, frequency assignment problem, optimization, tabu search, simulated annealing

Procedia PDF Downloads 364
14013 Circular Labour Migration and Its Consequences in Georgia

Authors: Manana Lobzhanidze

Abstract:

Introduction: The paper argues that labor migration is the most important problem Georgia faces today. The structure of labor migration by age and gender in Georgia is analyzed, and the main driving factors of circular labor migration during the last ten years are identified. The study of migration requires discussing the interconnection of economic, social, and demographic features, also taking into consideration state regulation policy in education and professional training. Methodology: Different research methods are applied in the paper: statistical methods, such as selection, grouping, observation and trend analysis, and qualitative methods, namely analysis, synthesis, induction, deduction and comparison. Main Findings: Labor migrants fill the labor market as low-salary workers. The main positive effect of migration from developing countries is poverty eradication, but the process is accompanied by problems such as 'brain drain': the country loses an important part of its intellectual potential, an investment made by households or by the state itself. Conclusions: Labor migration is meant to be temporary, but the socio-economic problems of the country often push it toward long-term and illegal migration. Countries with developed economies tighten migration policy and fight illegal migration with different methods; circular migration helps solve this problem. Conclusions and recommendations are included on the consequences of circular labor migration in Georgia and its influence on reducing the unemployment level.

Keywords: migration, circular labor migration, labor migration employment, unemployment

Procedia PDF Downloads 153
14012 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments

Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda

Abstract:

In this work, we present an offline system for the recognition of Arabic handwritten words of the Algerian departments. The study is based mainly on evaluating the performance of a neural network trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary image of the handwritten word by several methods: distribution parameters, central moments of the different projections, and Barr features. These methods are applied to segments obtained by dividing the binary image of the word into six segments. Classification is achieved by a multilayer perceptron. Detailed experiments are carried out, and satisfactory recognition results are reported.
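A simplified sketch of the projection-based features (an assumed reading of the feature set, not the authors' exact extraction code) can make the idea concrete:

```python
import numpy as np

def projection_features(img):
    """Simplified sketch: normalized row/column ink projections of a
    binary word image plus the central moments of each projection."""
    img = np.asarray(img, dtype=float)
    total = max(img.sum(), 1.0)
    h_proj = img.sum(axis=1) / total       # ink distribution over rows
    v_proj = img.sum(axis=0) / total       # ink distribution over columns

    def moments(p):                        # mean and second central moment
        idx = np.arange(len(p))
        mean = float((idx * p).sum())
        var = float(((idx - mean) ** 2 * p).sum())
        return [mean, var]

    return np.concatenate([h_proj, v_proj, moments(h_proj), moments(v_proj)])
```

In the paper's pipeline, features of this kind would be computed per segment (six segments per word) and concatenated into the perceptron's input vector.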

Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, features extraction

Procedia PDF Downloads 495
14011 Generating Product Description with Generative Pre-Trained Transformer 2

Authors: Minh-Thuan Nguyen, Phuong-Thai Nguyen, Van-Vinh Nguyen, Quang-Minh Nguyen

Abstract:

Research on automatically generating descriptions for e-commerce products has gained increasing attention in recent years. However, the descriptions generated by existing systems are often less informative and attractive because of a lack of training data or the limitations of approaches that rely on templates or statistical methods. In this paper, we explore a method to generate product descriptions using the GPT-2 model. In addition, we apply text paraphrasing and task-adaptive pretraining techniques to improve the quality of the descriptions generated by the GPT-2 model. Experimental results show that our models outperform the baseline model in both automatic and human evaluation. Notably, our methods achieve promising results not only on the seen test set but also on the unseen test set.

Keywords: GPT-2, product description, transformer, task-adaptive, language model, pretraining

Procedia PDF Downloads 179
14010 Forecasting Residential Water Consumption in Hamilton, New Zealand

Authors: Farnaz Farhangi

Abstract:

Many people in New Zealand believe that access to water is inexhaustible, a belief that comes from a history of virtually unrestricted access to it. For a region like Hamilton, one of New Zealand's fastest growing cities, it is crucial for policy makers to know about future water consumption and the implementation of rules and regulations such as universal water metering. Hamilton residents use water freely and have no idea how much water they use. Hence, one objective of this research is forecasting water consumption using different methods. The residential water consumption time series exhibits seasonal and trend variations. Seasonality is the pattern caused by repeating events such as weather conditions in summer and winter, public holidays, etc. The problem with this seasonal fluctuation is that it dominates the other time series components and makes it difficult to determine other variations (such as the effects of educational campaigns, regulation, etc.) in the series. Apart from seasonality, a stochastic trend is also combined with seasonality and affects the forecasting results. According to the forecasting literature, preprocessing (de-trending and de-seasonalization) is essential for better forecasting results, while other researchers argue that seasonally non-adjusted data should be used. Hence, I address the question: is preprocessing essential? A wide range of forecasting methods exists, with different pros and cons. In this research, I apply double seasonal ARIMA and an artificial neural network (ANN), considering diverse elements such as seasonality and calendar effects (public and school holidays), and combine their results to find the best predicted values. My hypothesis is that examining the results of the combined method (hybrid model) and the individual methods, and comparing their accuracy and robustness, will identify the best approach. In order to use ARIMA, the data should be stationary.
ANN also has successful forecasting applications for seasonal and trend time series. Using a hybrid model is a way to improve the accuracy of the methods. Because water demand is dominated by different seasonalities, I combine different methods to find their sensitivity to weather conditions, calendar effects and other seasonal patterns. The advantage of this combination is the reduction of errors by averaging the individual models. It is also useful when we are not sure about the accuracy of each forecasting model, and it can ease the problem of model selection. Using daily residential water consumption data from January 2000 to July 2015 in Hamilton, I show how predictions by different methods vary. ANN gives more accurate forecasting results than the other methods, and preprocessing is essential for seasonal time series. Using the hybrid model reduces average forecasting errors and increases performance.
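The error-averaging argument can be illustrated with stand-in forecasts (invented numbers, not the Hamilton data): when two models err in opposite directions, their average cancels much of the bias:

```python
import math

# Illustrative only: a weekly-seasonal series with two stand-in forecasts
# (one biased high, one biased low, as two different models might be);
# the hybrid forecast is their simple average.
actual = [10 + 3 * math.sin(2 * math.pi * t / 7) for t in range(14)]
f_a = [y + 0.8 for y in actual]          # stand-in "model A" forecast
f_b = [y - 0.6 for y in actual]          # stand-in "model B" forecast
hybrid = [(a + b) / 2 for a, b in zip(f_a, f_b)]

def mae(pred):                           # mean absolute error
    return sum(abs(p - y) for p, y in zip(pred, actual)) / len(actual)
```

Here the hybrid's MAE (0.1) is far below that of either individual forecast (0.8 and 0.6); with real models the cancellation is rarely this clean, but the averaging mechanism is the same.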

Keywords: artificial neural network (ANN), double seasonal ARIMA, forecasting, hybrid model

Procedia PDF Downloads 317
14009 Statistical Analysis to Select Evacuation Route

Authors: Zaky Musyarof, Dwi Yono Sutarto, Dwima Rindy Atika, R. B. Fajriya Hakim

Abstract:

Each country is responsible for the safety of its people, especially those living in disaster-prone areas. One such service is providing evacuation routes for them. Until now, however, the selection of evacuation routes has not seemed well organized: when a disaster happens, many people accumulate on the steps of the evacuation route. That condition is dangerous because it hampers the evacuation process. Using several methods of statistical analysis, the authors suggest how to prepare an evacuation route that is organized and based on people's habits. Those methods are association rules, sequential pattern mining, hierarchical cluster analysis and fuzzy logic.
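As a small illustration of the first of these methods, the support and confidence of an association rule can be computed directly from hypothetical movement logs (the segment names are invented):

```python
# Hypothetical movement logs: the route segments each evacuee used.
transactions = [
    {"gate_A", "stairs_1", "exit_N"},
    {"gate_A", "stairs_1", "exit_N"},
    {"gate_A", "stairs_2", "exit_S"},
    {"gate_B", "stairs_1", "exit_N"},
    {"gate_A", "stairs_1", "exit_S"},
]

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Conditional frequency of the rule antecedent -> consequent."""
    return support(antecedent | consequent) / support(antecedent)
```

Here the rule stairs_1 → exit_N holds with support 0.6 and confidence 0.75, the kind of habit pattern that could inform route design.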

Keywords: association rules, sequential pattern mining, cluster analysis, fuzzy logic, evacuation route

Procedia PDF Downloads 484
14008 Challenges in the Material and Action-Resistance Factor Design for Embedded Retaining Wall Limit State Analysis

Authors: Kreso Ivandic, Filip Dodigovic, Damir Stuhec

Abstract:

The paper deals with the proposed 'material factor' and 'action-resistance factor' design methods for embedded retaining walls. A parametric analysis was performed to evaluate the differences between the output values of the two methods, both mutually and against the classic approach. Choosing between the design methods proposed in Eurocode 7 is a challenge with respect to current technical regulations and regular engineering practice. The basic criterion for applying a particular design method is to ensure, at minimum, a degree of reliability equal to that of current practice. The relevant partial coefficients were combined according to the design methods. The use of these partial coefficients should result in the same level of safety regardless of load combinations, material characteristics and problem geometry. The proposed approach of partial coefficients related to the material and/or action-resistance should aim at building a bridge between the calculations used so far and a pure probability analysis. The measure used to compare the results was an equivalent safety factor determined for each analysis. The results show a visibly wide span of equivalent values of the classic safety factors.
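The comparison can be sketched numerically. The values below are illustrative, not a real wall design, and the partial factors are assumed for the sake of the example:

```python
# Hedged sketch with illustrative numbers, not a real wall design:
# comparing a partial-factor verification with the classic global
# safety factor for a single action effect E_k and resistance R_k.
E_k, R_k = 100.0, 180.0          # characteristic values (kN/m, say)

# partial-factor check: factor the action up and the resistance down
gamma_E, gamma_R = 1.35, 1.40    # assumed partial factors
design_ok = gamma_E * E_k <= R_k / gamma_R

# classic approach: one global factor of safety
F_classic = R_k / E_k

# equivalent global safety factor implied by the partial factors
F_equiv = gamma_E * gamma_R
```

In this example the classic factor of 1.8 would satisfy a typical required value of 1.5, while the partial-factor check fails marginally (the implied equivalent factor is 1.89); this is exactly the kind of discrepancy the equivalent safety factor is meant to expose.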

Keywords: action-resistance factor design, classic approach, embedded retaining wall, Eurocode 7, limit states, material factor design

Procedia PDF Downloads 219
14007 Comparison of Cyclone Design Methods for Removal of Fine Particles from Plasma Generated Syngas

Authors: Mareli Hattingh, I. Jaco Van der Walt, Frans B. Waanders

Abstract:

A waste-to-energy plasma system was designed by Necsa for commercial use to create electricity from unsorted municipal waste. Fly ash particles must be removed from the syngas stream at operating temperatures of 1000 °C and recycled back into the reactor for complete combustion. A 2D2D high efficiency cyclone separator was chosen for this purpose. During this study, two cyclone design methods were explored: The Classic Empirical Method (smaller cyclone) and the Flow Characteristics Method (larger cyclone). These designs were optimized with regard to efficiency, so as to remove at minimum 90% of the fly ash particles of average size 10 μm by 50 μm. Wood was used as feed source at a concentration of 20 g/m3 syngas. The two designs were then compared at room temperature, using Perspex test units and three feed gases of different densities, namely nitrogen, helium and air. System conditions were imitated by adapting the gas feed velocity and particle load for each gas respectively. Helium, the least dense of the three gases, would simulate higher temperatures, whereas air, the densest gas, simulates a lower temperature. The average cyclone efficiencies ranged between 94.96% and 98.37%, reaching up to 99.89% in individual runs. The lowest efficiency attained was 94.00%. Furthermore, the design of the smaller cyclone proved to be more robust, while the larger cyclone demonstrated a stronger correlation between its separation efficiency and the feed temperatures. The larger cyclone can be assumed to achieve slightly higher efficiencies at elevated temperatures. However, both design methods led to good designs. At room temperature, the difference in efficiency between the two cyclones was almost negligible. At higher temperatures, however, these general tendencies are expected to be amplified so that the difference between the two design methods will become more obvious. 
Though the design specifications were met for both designs, the smaller cyclone is recommended as default particle separator for the plasma system due to its robust nature.
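A textbook sketch of the kind of empirical relation used in cyclone sizing (the inputs are assumptions for illustration, not the study's actual design values) is Lapple's cut-diameter model:

```python
import math

# Lapple model sketch (illustrative inputs, not the authors' 2D2D design):
# cut diameter d50 and the grade efficiency curve built from it.
mu = 1.8e-5                      # gas viscosity, Pa*s (air, ~20 C)
W = 0.05                         # cyclone inlet width, m
Ne = 6                           # effective number of gas turns
vi = 15.0                        # inlet velocity, m/s
rho_p, rho_g = 2000.0, 1.2       # particle and gas densities, kg/m^3

# particle diameter collected with 50% efficiency
d50 = math.sqrt(9 * mu * W / (2 * math.pi * Ne * vi * (rho_p - rho_g)))

def grade_efficiency(d):
    """Lapple's empirical collection efficiency for particle diameter d (m)."""
    return 1.0 / (1.0 + (d50 / d) ** 2)
```

With these assumptions the cut diameter comes out near 2.7 μm, so 10 μm particles are collected with roughly 93% efficiency, consistent with a 90% removal target for fly ash of that size.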

Keywords: cyclone, design, plasma, renewable energy, solid separation, waste processing

Procedia PDF Downloads 196
14006 Supercritical CO2 Extraction of Cymbopogon martini Essential Oil and Comparison of Its Composition with Traditionally Extracted Oils

Authors: Aarti Singh, Anees Ahmad

Abstract:

Essential oil was extracted from lemon grass (Cymbopogon martini) with supercritical carbon dioxide (SC-CO2) at a pressure of 140 bar, a temperature of 55 °C and a CO2 flow rate of 8 g/min, and its composition and yield were compared with those of oils obtained by conventional extraction methods: HD (hydrodistillation), SE (solvent extraction) and UAE (ultrasound-assisted extraction). SC-CO2 extraction is a green and sustainable extraction technique. Each oil was analysed by GC-MS; the major constituents were neral (44%), Z-citral (43%), geranial (27%), caryophyllene (4.6%) and linalool (1%). The essential oil of lemon grass is valued for its neral and citral concentration. The oil obtained by supercritical carbon dioxide extraction contained the maximum concentration of neral (55.05%), whereas the ultrasonication-extracted oil contained the minimum (5.24%), and neral was absent in the solvent-extracted oil. The antioxidant properties were assessed by DPPH and superoxide scavenging methods.

Keywords: cymbopogon martini, essential oil, FT-IR, GC-MS, HPTLC, SC-CO2

Procedia PDF Downloads 442
14005 Outsourcing the Front End of Innovation

Authors: B. Likar, K. Širok

Abstract:

The paper presents a new method for efficient innovation process management. Even though innovation management methods, tools and knowledge are well established and documented in the literature, most companies still do not manage innovation efficiently. Especially in SMEs, the front end of innovation - problem identification, idea creation and selection - is often not optimally performed. Our eMIPS methodology represents a sort of 'umbrella methodology': a well-defined set of procedures which can be dynamically adapted to the concrete case in a company. In daily practice, various methods (e.g. for problem identification and idea creation) can be applied, depending on the company's needs. It is based on the proactive involvement of the company's employees, supported by the appropriate methodology and external experts. The phases are performed via a mixture of face-to-face activities (workshops) and online (eLearning) activities taking place in the eLearning Moodle environment and using other e-communication channels. One part of the outcome is an identified set of opportunities and concrete solutions ready for implementation. Another important result concerns the innovation competences of the participating employees with regard to concrete tools and methods for idea management. In addition, the employees gain strong experience in dynamic, efficient and solution-oriented management of the invention process. The eMIPS also represents a way of establishing or improving the innovation culture in an organization. A pilot application in one company showed excellent results regarding both the motivation of the participants and the outcomes achieved.

Keywords: creativity, distance learning, front end, innovation, problem

Procedia PDF Downloads 315
14004 Economic Valuation of Environmental Services Sustained by Flamboyant Park in Goiania-Go, Brazil

Authors: Brenda R. Berca, Jessica S. Vieira, Lucas G. Candido, Matheus C. Ferreira, Paulo S. A. Lopes Filho, Rafaella O. Baracho

Abstract:

This study aims to estimate the economic value of the environmental services sustained by Flamboyant Lourival Louza Municipal Park in Goiânia, Goiás, Brazil. Flamboyant Park is one of the most relevant urban parks and is located near a stadium, a shopping center, and two supercenters. To define the valuation methods, the first step was bibliographical research with a view to better understanding which methods are most feasible for valuing the Park. The following direct methods were selected: travel cost, hedonic pricing, and contingent valuation. In addition, an indirect method (replacement cost) was applied. The second step was creating and applying two surveys. The first survey was aimed at visitors of the park and addressed socio-economic issues, the use of the Park, its importance, and visitors' willingness to pay for its existence. The second survey was aimed at the vendors operating in the Park, in order to collect data on their profits. In the end, the profile of the visitors was characterized and the contingent valuation, travel cost, replacement cost and hedonic pricing methods were applied, thus monetarily valuing the various ecosystem services sustained by the park. Some services were not valued due to difficulties encountered during the process.

Keywords: contingent valuation, ecosystem services, economic environmental valuation, hedonic pricing, travel cost

Procedia PDF Downloads 214
14003 Compare Hot Forming and Cold Forming in Rolling Process

Authors: Ali Moarrefzadeh

Abstract:

In metalworking, rolling is a metal forming process in which metal stock is passed through a pair of rolls. Rolling is classified according to the temperature of the metal rolled: if the temperature is above the recrystallization temperature, the process is termed hot rolling; if it is below the recrystallization temperature, the process is termed cold rolling. In terms of usage, hot rolling processes more tonnage than any other manufacturing process, and cold rolling processes the most tonnage of all cold working processes. This article describes the use of advanced tubing inspection NDT methods for boiler and heat exchanger equipment in the petrochemical industry to supplement major turnaround inspections. The methods presented include remote field eddy current, magnetic flux leakage, internal rotary inspection system, and eddy current.

Keywords: hot forming, cold forming, metal, rolling, simulation

Procedia PDF Downloads 513
14002 Regularization of Gene Regulatory Networks Perturbed by White Noise

Authors: Ramazan I. Kadiev, Arcady Ponosov

Abstract:

Mathematical models of gene regulatory networks can in many cases be described by ordinary differential equations with switching nonlinearities, for which the initial value problem is ill-posed. Several regularization methods are known for deterministic networks, but the presence of stochastic noise introduces additional technical difficulties. In the presentation, it is proposed to apply the methods of stochastic singular perturbation theory going back to Yu. Kabanov and Yu. Pergamentshchikov. This approach regularizes the ill-posed problem above, which, e.g., makes it possible to design stable numerical schemes. Several examples supporting the efficiency of the suggested analysis are provided in the presentation. The method may also be of interest in other fields of biomathematics where differential equations contain switchings, e.g., in neural field models.
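The regularization idea can be illustrated with a toy sketch; this is not the authors' scheme, and all parameters are hypothetical. A Heaviside switching term in a one-gene model is replaced by a steep sigmoid, after which a standard Euler-Maruyama scheme integrates the resulting stochastic equation stably.

```python
import math
import random

def step(x, theta):
    """Ill-posed switching nonlinearity (Heaviside response)."""
    return 1.0 if x >= theta else 0.0

def sigmoid(x, theta, eps):
    """Smooth regularization; approaches step() as eps -> 0."""
    return 1.0 / (1.0 + math.exp(-(x - theta) / eps))

def euler_maruyama(drift, x0, t_end, dt, sigma, seed=0):
    """Explicit scheme for dX = drift(X) dt + sigma dW."""
    rng = random.Random(seed)
    x = x0
    for _ in range(int(t_end / dt)):
        x += drift(x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return x

# Toy one-gene model: dx = (production(x) - x) dt + sigma dW, with
# production switched on when x exceeds the threshold theta.
theta, eps, sigma = 0.5, 0.05, 0.1
x_final = euler_maruyama(lambda x: sigmoid(x, theta, eps) - x,
                         x0=1.0, t_end=10.0, dt=1e-3, sigma=sigma)
```

With the raw `step` drift, trajectories near the threshold are not well defined; the sigmoid removes the discontinuity so the numerical scheme stays stable.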

Keywords: ill-posed problems, singular perturbation analysis, stochastic differential equations, switching nonlinearities

Procedia PDF Downloads 179
14001 The Colouration of Additive-Manufactured Polymer

Authors: Abisuga Oluwayemisi Adebola, Kerri Akiwowo, Deon de Beer, Kobus Van Der Walt

Abstract:

The convergence of additive manufacturing (AM) and traditional textile dyeing techniques has opened innovative possibilities for enhancing the visual appeal and customization potential of 3D-printed polymer objects. Over centuries, textile dyeing techniques have evolved to transform fabrics with vibrant colours and complex patterns. The layer-by-layer deposition characteristic of AM necessitates adaptations in dye application methods to ensure even colour penetration across complex surfaces. Compatibility between dye formulations and polymer matrices influences colour uptake and stability, demanding careful selection and testing of dyes for optimal results. This study investigates the interaction between these two fields, revealing the challenges and opportunities of applying textile dyeing methods to colour 3D-printed polymer materials. Three approaches to colouring a 3D-printed polymer object are explored: (a) additive manufacturing of a prototype, (b) the traditional dyebath method, and (c) the contemporary digital sublimation technique. The results show that the layer lines inherent to AM interact with dyes differently than traditional textile fibers do and affect the visual outcome. Skillful manipulation of the dyeing methods and dye types used in this research reduced the appearance of these lines and achieved consistent, desirable colour outcomes. In conclusion, integrating textile dyeing techniques into the colouring of 3D-printed polymer materials connects historical craftsmanship with innovative manufacturing. Overcoming the challenges of colour distribution, compatibility, and layer line management requires a holistic approach that blends the technical consistency of AM with the artistic sensitivity of textile dyeing. Applying textile dyeing methods to 3D-printed polymers thus opens new dimensions of aesthetic and functional possibilities.

Keywords: polymer, 3D-printing, sublimation, textile, dyeing, additive manufacturing

Procedia PDF Downloads 55
14000 The Decline of Verb-Second in the History of English: Combining Historical and Theoretical Explanations for Change

Authors: Sophie Whittle

Abstract:

English syntax historically exhibited an inconsistent verb-second (V2) rule, under which the verb moved to second position in the clause after a phrase of some type had been fronted. There was considerable variation throughout the history of English in the ordering of subject and verb, and many explanations attempting to account for this variation have been documented in previous literature. These attempts have, however, been contradictory, with many accounts positing earlier syntactic changes as the main motivations for the decline of V2. For instance, morphosyntactic changes such as the loss of clitics and the loss of empty expletives have been loosely connected to frequency changes in the loss of V2. The questions surrounding the development of non-V2 orders in English therefore remain open. The current paper brings together explanations from different linguistic fields to determine the factors driving the changes in English V2. Using historical corpus-based methods, the study analyses, both quantitatively and qualitatively, the changes in V2 frequency across the Old, Middle, and Modern English periods to account for the variation in a range of sentential environments. These methods draw on information structure, prosody, and language contact to explain variation within different contexts. The analysis concludes that these factors, in addition to changes within the syntax, are responsible for the position of the verb. The loss of V2 thus serves as an exemplar study within historical linguistics of combining multiple factors to explain language change in general.
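The quantitative side of such a corpus study reduces, at its core, to counting V2 versus non-V2 clause orders per period. A minimal sketch follows; the clause records are invented placeholders, not data from the study, and a real analysis would draw them from parsed historical corpora.

```python
from collections import Counter

# Hypothetical tagged clause records (period, order), where "V2" marks a
# fronted constituent followed by verb-before-subject order.
clauses = [
    ("Old English", "V2"), ("Old English", "V2"), ("Old English", "non-V2"),
    ("Middle English", "V2"), ("Middle English", "non-V2"),
    ("Middle English", "non-V2"), ("Modern English", "non-V2"),
    ("Modern English", "non-V2"), ("Modern English", "non-V2"),
]

def v2_rate_by_period(records):
    """Share of clauses showing V2 order within each period."""
    totals, v2 = Counter(), Counter()
    for period, order in records:
        totals[period] += 1
        if order == "V2":
            v2[period] += 1
    return {p: v2[p] / totals[p] for p in totals}

rates = v2_rate_by_period(clauses)
```

Splitting the same count by sentential environment (e.g. type of fronted phrase) gives the per-context frequencies the abstract describes.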

Keywords: corpora, English, language change, mixed-methods, syntax, verb-second

Procedia PDF Downloads 118
13999 Structural Changes Induced in Graphene Oxide Film by Low Energy Ion Beam Irradiation

Authors: Chetna Tyagi, Ambuj Tripathi, Devesh Avasthi

Abstract:

Graphene oxide exhibits sp³ hybridization alongside sp² hybridization due to the presence of different oxygen-containing functional groups on its edges and basal planes. Its sp³/sp² hybridization ratio can be tuned by various methods to suit different applications, such as transistors, solar cells, and biosensors. Ion beam irradiation is one method of optimizing the sp²/sp³ ratio for the desired properties. In this work, graphene oxide films were irradiated with 100 keV argon ions at fluences varying from 10¹³ to 10¹⁶ ions/cm². Synchrotron X-ray diffraction measurements showed an increase in crystallinity at the low fluence of 10¹³ ions/cm². Raman spectroscopy performed on the irradiated samples qualitatively determined the defects induced by the ion beam. Fourier transform infrared (FTIR) spectroscopy was used to identify the different functional groups and their removal at different fluences.

Keywords: graphene oxide, ion beam irradiation, spectroscopy, X-ray diffraction

Procedia PDF Downloads 121
13998 Julia-Based Computational Tool for Composite System Reliability Assessment

Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris

Abstract:

The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. We describe its design, validation, and effectiveness, including an analysis of two different formulations of the Optimal Power Flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool provides valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
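The sequential Monte Carlo idea can be sketched for a single-area generation adequacy index. This is not the CompositeSystems tool: the unit data and transition probabilities below are hypothetical, and a real composite evaluation would also solve an optimal power flow for each sampled system state.

```python
import random

def simulate_year(caps, peak_load, hours, p_fail, p_repair, rng):
    """One sample year: each unit follows a two-state (up/down) Markov
    chain per hour; returns hours with available capacity below load."""
    up = [True] * len(caps)
    loss_hours = 0
    for _ in range(hours):
        for i in range(len(caps)):
            if up[i]:
                up[i] = rng.random() >= p_fail    # hourly failure draw
            else:
                up[i] = rng.random() < p_repair   # hourly repair draw
        available = sum(c for c, u in zip(caps, up) if u)
        if available < peak_load:
            loss_hours += 1
    return loss_hours

def lole(caps, peak_load, years=20, hours=8760,
         p_fail=0.001, p_repair=0.02, seed=1):
    """Loss-of-load expectation (hours/year) by sequential sampling."""
    rng = random.Random(seed)
    return sum(simulate_year(caps, peak_load, hours, p_fail, p_repair, rng)
               for _ in range(years)) / years

# Five hypothetical units totalling 400 MW against a flat 250 MW load.
index = lole([100.0, 100.0, 100.0, 50.0, 50.0], peak_load=250.0)
```

A flat load and independent two-state units keep the sketch short; chronological load curves and correlated renewable output are exactly what makes the full sequential evaluation expensive.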

Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow

Procedia PDF Downloads 52
13997 Comparative Survival Rates of Yeasts during Freeze-Drying, Traditional Drying and Spray Drying

Authors: Latifa Hamoudi-Belarbi, L'Hadi Nouri, Khaled Belkacemi

Abstract:

The effect of three drying methods (traditional drying, freeze-drying, and spray-drying) on the survival of concentrated cultures of Geotrichum fragrans and Wickerhamomyces anomalus was studied. Survival was first compared immediately after freeze-drying using HES 12% (w/v) + sucrose 7% (w/v) as protectant, traditional drying in dry rice cakes, and spray-drying with whey proteins. The survival of G. fragrans and W. anomalus was then studied during 4 months of storage at 4°C and 25°C, in the dark, under vacuum, and at 0% relative humidity. The results demonstrated that high survival was obtained using the traditional method of preservation in rice cakes (60% for G. fragrans and 65% for W. anomalus) and freeze-drying (68% for G. fragrans and 74% for W. anomalus). However, poor survival was obtained by spray-drying in whey protein, with 20% for G. fragrans and 29% for W. anomalus. During storage at 25°C, yeast cultures preserved by the traditional and freeze-drying methods showed no significant loss of viable cells for up to 3 months. Spray-dried yeast cultures had the greatest loss of viable count during the 4 months of storage at 25°C. During storage at 4°C, the traditional method provided better survival than freeze-drying. This study demonstrated the effectiveness of the traditional method for preserving yeast cultures compared with high-cost methods such as freeze-drying and spray-drying.

Keywords: freeze-drying, traditional drying, spray drying, yeasts

Procedia PDF Downloads 469
13996 Demetallization of Crude Oil: Comparative Analysis of Deasphalting and Electrochemical Removal Methods of Ni and V

Authors: Nurlan Akhmetov, Abilmansur Yeshmuratov, Aliya Kurbanova, Gulnar Sugurbekova, Murat Baisariyev

Abstract:

Extraction of vanadium and nickel compounds from crude oil is complex due to the high stability of their porphyrin complexes; nickel is a catalyst poison that deactivates catalysts during catalytic cracking, while vanadyl is abrasive, and vanadium is a valuable metal. The high concentrations of Ni and V in crude oil thus make their removal relevant. Two methods of crude-oil demetallization were tested, and the present research provides a comparative analysis of deasphalting with organic solvents (cyclohexane, carbon tetrachloride, chloroform) and an electrochemical method. Ni extraction reached a maximum of approximately 55% with the electrochemical method in an electrolysis cell developed for this research. The cell consists of three sections: a solution of oil and protonating agent (EtOH) between two conducting membranes, which separate it from two capsules of 10% sulfuric acid, and two graphite electrodes that close the electrical circuit across all three parts. Metal ions pass through the membranes and remain in the acid solutions. The best result was obtained in 60 minutes with an ethanol-to-oil ratio of 25% to 75%; the current remained in the range of 0.3 A to 0.4 A, and the voltage changed from 12.8 V to 17.3 V. The maximum efficiency of deasphalting in a Soxhlet extractor, with cyclohexane as the solvent, was 66.4% for Ni and 51.2% for V. Applying voltammetry, ICP-MS (inductively coupled plasma mass spectrometry), and AAS (atomic absorption spectroscopy), these two metal extraction methods were compared in this paper.

Keywords: electrochemistry, deasphalting of crude oil, demetallization of crude oil, petroleum engineering

Procedia PDF Downloads 215
13995 New Methods to Acquire Grammatical Skills in A Foreign Language

Authors: Indu Ray

Abstract:

In today's digital world, the internet is flooded with information on how to master grammar in a foreign language. It is well known that one cannot master a language without grammar; grammar is the backbone of any language. Without grammar there would be no structure to help you speak and write or listen and read. Successful communication is only possible if the form and function of linguistic utterances are firmly related to one another. Grammar has its own rules of use that make a language easier to understand. Like a tool, grammar formulates our thoughts and knowledge in a meaningful way. Every language has its own grammar. With grammar, we can quickly analyze whether a text refers to the present, past, or future. Knowledge of grammar is an important prerequisite for mastering a foreign language. What is most important is how teachers can make grammar lessons more interesting for students and thus promote grammar skills more successfully. In this paper, we discuss several important methods: interactive grammar exercises between students (IGE pupil-pupil), interactive grammar exercises between student and teacher (IGE teacher-student), the grammar-translation method, the audio-visual method, the deductive method, and the inductive method. The paper is divided into two sections. In the first part, brief definitions and principles of these approaches are provided, and the possibility of combining them is analyzed. In the last section, we present the results of a survey conducted at our university on methods for quickly learning grammar in a foreign language. Grammatical skills were divided into six parts: (1) grammatical competence, (2) speaking skills, (3) phonology, (4) syntax and semantics, (5) rules, and (6) cognitive function, and a survey was conducted among students.
From the survey results, we observe that phonology, speaking ability, and syntax and semantics can be improved by the inductive method, the audio-visual method, and the grammar-translation method, whereas for grammar rules and cognitive function the IGE (teacher-student) and IGE (pupil-pupil) methods should be chosen. The study's findings revealed that the teaching methods should be blended, or fused, based on the grammar content being taught.

Keywords: innovative method, grammatical skills, audio-visual, translation

Procedia PDF Downloads 53
13994 Ice Load Measurements on Known Structures Using Image Processing Methods

Authors: Azam Fazelpour, Saeed R. Dehghani, Vlastimil Masek, Yuri S. Muzychka

Abstract:

This study employs a method based on image analysis and structure information to detect ice accumulated on known structures. The icing of marine vessels and offshore structures causes significant reductions in their efficiency and creates unsafe working conditions, so image processing methods are used to measure ice loads automatically. In this method, ice loads on structures are calculated by defining structure coordinates and processing captured images. A pyramidal structure with nine cylindrical bars was designed as the known structure of the experimental setup. Asymmetric ice accumulated on the structure in a cold room represents the actual experimental case. Camera intrinsic and extrinsic parameters are used to express the structure coordinates in the image coordinate system according to the camera location and angle. A thresholding method is applied to the captured images to detect the iced structure in a binary image. The ice thickness of each element is calculated by combining the information from the binary image with the structure coordinates, and the ice thicknesses of structure elements are obtained by averaging the ice diameters from different camera views. Comparison between the ice load measurements from this method and the actual ice loads shows positive correlation with an acceptable range of error. The method can be applied to complex structures by defining structure and camera coordinates.
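The per-element thickness calculation can be sketched as follows, assuming a calibrated pixel-to-millimetre scale from the camera parameters and using a small synthetic binary image in place of a real thresholded capture (all numbers are illustrative, not from the experiment).

```python
# Sketch: estimate ice thickness on one cylindrical bar as half the
# difference between the iced and bare bar diameters, measured from a
# binary (0/1) image with a known pixel-to-millimetre scale.

def row_widths(binary_image):
    """Width in pixels of the foreground (iced bar) in each image row."""
    return [sum(row) for row in binary_image]

def ice_thickness_mm(binary_image, bare_diameter_mm, mm_per_pixel):
    widths = [w for w in row_widths(binary_image) if w > 0]
    mean_diameter_mm = mm_per_pixel * sum(widths) / len(widths)
    return max(0.0, (mean_diameter_mm - bare_diameter_mm) / 2.0)

# Synthetic 6x8 binary image of a vertical bar with a bulge of ice.
img = [
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 1, 1, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 1, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
]
# Bare bar is 8 mm across; calibration gives 2 mm per pixel.
t = ice_thickness_mm(img, bare_diameter_mm=8.0, mm_per_pixel=2.0)
```

In the study's setup this estimate would be repeated per bar and averaged over the camera views; the structure coordinates determine which image rows belong to which bar.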

Keywords: camera calibration, ice detection, ice load measurements, image processing

Procedia PDF Downloads 352
13993 Fitness Action Recognition Based on MediaPipe

Authors: Zixuan Xu, Yichun Lou, Yang Song, Zihuai Lin

Abstract:

MediaPipe is an open-source machine learning computer vision framework that can be ported to multiple platforms, which makes it easier to use for recognizing human activity. Many human recognition systems have been built on this framework, but the fundamental issue remains the recognition of human behavior and posture. In this paper, two methods are proposed to recognize human gestures based on MediaPipe: the first uses the Adaptive Boosting algorithm to recognize a series of fitness gestures, and the second uses the Fast Dynamic Time Warping algorithm to recognize 413 continuous fitness actions. Both methods are also applicable to other human posture movement recognition tasks.
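The template-matching idea behind the second method can be sketched with classic dynamic time warping on a per-frame pose feature, such as a joint angle derived from MediaPipe landmarks. Classic DTW is O(nm); FastDTW approximates it in linear time. The elbow-angle traces below are invented examples, not data from the paper.

```python
def dtw_distance(seq_a, seq_b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(sequence, templates):
    """Return the label of the nearest template under DTW distance."""
    return min(templates, key=lambda lbl: dtw_distance(sequence, templates[lbl]))

# Hypothetical elbow-angle traces (degrees per frame) for two exercises.
templates = {"curl": [170, 120, 70, 40, 70, 120, 170],
             "press": [90, 120, 150, 175, 150, 120, 90]}
label = classify([168, 115, 65, 45, 75, 125, 172], templates)
```

Because DTW warps the time axis, a captured repetition performed slightly faster or slower than the template still matches, which is what makes the approach suitable for continuous fitness actions.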

Keywords: computer vision, MediaPipe, adaptive boosting, fast dynamic time warping

Procedia PDF Downloads 93