Search results for: asynchronous input
992 Meteosat Second Generation Image Compression Based on the Radon Transform and Linear Predictive Coding: Comparison and Performance
Authors: Cherifi Mehdi, Lahdir Mourad, Ameur Soltane
Abstract:
Image compression reduces the number of bits required to represent an image. The Meteosat Second Generation (MSG) satellite acquires 12 image files every 15 minutes, which results in very large databases. The transform selected for image compression should help reduce the data representing the images. The Radon transform retrieves the Radon points, each representing the sum of the pixels along a given angle in each direction. Linear predictive coding (LPC) with filtering provides good decorrelation of the Radon points using a predictor constituted by the Symmetric Nearest Neighbor filter (SNN) coefficients, which introduces losses during decompression. Finally, run-length coding (RLC) gives a high and fixed compression ratio regardless of the input image. In this paper, a novel image compression method based on the Radon transform and LPC for MSG images is proposed; it provides a good compromise between compression and reconstruction quality. Our method is compared with three others, two based on the DCT and one on DWT bi-orthogonal filtering, to demonstrate the resistance of the Radon transform to quantization noise and to evaluate performance. Evaluation criteria such as PSNR and the compression ratio show the efficiency of our compression method.
Keywords: image compression, Radon transform, linear predictive coding (LPC), run-length coding (RLC), Meteosat Second Generation (MSG)
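As a concrete illustration of the pipeline this abstract describes (Radon projection, LPC decorrelation, run-length coding), here is a minimal Python sketch. The two-tap predictor and the quantization step are placeholders for the SNN-derived coefficients, not the authors' actual values.

```python
import numpy as np
from skimage.transform import radon

def run_length_encode(seq):
    runs, count = [], 1
    for prev, cur in zip(seq[:-1], seq[1:]):
        if cur == prev:
            count += 1
        else:
            runs.append((int(prev), count))
            count = 1
    runs.append((int(seq[-1]), count))
    return runs

def compress(image, angles=np.arange(0.0, 180.0, 1.0), q_step=8.0):
    sinogram = radon(image, theta=angles)       # Radon points, one column per angle
    residuals = sinogram.copy()
    for k in range(2, sinogram.shape[0]):       # LPC along the radial direction
        pred = 0.5 * sinogram[k - 1] + 0.5 * sinogram[k - 2]  # placeholder taps
        residuals[k] = sinogram[k] - pred
    symbols = np.round(residuals / q_step).astype(int)        # lossy quantization
    return run_length_encode(symbols.ravel())

img = np.zeros((64, 64))
img[16:48, 16:48] = 255.0                       # toy test image
print(compress(img)[:5])                        # first few (symbol, run) pairs
```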
Procedia PDF Downloads 421
991 Tool Wear Monitoring of High Speed Milling Based on Vibratory Signal Processing
Authors: Hadjadj Abdechafik, Kious Mecheri, Ameur Aissa
Abstract:
The objective of this study is to develop a method for processing the vibratory signals generated during horizontal high-speed milling without coolant, in order to establish a monitoring system able to improve machining performance. Many tests were carried out on a horizontal high-speed machining centre (PCI Météor 10) under given cutting conditions, using a milling cutter with a single insert whose frontal wear was measured from its new state, taken as the reference state, to a worn state considered unsuitable for further use. The results show that the first harmonic follows the evolution of frontal wear well. A wavelet transform is also used for signal processing and proves useful for observing the evolution of the wavelet approximations through the cutting tool's life. The power and root mean square (RMS) values of the wavelet-transformed signal gave the best results and can be used for tool wear estimation. All these features can serve as suitable indicators for effective detection of tool wear and then as input parameters of an online monitoring system. We also noted the remarkable influence of the machining cycle on measurement quality through the introduction of a bias in the signal; this phenomenon appears particularly in horizontal milling and is ignored in the majority of studies.
Keywords: flank wear, vibration, milling, signal processing, monitoring
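The wavelet-based indicators described above can be computed along the following lines; a minimal Python sketch, with the wavelet family (db4) and decomposition level assumed rather than taken from the study.

```python
import numpy as np
import pywt

def wear_indicators(signal, wavelet="db4", level=4):
    approx = pywt.wavedec(signal, wavelet, level=level)[0]  # approximation coeffs
    rms = np.sqrt(np.mean(approx ** 2))                     # RMS of the approximation
    power = np.sum(approx ** 2) / len(approx)               # mean power
    return rms, power

# toy vibration record: a tooth-passing harmonic plus noise
t = np.linspace(0.0, 1.0, 10000)
vib = np.sin(2 * np.pi * 850 * t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
print(wear_indicators(vib))   # track these two values over the tool's life
```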
Procedia PDF Downloads 598
990 Automated Multisensory Data Collection System for Continuous Monitoring of Refrigerating Appliances Recycling Plants
Authors: Georgii Emelianov, Mikhail Polikarpov, Fabian Hübner, Jochen Deuse, Jochen Schiemann
Abstract:
Recycling refrigerating appliances plays a major role in protecting the Earth's atmosphere from ozone depletion and greenhouse gas emissions. The performance of refrigerator recycling plants in terms of material retention is subject to strict environmental certification and is reviewed periodically through specialized audits. The continuous collection of refrigerator data required for input-output analysis is still mostly manual, error-prone, and not digitalized. In this paper, we propose an automated data collection system for recycling plants in order to deduce the expected material contents of individual end-of-life refrigerating appliances. The system utilizes laser scanner measurements and optical data to extract attributes of individual refrigerators by applying transfer learning with pre-trained vision models and optical character recognition. Based on the recognized features, the system automatically provides material categories and target values of contained material masses, especially foaming and cooling agents. The presented data collection system paves the way for continuous performance monitoring and efficient control of refrigerator recycling plants.
Keywords: automation, data collection, performance monitoring, recycling, refrigerators
Procedia PDF Downloads 164
989 Visualization of Wave Propagation in Monocoupled System with Effective Negative Stiffness, Effective Negative Mass, and Inertial Amplifier
Authors: Abhigna Bhatt, Arnab Banerjee
Abstract:
A periodic system with only a single coupling degree of freedom is called a monocoupled system. In monocoupled systems, a mass-in-mass mechanism generates effective negative mass, masses connected with rigid links generate inertial amplification, and a spring-mass connected with a rigid link generates effective negative stiffness. In this paper, a representative unit cell combining all three mechanisms is introduced. The dynamic stiffness matrix of the unit cell is constructed, and the dispersion relation is obtained by applying the Bloch theorem. The frequency response function is also calculated for a finite length of periodic unit cells. Moreover, an input displacement signal is applied to the finite periodic structure and the inverse Fourier transform is used to visualize the wave propagation in the time domain. This visualization explains the sudden attenuation in the metamaterial due to energy dissipation by an embedded resonator at the resonance frequency, and is found necessary to understand the physics behind the attenuation characteristics of the system.
Keywords: monocoupled system, negative effective mass, negative effective stiffness, inertial amplifier, Fourier transform
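The Bloch-theorem step described above has a standard worked form for a symmetric monocoupled cell; the entries of the dynamic stiffness matrix D(ω) depend on the specific combined negative-stiffness / negative-mass / inertial-amplifier cell and are not reproduced here.

```latex
% Generic dispersion relation for a symmetric monocoupled unit cell (assumed
% form); D(\omega) relates the end forces and displacements of one cell.
\[
\begin{bmatrix} f_L \\ f_R \end{bmatrix}
=
\begin{bmatrix} D_{LL}(\omega) & D_{LR}(\omega) \\ D_{RL}(\omega) & D_{RR}(\omega) \end{bmatrix}
\begin{bmatrix} u_L \\ u_R \end{bmatrix},
\qquad u_R = e^{-i\mu}u_L,\quad f_R = -e^{-i\mu}f_L
\]
\[
\Rightarrow\ \cos\mu = -\,\frac{D_{LL}(\omega)+D_{RR}(\omega)}{2\,D_{LR}(\omega)},
\qquad
\lvert\cos\mu\rvert \le 1 \ \text{(propagation)},\quad
\lvert\cos\mu\rvert > 1 \ \text{(attenuation)}.
\]
```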
Procedia PDF Downloads 126
988 The Impact of Quality Management System Establishment over the Performance of Public Administration Services in Kosovo
Authors: Ilir Rexhepi, Naim Ismajli
Abstract:
Quality and quality management are key success factors nowadays. Quality management in the public sector involves many challenges and difficulties, most notably in a new country like Kosovo. This study analyses the process of implementing a quality management system (QMS) in public administration institutions in this country. The main objective is to show how to set up a quality management system and how its establishment affects the overall public administration services in Kosovo. The study shows how the efficiency and effectiveness of public institution services and performance improve rapidly through the establishment and functionalization of a QMS. The specific impact of the established QMS within the organization has resulted in the identification of mission-related processes within the entire system, including input identification, the person in charge, and the way each activity's output is produced through interaction with other service processes within the system. By giving a detailed analysis of all steps of implementing the QMS and of its effects and consequences for overall public institution service performance, we go one step further, presenting it as a good example and tool for other public institutions seeking to improve their service performance. Interviews with employees and middle and high-level managers, including the quality manager and general secretaries, are also part of the analyses in this paper.
Keywords: quality, quality management system, efficiency, public administration institutions
Procedia PDF Downloads 282
987 Processing of Input Material as a Way to Improve the Efficiency of the Glass Production Process
Authors: Joanna Rybicka-Łada, Magda Kosmal, Anna Kuśnierz
Abstract:
One of the main problems of the glass industry is the still-high consumption of energy needed to produce glass mass, together with rising prices of fuels and raw materials. Therefore, comprehensive actions are being taken to improve the entire production process. The key element of these activities, from filling the set to receiving the finished product, is the melting process, whose tasks include dissolving the components of the set, removing bubbles from the resulting melt, and obtaining a chemically homogeneous glass melt. This process consumes over 90% of the total energy needed in production. The processes occurring in the set during its conversion have a significant impact on the further stages and speed of the melting process and, thus, on its overall effectiveness. The speed and course of the reactions depend on the chemical nature of the raw materials, their degree of fragmentation, their thermal treatment, and the form in which the set is introduced. An opportunity to minimize segregation and accelerate the conversion of glass sets may lie in the development of new technologies for preparing and dosing sets. The traditionally preferred method of melting the set, based on mixing all glass raw materials together in loose form, can be replaced with a set in a thickened form; this solution avoids dust formation during filling and is already available on the market. The aim of the project was to develop a glass set in a selectively or completely densified form and to examine the influence of set processing on the melting process and the properties of the glass.
Keywords: glass, melting process, glass set, raw materials
Procedia PDF Downloads 60
986 Aptian Ramp Sedimentation of the Jebel Serdj Massif, North-Central Tunisia, and Sea Level Variations Recorded in Magnetic Susceptibility
Authors: Houda Khaled, Fredj Chaabani, Frederic Boulvain
Abstract:
The Aptian series in north-central Tunisia was studied in detail with regard to lithology, microfacies, and magnetic susceptibility to provide new insights into the palaeoenvironmental evolution and sea-level changes of the carbonate platform. The studied series is about 350 meters thick and consists of five sequences of limestones separated by four levels of marlstones and marly limestones. The petrographic study leads to the definition of 11 microfacies, successively recorded along the Serdj section in outer-ramp, mid-ramp, inner-ramp, and coastal facies associations. The magnetic susceptibility of all samples was measured and compared with the facies and microfacies. There is a clear link between facies and magnetic susceptibility: distal facies show high values, while proximal areas show lower values. The magnetic susceptibility profile reflects stratigraphic variations in response to relative changes in sea level and input of detrital materials. During the Aptian, kaolinite/illite intensity ratios show high values, possibly indicating a warming trend, followed by decreasing values that may indicate a cooling trend. During the Albian, this cooling trend reverses towards humid/warming conditions.
Keywords: Aptian, mineralogy, petrology, Serdj massif
Procedia PDF Downloads 359
985 Evaluation of Mechanical Properties of Welds Fabricated at a Close Proximity on Offshore Structures
Authors: T. Nakkeran, C. Dhamodharan, Win Myint Soe , Ramasamy Deverajan, M. Ganesh Babu
Abstract:
This manuscript presents the results of an experimental investigation performed to study the material and mechanical properties of two weld joints fabricated in close proximity. The experiment used welded S355 D Z35 with a distance of 8 mm between the two parallel adjacent weld toes, less than the distance normally recommended in standards, codes, and specifications. The main idea of the analysis is to determine any significant effects of welding the joints at this 8 mm proximity, with one joint welded by the SAW process at high heat input and one joint welded by the FCAW process, and then evaluating the welded joints by destructive and non-destructive testing. Further, the joints were evaluated by mechanical testing: tensile tests, bend tests, macrostructure and microstructure examinations, hardness tests, and impact tests. The final results showed no significant changes from welding at the close proximity of 8 mm between the joints, compared with the specified minimum distance of 50 mm between weldments in any design.
Keywords: S355 carbon steel, weld proximity, SAW process, FCAW process, heat input, bend test, tensile test, hardness test, impact test, macro and microscopic examinations
Procedia PDF Downloads 98
984 JaCoText: A Pretrained Model for Java Code-Text Generation
Authors: Jessica Lopez Espejel, Mahaman Sanoussi Yahaya Alassan, Walid Dahhane, El Hassane Ettifouri
Abstract:
Pretrained transformer-based models have shown high performance in natural language generation tasks. However, a new wave of interest has surged in automatic programming-language code generation, the task of translating natural language instructions into source code. Although well-known pretrained language generation models have achieved good performance in learning programming languages, effort is still needed in automatic code generation. In this paper, we introduce JaCoText, a model based on the Transformer neural network that aims to generate Java source code from natural language text. JaCoText leverages the advantages of both natural language and code generation models. More specifically, we study findings from the state of the art and use them to (1) initialize our model from powerful pretrained models, (2) explore additional pretraining on our Java dataset, (3) conduct experiments combining unimodal and bimodal data in training, and (4) scale the input and output length during fine-tuning of the model. Experiments conducted on the CONCODE dataset show that JaCoText achieves new state-of-the-art results.
Keywords: Java code generation, natural language processing, sequence-to-sequence models, transformer neural networks
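A minimal sketch of the initialization-and-generation setup the abstract describes, using the Hugging Face transformers API; the CodeT5 checkpoint and the example prompt are stand-ins, since JaCoText itself is not assumed to be publicly released.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# initialize from a strong pretrained seq2seq code model, as in step (1)
tok = AutoTokenizer.from_pretrained("Salesforce/codet5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("Salesforce/codet5-base")

nl = "return the maximum of two integers"            # CONCODE-style instruction
inputs = tok(nl, return_tensors="pt", truncation=True, max_length=256)
out = model.generate(**inputs, max_length=128)       # scaled output length, step (4)
print(tok.decode(out[0], skip_special_tokens=True))  # generated Java snippet
```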
Procedia PDF Downloads 285
983 The Use of Continuous Improvement Methods to Empower the OSH MS With Leading Key Performance Indicators
Authors: Maha Rashid Al-Azib, Almuzn Qasem Alqathradi, Amal Munir Alshahrani, Bilqis Mohammed Assiri, Ali Almuflih
Abstract:
The occupational safety and health management system (OSH MS) in one of the largest Saudi companies has, over the last 10 years, incurred extensive direct and indirect expenses due to a lack of proactive leading indicators and effective safety leadership procedures. Since no studies had addressed this safety department of the company, this research was conducted. We used a mixed-method approach combining a literature review and expert input, followed by a qualitative questionnaire provided by the Institute for Work and Health to determine the company's OSH MS level among three levels (compliance, improvement, continuous learning); the company was found to be at the continuous learning level. The Deming cycle was then employed to create a set of proactive leading indicators, which were analysed using the SMART method to ensure their effectiveness and suitability for the company. The objective of this research is to provide a set of proactive indicators that contribute to an efficient occupational safety and health management system with fewer accidents and, consequently, lower expenses. We therefore provided the company with a prototype of an app, designed around our final results, to support decision-making processes.
Keywords: proactive leading indicators, OSH MS, safety leadership, accidents reduction
Procedia PDF Downloads 80
982 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines
Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.
Abstract:
Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, number of strokes, and variations in stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, especially in Kerala and the Lakshadweep islands. In this paper, we consider significant feature extraction for the similar stroke styles of Malayalam. The extracted feature set is also suitable for recognizing other handwritten South Indian languages such as Tamil, Telugu, and Kannada. A classification scheme based on support vector machines (SVM), which are well suited to real-world applications, is proposed to improve the accuracy of classification and recognition of online Malayalam handwritten characters. The contribution of various features to recognition accuracy is analysed, and performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles were collected for each of the 44 alphabets, and various features were extracted and used for classification after preprocessing of the input data samples. The highest recognition accuracy of 97% was obtained experimentally with the best feature combination and a polynomial kernel in SVM.
Keywords: SVM, MATLAB, Malayalam, South Indian scripts, online handwritten character recognition
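The classification stage can be sketched as follows with scikit-learn (the study itself reports MATLAB); the synthetic feature matrix and the polynomial degree are assumptions, with only the kernel choice and the 44 classes taken from the abstract.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((1000, 40))                 # stand-in for extracted stroke features
y = rng.integers(0, 44, size=1000)         # one label per Malayalam character class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="poly", degree=3, C=1.0)  # polynomial kernel; degree assumed
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))  # ~97% reported on real features
```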
Procedia PDF Downloads 574
981 Reed: An Approach Towards Quickly Bootstrapping Multilingual Acoustic Models
Authors: Bipasha Sen, Aditya Agarwal
Abstract:
A multilingual automatic speech recognition (ASR) system is a single entity capable of transcribing multiple languages sharing a common phone space. The performance of such a system is highly dependent on the compatibility of the languages. State-of-the-art speech recognition systems are built using sequential architectures based on recurrent neural networks (RNNs), limiting computational parallelization in training. This poses a significant challenge in terms of the time taken to bootstrap and validate the compatibility of multiple languages for building a robust multilingual system. Complex architectural choices based on self-attention networks have been made to improve parallelization, thereby reducing training time. In this work, we propose Reed, a simple system based on 1D convolutions which uses very short context to improve training time. To improve the performance of our system, we use raw time-domain speech signals directly as input, enabling the convolutional layers to learn feature representations rather than relying on handcrafted features such as MFCCs. We report improvements in training and inference times by at least factors of 4x and 7.4x, respectively, with comparable WERs against standard RNN-based baseline systems on SpeechOcean's multilingual low-resource dataset.
Keywords: convolutional neural networks, language compatibility, low resource languages, multilingual automatic speech recognition
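A minimal PyTorch sketch of a short-context 1D-convolutional acoustic model of the kind described, operating on raw waveform samples; the layer sizes, strides, and phone-set size are assumptions, not Reed's actual architecture.

```python
import torch
import torch.nn as nn

class ShortContextConv1D(nn.Module):
    def __init__(self, n_phones):
        super().__init__()
        self.net = nn.Sequential(                     # learns features from raw audio
            nn.Conv1d(1, 64, kernel_size=9, stride=2), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv1d(128, 256, kernel_size=3, stride=1), nn.ReLU(),
        )
        self.head = nn.Conv1d(256, n_phones, kernel_size=1)  # per-frame phone logits

    def forward(self, wav):                 # wav: (batch, 1, samples)
        return self.head(self.net(wav))     # (batch, n_phones, frames)

logits = ShortContextConv1D(n_phones=100)(torch.randn(2, 1, 16000))  # 1 s at 16 kHz
print(logits.shape)
```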
Procedia PDF Downloads 123
980 The Effect of Law on Politics
Authors: Boukrida Rafiq
Abstract:
Democracy is based on the notion that all citizens have the right to participate in the management of political affairs and that every citizen's input is of equal importance. This basic assumption clearly places emphasis on public participation in maintaining a stable democracy. The appropriate level of public participation, however, is highly contested, with many theorists arguing that too much public participation would overwhelm and ultimately cripple democratic systems. On the other hand, those who favor high levels of participation argue that more citizen involvement leads to greater representation. Regardless of these disagreements over the ideal level of participation, there is widespread agreement among scholars that, at the very least, some participation is necessary to maintain democratic systems. The ways in which citizens participate vary greatly and, depending on the method used, influence political decision-making at varying levels. The method of political participation is key to controlling public influence over political affairs and is therefore also an integral part of maintaining democracy, whether "thin" (low levels of participation) or "robust" (high levels of participation). High levels of participation, or "robust" democracy, are argued by some theorists to enhance democracy by providing the opportunity for more issues to be represented during decision-making. The notion of widespread participation was first advanced by classical theorists.
Keywords: assumption clearly places emphasis, ultimately cripple, influence political decision making at varying, classical theorists
Procedia PDF Downloads 460
979 Qualitative and Quantitative Traits of Processed Farmed Fish in N. W. Greece
Authors: Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou
Abstract:
The filleting yield and chemical composition of farmed sea bass (Dicentrarchus labrax), rainbow trout (Oncorhynchus mykiss), and meagre (Argyrosomus regius) were investigated in farmed fish in NW Greece. The results provide an estimate of the quantity of fish required to produce one kilogram of fillet, an estimate required for the operational management of fish processing companies. Furthermore, the ratio of feed input required to produce one kilogram of fish fillet (FFCR) is presented in this work for the first time as a useful indicator of the ecological footprint of consuming farmed fish. The lowest lipid content appeared in meagre (1.7%) and the highest in trout (4.91%). The lowest fillet yield and fillet-yield feed conversion ratio were found in meagre (FY = 42.17%, FFCR = 2.48), while the best fillet yield (FY = 53.8%) and FYFCR (2.10) were exhibited by farmed rainbow trout. This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF), Research Funding Program: ARCHIMEDES III, investing in the knowledge society through the European Social Fund.
Keywords: farmed fish, flesh quality, filleting yield, lipid
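The fillet-based feed conversion ratio reported above follows from simple arithmetic: feed per kilogram of fillet equals the whole-fish feed conversion ratio divided by the fillet yield. A worked example using the trout figures from the abstract (the whole-fish FCR is back-calculated for illustration, not reported in the paper):

```python
def fillet_fcr(whole_fish_fcr, fillet_yield):
    """Feed (kg) needed per kg of fillet = whole-fish FCR / fillet yield."""
    return whole_fish_fcr / fillet_yield

# Rainbow trout figures from the abstract: FY = 53.8 %, fillet-based FCR = 2.10,
# implying a whole-fish FCR of about 2.10 * 0.538 = 1.13 kg feed / kg fish.
print(fillet_fcr(1.13, 0.538))   # -> ~2.10 kg feed per kg fillet
```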
Procedia PDF Downloads 309
978 Meteorological Risk Assessment for Ships with Fuzzy Logic Designer
Authors: Ismail Karaca, Ridvan Saracoglu, Omer Soner
Abstract:
Fuzzy logic, an advanced method to support decision-making, is used by scientists in many disciplines. Fuzzy programming is a product of fuzzy logic, fuzzy rules, and implication. In marine science, fuzzy programming for ships is increasing dramatically together with autonomous-ship studies. In this paper, a program to support the decision-making process for ship navigation has been designed. The program is produced with fuzzy logic and rules, taking marine accidents and expert opinions into account. After the program was designed, it was tested against 46 ship accidents reported by the Transportation Safety Investigation Center of Turkey. Wind speed, sea condition, visibility, and day/night ratio were used as input data; they were converted into a risk factor within the Fuzzy Logic Designer application using fuzzy rules set by marine experts. Finally, the experts' meteorological risk factor for each accident was compared with the program's risk factor, and the error rate was calculated. The main objective of this study is to improve the navigational safety of ships by using an advanced decision support model. According to the results, fuzzy programming is a robust model that supports safe navigation.
Keywords: calculation of risk factor, fuzzy logic, fuzzy programming for ship, safety navigation of ships
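A minimal sketch of such a fuzzy risk-factor computation using scikit-fuzzy; only two of the four inputs are shown, and the membership ranges and rules are illustrative assumptions, not the experts' actual rule base.

```python
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

wind = ctrl.Antecedent(np.arange(0, 61, 1), "wind_kn")
vis = ctrl.Antecedent(np.arange(0, 11, 1), "visibility_nm")
risk = ctrl.Consequent(np.arange(0, 11, 1), "risk")

wind["calm"] = fuzz.trimf(wind.universe, [0, 0, 25])
wind["strong"] = fuzz.trimf(wind.universe, [15, 60, 60])
vis["poor"] = fuzz.trimf(vis.universe, [0, 0, 4])
vis["good"] = fuzz.trimf(vis.universe, [2, 10, 10])
risk["low"] = fuzz.trimf(risk.universe, [0, 0, 5])
risk["high"] = fuzz.trimf(risk.universe, [4, 10, 10])

rules = [
    ctrl.Rule(wind["strong"] | vis["poor"], risk["high"]),
    ctrl.Rule(wind["calm"] & vis["good"], risk["low"]),
]
sim = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
sim.input["wind_kn"] = 35.0        # e.g. one accident's reported conditions
sim.input["visibility_nm"] = 2.0
sim.compute()
print("meteorological risk factor:", sim.output["risk"])
```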
Procedia PDF Downloads 189
977 Error Analysis in Academic Writing of EFL Learners: A Case Study for Undergraduate Students at Pathein University
Authors: Aye Pa Pa Myo
Abstract:
Writing in English is regarded as a complex process for learners of English as a foreign language, and committing errors in writing is an inevitable part of language learners' work. Academic writing in particular is quite difficult for most students to manage well enough to obtain better scores. Error analysis deals with identifying and detecting errors and explaining the reasons for their occurrence. In this paper, the researcher attempts to examine the common errors of undergraduate students in their academic writing at Pathein University. The purpose of this research is to investigate the errors students usually commit in academic writing and to find better ways of correcting these errors in EFL classrooms. Fifty third-year non-English-specialization students attending Pathein University were selected as participants. The research took one month and was conducted with a mixed methodology; two mini-tests were used as research tools, and data were collected quantitatively. The findings indicate that most students noticed their common errors after receiving the necessary input and committed these errors less often after taking the mini-tests; hence, all findings will be supportive of further research related to error analysis in academic writing.
Keywords: academic writing, error analysis, EFL learners, mini-tests, mixed methodology
Procedia PDF Downloads 132
976 Comparison of Automated Zone Design Census Output Areas with Existing Output Areas in South Africa
Authors: T. Mokhele, O. Mutanga, F. Ahmed
Abstract:
South Africa is one of the few countries that have stopped using the same enumeration areas (EAs) for census enumeration and dissemination. The advantage of this change is that confidentiality issues can be addressed for census dissemination, since the geographic unit for collection is designed mainly to ensure that it can be covered by one enumerator. The objective of this paper was to evaluate the performance of automated zone-design output areas against geographies developed without zone design, using the 2001 census data, and to some extent the 2011 census, as the main input. The Automated Zone-design Tool (AZTool) census output areas were compared with the Small Area Layers (SALs) and SubPlaces in terms of confidentiality limits, population distribution, degree of homogeneity, and shape compactness. Further, SPSS was employed for validation of the AZTool output results. The results showed that the AZTool-developed output areas outperform the existing official SALs and SubPlaces with regard to minimum population threshold and population distribution, and to some extent homogeneity. It was therefore concluded that the AZTool program provides a new alternative for the creation of optimised census output areas for the dissemination of population census data in South Africa.
Keywords: AZTool, enumeration areas, small areal layers, South Africa
Procedia PDF Downloads 184
975 Optoelectronic Hardware Architecture for Recurrent Learning Algorithm in Image Processing
Authors: Abdullah Bal, Sevdenur Bal
Abstract:
This paper proposes a new type of hardware application for training cellular neural networks (CNN) using an optical joint transform correlation (JTC) architecture for image feature extraction. CNNs require much more computation during the training stage than during testing. Since optoelectronic hardware offers parallel, high-speed processing capability for 2D data, the CNN training algorithm can be realized using Fourier optics techniques. JTC employs lenses and CCD cameras with a laser beam to realize 2D matrix multiplication and summation at the speed of light. Therefore, in each training iteration, the JTC inherently carries most of the computational burden, and the rest of the mathematical computation is realized digitally. Bipolar data are encoded by phase, and the summation of correlation operations is realized using multi-object input joint images. The overlapping properties of JTC are then utilized for the summation of two cross-correlations, which reduces the computation required for the training stage. Phase-only JTC does not require data rearrangement, electronic pre-calculation, or strict system alignment. The proposed system can be incorporated simultaneously with various optical image processing or optical pattern recognition techniques in the same optical system.
Keywords: CNN training, image processing, joint transform correlation, optoelectronic hardware
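The JTC step is realized optically in the paper (lens, laser, CCD), but its two-transform structure can be simulated digitally; a numpy sketch follows, with image sizes and the side-by-side joint-image layout assumed.

```python
import numpy as np

def jtc_correlate(reference, target):
    h, w = reference.shape
    joint = np.zeros((h, 2 * w + 16))        # reference and target side by side
    joint[:, :w] = reference
    joint[:, -w:] = target
    jps = np.abs(np.fft.fft2(joint)) ** 2    # joint power spectrum (CCD intensity)
    corr = np.abs(np.fft.fft2(jps))          # second transform -> correlation plane
    return np.fft.fftshift(corr)             # cross-correlation peaks sit off-centre

rng = np.random.default_rng(0)
ref = rng.random((64, 64))
out = jtc_correlate(ref, ref)                # autocorrelation: strong twin peaks
print(out.shape, out.max())
```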
Procedia PDF Downloads 506
974 Prediction of Temperature Distribution during Drilling Process Using Artificial Neural Network
Authors: Ali Reza Tahavvor, Saeed Hosseini, Nazli Jowkar, Afshin Karimzadeh Fard
Abstract:
Experimental & numeral study of temperature distribution during milling process, is important in milling quality and tools life aspects. In the present study the milling cross-section temperature is determined by using Artificial Neural Networks (ANN) according to the temperature of certain points of the work piece and the points specifications and the milling rotational speed of the blade. In the present work, at first three-dimensional model of the work piece is provided and then by using the Computational Heat Transfer (CHT) simulations, temperature in different nods of the work piece are specified in steady-state conditions. Results obtained from CHT are used for training and testing the ANN approach. Using reverse engineering and setting the desired x, y, z and the milling rotational speed of the blade as input data to the network, the milling surface temperature determined by neural network is presented as output data. The desired points temperature for different milling blade rotational speed are obtained experimentally and by extrapolation method for the milling surface temperature is obtained and a comparison is performed among the soft programming ANN, CHT results and experimental data and it is observed that ANN soft programming code can be used more efficiently to determine the temperature in a milling process.Keywords: artificial neural networks, milling process, rotational speed, temperature
Procedia PDF Downloads 405
973 Public Environmental Investment Analysis of Japan
Authors: K. Y. Chen, H. Chua, C. W. Kan
Abstract:
Japan is a well-developed country, but environmental issues remain pressing. In this study, we analyse how environmental investment affects sustainable development in Japan. The paper first describes Japan's environmental policy and the effort invested by the Japanese government. We then collect yearly environmental data together with information about environmental investment. Based on the data collected, we try to establish the relationship between environmental investment and sustainable development in Japan, and we analyse the strengths, weaknesses, opportunities, and threats (SWOT) of environmental investment in Japan. According to the economic information collected, Japan established a sound material-cycle society through changes in business and lifestyles, supported by a comprehensive legal system. Other supporting measures, such as financial measures, the utilization of economic instruments, the implementation of research, and the promotion of education, science, and technology, help Japan cope with recent environmental challenges. Japan's excellent environmental technologies, at the highest global standards, have changed its socioeconomic system. This is reflected in the number of patents registered in Japan, which has grown steadily. Country-by-country comparison of applications for patents on environmental technologies also indicates that Japan ranks high in areas such as atmospheric pollution and water quality management, solid waste management, and renewable energy, a result of large expenditure on research and development.
Keywords: Japan, environmental investment, sustainable development, analysis
Procedia PDF Downloads 268
972 A Stochastic Model to Predict Earthquake Ground Motion Duration Recorded in Soft Soils Based on Nonlinear Regression
Authors: Issam Aouari, Abdelmalek Abdelhamid
Abstract:
For seismologists, the characterization of seismic demand should include both the amplitude and the duration of strong shaking. The duration of ground shaking is one of the key parameters in the earthquake-resistant design of structures. This paper proposes a nonlinear statistical model to estimate earthquake ground motion duration in soft soils using multiple seismicity indicators. Three definitions of ground motion duration proposed in the literature were applied, and a comparative study was used to select the most significant definition for predicting the duration. A stochastic model is presented for the McCann and Shah method using nonlinear regression analysis based on a data set of moment magnitude, source-to-site distance, and site conditions. The data set is taken from the PEER strong-motion databank and contains shallow earthquakes from different regions of the world (America, Turkey, London, China, Italy, Chile, Mexico, etc.). The main emphasis is placed on soft-site conditions. The predictive relationship was developed based on 600 records and three input indicators, and the results were compared with other published models. It was found that the proposed model can predict earthquake ground motion duration in soft soils for different regions and site conditions.
Keywords: duration, earthquake, prediction, regression, soft soil
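A sketch of the nonlinear regression step with scipy, assuming a common functional form ln D = c1 + c2·M + c3·ln(R + 10) + c4·S; the toy records and fitted coefficients are illustrative, not the authors' McCann-Shah-based model or the PEER data.

```python
import numpy as np
from scipy.optimize import curve_fit

def duration_model(X, c1, c2, c3, c4):
    M, R, S = X                        # magnitude, distance (km), soft-site flag
    return np.exp(c1 + c2 * M + c3 * np.log(R + 10.0) + c4 * S)

M = np.array([5.5, 6.0, 6.5, 7.0, 7.4])       # toy records, not the PEER data
R = np.array([12.0, 30.0, 8.0, 55.0, 20.0])
S = np.ones_like(M)                            # 1 = soft soil
D = np.array([9.0, 14.0, 15.0, 27.0, 32.0])    # observed durations (s)

coef, _ = curve_fit(duration_model, (M, R, S), D, p0=[0.0, 0.5, 0.3, 0.1])
print("fitted coefficients:", coef)
print("predicted duration:", duration_model((6.8, 25.0, 1.0), *coef))
```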
Procedia PDF Downloads 153
971 Numerical Modelling and Soil-structure Interaction Analysis of Rigid Ballast-less and Flexible Ballast-based High-speed Rail Track-embankments Using Software
Authors: Tokirhusen Iqbalbhai Shaikh, M. V. Shah
Abstract:
With an increase in travel demand and a reduction in travel time, high-speed rail (HSR) has been introduced in India. Simplified 3D finite element modelling is necessary to predict the stability and deformation characteristics of railway embankments and soil-structure interaction behaviour under high-speed design requirements for Indian soil conditions. The objective of this study is to analyse rigid ballast-less and flexible ballast-based high-speed rail track embankments for the various critical conditions to which they are subjected, viz. the static condition, the moving-train condition, sudden brake application, and derailment, using software. The input parameters for the analysis are soil type, thickness of the relevant strata, unit weight, Young's modulus, Poisson's ratio, undrained cohesion, friction angle, dilatancy angle, modulus of subgrade reaction, design speed, and other relevant anticipated data. Eurocode 1, IRS-004(D), IS 1343, IRS specifications, the California high-speed rail technical specifications, and the NHSRCL feasibility report are followed in this study.
Keywords: soil structure interaction, high speed rail, numerical modelling, PLAXIS3D
Procedia PDF Downloads 110
970 Long Waves Inundating through and around an Array of Circular Cylinders
Authors: Christian Klettner, Ian Eames, Tristan Robinson
Abstract:
Tsunamis are characterised by their very long time periods and can have devastating consequences when they inundate built-up coastal regions, as in the 2004 Indian Ocean and 2011 Tohoku tsunamis. This work aims to investigate the effect of these long waves on the flow through and around a group of buildings, abstracted here as circular cylinders. The research approach combines experiments and numerical simulations. Large-scale experiments were carried out at HR Wallingford. The novelty of these experiments lies in (I) the number of bodies present (up to 64), (II) the long period of the input waves (80 seconds), and (III) the width of the tank (4 m), which gives the unique opportunity to investigate three length scales, namely the diameter of a building, the diameter of the array, and the width of the tank. To complement the experiments, dam-break flow past the same arrays is investigated using three-dimensional numerical simulations in OpenFOAM. Dam-break flow was chosen because it is often used as a surrogate for a tsunami in previous research, and because it has well-defined initial conditions and high-quality previous experimental data for the case of a single cylinder. The focus of this work is to better understand the effect of the solid void fraction on the force on, and the flow through and around, the array. New qualitative and quantitative diagnostics are developed and tested to analyse the complex coupled interaction between the cylinders.
Keywords: computational fluid dynamics, tsunami, forces, complex geometry
Procedia PDF Downloads 195
969 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors, selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. With an F1-score of 83.60% compared to 24.16%, the proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error.
Keywords: anomaly detection, autoencoder, data centers, deep learning
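A sketch of one per-sensor branch of the described model: an LSTM autoencoder trained on normal windows, with reconstruction-error features passed to a random forest. Window length, layer sizes, feature choice, and the synthetic data are assumptions.

```python
import numpy as np
from tensorflow import keras
from sklearn.ensemble import RandomForestClassifier

T = 64                                              # window length per sensor
ae = keras.Sequential([
    keras.layers.Input(shape=(T, 1)),
    keras.layers.LSTM(32),                          # encoder
    keras.layers.RepeatVector(T),
    keras.layers.LSTM(32, return_sequences=True),   # decoder
    keras.layers.TimeDistributed(keras.layers.Dense(1)),
])
ae.compile(optimizer="adam", loss="mse")

rng = np.random.default_rng(0)
normal = rng.random((256, T, 1))                    # stand-in for normal samples
ae.fit(normal, normal, epochs=3, verbose=0)         # learn to reconstruct normal data

def error_features(windows):
    err = np.abs(windows - ae.predict(windows, verbose=0))   # difference signal
    return np.c_[err.mean(axis=(1, 2)), err.max(axis=(1, 2))]

anomalous = normal + rng.normal(0.0, 0.5, normal.shape)      # toy anomalies
X = np.vstack([error_features(normal), error_features(anomalous)])
y = np.r_[np.zeros(len(normal)), np.ones(len(anomalous))]
clf = RandomForestClassifier(random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```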
Procedia PDF Downloads 194
968 Bio-Electro Chemical Catalysis: Redox Interactions, Storm and Waste Water Treatment
Authors: Michael Radwan Omary
Abstract:
Context: This scientific innovation demonstrates the effective desalination of surface and groundwater using engineered organic-catalysis media. The author has developed a technology called "Storm-Water Ions Filtration Treatment" (SWIFT™): cold-reactor modules designed to retrofit typical urban street storm drains or catch basins. SWIFT™ triggers biochemical redox reactions with stream-embedded toxic total dissolved solids (TDS) and electrical conductivity (EC). The SWIFT™ catalyst media unlock sub-molecular bond energy, break down toxic chemical bonds, and neutralize toxic molecules, bacteria, and pathogens. Research Aim: This research aims to develop and design water desalination and disinfection systems with lower O&M costs, zero brine discharge, no energy input, and no chemicals, providing an effective, resilient, and sustainable solution to urban storm-water and groundwater decontamination and disinfection. Methodology: We focused on the development of organic, non-chemical, plug-free, pump-free, non-polymer, and non-allergenic approaches for water and wastewater desalination and disinfection. SWIFT™ modules operate by directing the water stream to flow freely through the electrically charged media cold reactor, generating weak interactions with electrically conductive dissolved molecules and thereby neutralizing toxic molecules. The system is powered by harvesting energy embedded in sub-molecular bonds. Findings: Case studies of the SWIFT™ technology at CSU-CI and the CSU-Fresno Water Institute demonstrated consistently high reduction of all 40 detected wastewater pollutants, including pathogens, to levels below the State of California Department of Water Resources drinking-water maximum contaminant levels. The technology has proved effective in reducing pollutants such as arsenic, beryllium, mercury, selenium, glyphosate, benzene, and E. coli bacteria, and has also been successfully applied to the decontamination of dissolved chemicals, waterborne pathogens, organic compounds, and radiological agents. Theoretical Importance: The development, design, engineering, and manufacturing of the SWIFT™ technology offer a cutting-edge advancement: a clean-energy biocatalysis media solution for water and wastewater desalination and disinfection without energy input, and a significant contribution for institutions and municipalities seeking sustainable, lower-cost water desalination with zero brine and zero CO2 discharges. Data Collection and Analysis Procedures: The researchers collected data on the performance of the SWIFT™ technology in reducing the levels of various pollutants in water. The data were analysed by comparing the reduction achieved by the SWIFT™ technology with the drinking-water maximum contaminant levels set by the State of California. The researchers also gave live oral presentations showcasing the applications of the SWIFT™ technology in storm-water capture and decontamination, as well as in providing clean drinking water during emergencies. Conclusion: The SWIFT™ technology has demonstrated its capability to effectively reduce pollutants in water and wastewater to levels below regulatory standards, offering a sustainable solution to groundwater and storm-water treatment. Further development and implementation of the SWIFT™ technology have the potential to treat storm water for reuse as a new source of drinking water and as an ambient source of clean, healthy local water for groundwater recharge.
Keywords: catalysis, bio electro interactions, water desalination, weak-interactions
Procedia PDF Downloads 67
967 High Temperature Properties of Diffusion Brazed Joints of in 939 Ni-Base Superalloy
Authors: Hyunki Kang, Hi Won Jeong
Abstract:
Gas turbines operate for long periods under harsh, cyclic conditions of high temperature and pressure, where the turbine inlet temperature (TIT) can range from 1273 to 1873 K. Therefore, Ni-base superalloys such as IN738, IN939, Rene 45, Rene 71, Rene 80, Mar M 247, CM 247, and CMSX-4, with excellent mechanical properties and resistance to creep, corrosion, and oxidation at high temperatures, are used. Among the alloying additions in these alloys, aluminum (Al) and titanium (Ti) form the gamma prime phase and enhance the high-temperature properties. However, when crack-damaged high-temperature turbine components such as blades and vanes are repaired by fusion welding, cracks occur. For example, when arc welding is applied to superalloys containing more than 3 wt.% Al and 3.5 wt.% Ti, respectively, such as IN738, IN939, Rene 80, Mar M 247, and CM 247, aging cracks occur. Therefore, repair technologies using diffusion brazing, which introduces less heat into the base material, are being developed. The microstructural evolution of brazed joints with an IN 939 Ni-base superalloy base metal, brazed using different filler metals, was analysed using X-ray diffraction, OEM, SEM-EDS, and EPMA. Stress rupture and high-temperature tensile strength were also measured to analyse the effects of different brazing heat cycles. The boron content in the diffusion-affected zone (DAZ) decreased towards the base metal, and the formation of borides at grain boundaries was detected through EPMA.
Keywords: gas turbine, diffusion brazing, superalloy, gas turbine repair
Procedia PDF Downloads 41
966 A Test Methodology to Measure the Open-Loop Voltage Gain of an Operational Amplifier
Authors: Maninder Kaur Gill, Alpana Agarwal
Abstract:
It is practically infeasible to measure the open-loop voltage gain of an operational amplifier in the open-loop configuration, because the open-loop voltage gain is very large: to avoid saturating the output, a very small input must be applied, too small to be measured practically by a digital multimeter. A test circuit for the measurement of the open-loop voltage gain of an operational amplifier has been proposed and verified using simulation tools as well as experimentally on a breadboard. The main advantages of this test circuit are that it is simple, fast, accurate, cost-effective, and easy to handle even on a breadboard; it requires only the device under test (DUT) and resistors. The circuit has been tested for the measurement of open-loop voltage gain of different operational amplifiers. The underlying goal is to design testable circuits for various analog devices that are simple to realize in VLSI systems, give accurate results, and do not change the characteristics of the original system. The DUTs used are the LM741CN and UA741CP. For the LM741CN, the simulated gain and the experimentally measured gain (average) are 89.71 dB and 87.71 dB, respectively. For the UA741CP, the simulated gain and the experimentally measured gain (average) are 101.15 dB and 105.15 dB, respectively. These values are close to the datasheet values.
Keywords: Device Under Test (DUT), open loop voltage gain, operational amplifier, test circuit
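The arithmetic behind such a test circuit can be illustrated as follows: the tiny differential input is inferred from a resistive attenuator (ratio assumed 1000:1 here, not taken from the paper) rather than read directly, and the gain follows in dB. The example numbers reproduce the LM741CN figure from the abstract.

```python
from math import log10

def open_loop_gain_db(v_out, v_divider, attenuation=1000.0):
    v_id = v_divider / attenuation          # inferred differential input voltage
    return 20.0 * log10(abs(v_out / v_id))  # open-loop gain in dB

# e.g. 5 V at the output while the divider node reads 0.163 V:
# v_id = 163 uV, so A_OL = 20*log10(5 / 163e-6) = ~89.7 dB (the LM741CN figure)
print(open_loop_gain_db(5.0, 0.163))
```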
Procedia PDF Downloads 447
965 Automation of AAA Game Development Using AI
Authors: Branden Heng, Harsheni Siddharthan, Allison Tseng, Paul Toprac, Sarah Abraham, Etienne Vouga
Abstract:
The goal of this project was to evaluate and document the capabilities and limitations of AI tools for empowering small teams to create high-budget, high-profile (AAA) 3D games of the kind typically developed by large studios. Two teams of novice game developers attempted to create two different games using AI and Unreal Engine 5.3. First, the teams evaluated 60 AI art, design, sound, and programming tools, considering their capability, ease of use, cost, and license restrictions. The teams then used a shortlist of 12 AI tools for game development. During this process, the following tools were found to be the most productive: (i) ChatGPT 4.0 for both game and narrative concepts and documentation; (ii) Dall-E 3 and OpenArt for concept art; (iii) Beatoven for music drafting; and (iv) ChatGPT 4.0 and GitHub Copilot for generating simple code and complementing human-made tutorials as an additional learning resource. While current generative AI may appear impressive at first glance, the assets it produces fall short of AAA industry standards. Generative AI tools are helpful when brainstorming ideas such as concept art and basic storylines, but they cannot yet replace human input or creativity. Regarding programming, AI can only effectively generate simple code and act as an additional learning resource. Thus, generative AI tools are, at best, tools to enhance developer productivity rather than a system to replace developers.
Keywords: AAA games, AI, automation tools, game development
Procedia PDF Downloads 26
964 Seismic Response Control of 20-Storey Benchmark Building Using True Negative Stiffness Device
Authors: Asim Qureshi, R. S. Jangid
Abstract:
Seismic response control of structures is generally achieved by using control devices that either dissipate the input energy or modify the dynamic properties of the structure. In this paper, the response of a 20-storey benchmark building supplemented by viscous dampers and a Negative Stiffness Device (NSD) is assessed by numerical simulations using the Newmark-beta method. True negative stiffness is an adaptive passive device that assists the motion, unlike positive stiffness. The structure used in this study is subjected to four standard ground motions ranging from moderate to severe and from near-fault to far-field earthquakes. The objective of the present study is to show the effectiveness of the adaptive negative stiffness device (NSD and passive dampers together) relative to passive dampers alone. This is done by comparing the responses of the uncontrolled structure (i.e., without any device) with those of the structure equipped with passive dampers only and of the structure supplemented with the adaptive negative stiffness device. Various performance indices, top-floor displacement, top-floor acceleration, and inter-storey drifts are used as comparison parameters. It is found that the NSD together with passive dampers is quite effective in reducing the response of the structure relative to the structure without any device or with passive dampers only. Base shear and acceleration are reduced significantly by incorporating the NSD, at the cost of increased inter-storey drifts, which can be compensated using the passive dampers.
Keywords: adaptive negative stiffness device, apparent yielding, NSD, passive dampers
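For reference, a minimal Python sketch of Newmark-beta time integration (average-acceleration form, beta = 1/4, gamma = 1/2) for a single-degree-of-freedom system under base excitation; the 20-storey benchmark is an MDOF model with added devices, so this only illustrates the scheme, and all numbers are illustrative.

```python
import numpy as np

def newmark_sdof(m, c, k, ag, dt, beta=0.25, gamma=0.5):
    """Newmark-beta integration of m*u'' + c*u' + k*u = -m*ag(t)."""
    n = len(ag)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (-m * ag[0] - c * v[0] - k * u[0]) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)   # effective stiffness
    for i in range(n - 1):
        p = (-m * ag[i + 1]
             + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                    + (1.0 / (2.0 * beta) - 1.0) * a[i])
             + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1.0) * v[i]
                    + dt * (gamma / (2.0 * beta) - 1.0) * a[i]))
        u[i + 1] = p / keff
        v[i + 1] = (gamma * (u[i + 1] - u[i]) / (beta * dt)
                    + (1.0 - gamma / beta) * v[i]
                    + dt * (1.0 - gamma / (2.0 * beta)) * a[i])
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (1.0 / (2.0 * beta) - 1.0) * a[i])
    return u, v, a

# usage: one-storey analogue under a 5 Hz ground pulse sampled at 100 Hz
t = np.arange(0.0, 10.0, 0.01)
u, v, a = newmark_sdof(m=1e3, c=2e3, k=4e5,
                       ag=0.3 * 9.81 * np.sin(2 * np.pi * 5 * t), dt=0.01)
print("peak displacement:", np.abs(u).max())
```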
Procedia PDF Downloads 431
963 Wet Flue Gas Desulfurization Using a New O-Element Design Which Replaces the Venturi Scrubber
Authors: P. Lestinsky, D. Jecha, V. Brummer, P. Stehlik
Abstract:
Scrubbing by liquid spraying is one of the most effective processes used for the removal of fine particles and soluble gaseous pollutants (such as SO2, HCl, and HF) from flue gas. Many scrubber configurations are designed to provide contact between the liquid and the gas stream for effectively capturing particles or soluble gaseous pollutants, such as spray plates, packed-bed towers, jet scrubbers, cyclones, and vortex and venturi scrubbers. The primary function of a venturi scrubber is the capture of fine particles, as well as HCl, HF, or SO2 removal, with the effect of decreasing the flue gas temperature before input to the absorption column. In this paper, sulfur dioxide (SO2) was captured from flue gas using a new design replacing the venturi scrubber (first degree of wet scrubbing). The flue gas was prepared by combustion of a carbon disulfide solution in toluene (1:1 vol.) in a flame in the reactor. The flue gas thus prepared, at a temperature around 150 °C, was processed in the designed laboratory O-element scrubber, with water used as the absorbent liquid. The efficiency of SO2 removal, the pressure drop, and the temperature drop were measured on our experimental device, and the dependence of these variables on the liquid-gas ratio was observed. The average temperature drop was from 150 °C to about 40 °C. The pressure drop increased with increasing liquid-gas ratio, but not as much as in common venturi scrubber designs. The efficiency of SO2 removal was up to 70%. The pressure drop of our newly designed wet scrubber is similar to that of commonly used venturi scrubbers; nevertheless, the influence of the amount of liquid on the pressure drop is not as significant.
Keywords: desulphurization, absorption, flue gas, modeling
Procedia PDF Downloads 399