Search results for: methods of Lagrange multipliers.
3561 Improved IDR(s) Method for Gaining Very Accurate Solutions
Authors: Yusuke Onoue, Seiji Fujino, Norimasa Nakashima
Abstract:
The IDR(s) method based on an extended IDR theorem was proposed by Sonneveld and van Gijzen. The original IDR(s) method has excellent properties compared with conventional iterative methods in terms of efficiency and memory requirements. The IDR(s) method, however, has the unexpected property that the relative residual 2-norm stagnates at a level of less than 10⁻¹². In this paper, an effective strategy for stagnation detection, stagnation avoidance using adaptively tuned information on the parameter s, and improvement of the convergence rate of the IDR(s) method itself are proposed in order to obtain a highly accurate approximated solution from the IDR(s) method. Through numerical experiments, the effectiveness of the adaptively tuned IDR(s) method is verified and demonstrated.
Keywords: Krylov subspace methods, IDR(s), adaptive tuning, stagnation of relative residual.
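A minimal sketch of the kind of stagnation check described above, in Python: it flags an iteration at which the relative residual 2-norm has stopped decreasing over a trailing window while still sitting above a target tolerance. The window size, tolerance, and reduction ratio are illustrative assumptions, not the authors' actual criterion.

```python
import numpy as np

def detect_stagnation(res_history, window=20, tol=1e-12, ratio=0.99):
    """Flag stagnation of the relative residual 2-norm.

    res_history : sequence of relative residual 2-norms, one per iteration.
    Stagnation is declared when the last `window` iterations reduced the
    residual by less than (1 - ratio) while it is still above `tol`.
    Parameter values are illustrative, not the paper's settings.
    """
    if len(res_history) < window:
        return False
    recent = np.asarray(res_history[-window:])
    still_above_target = recent[-1] > tol
    no_progress = recent[-1] > ratio * recent[0]
    return bool(still_above_target and no_progress)
```

Such a flag could then trigger the kind of adaptive adjustment of the parameter s that the abstract mentions.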
3560 Impact of Enhanced Business Models on Technology Companies in the Pandemic: A Case Study about the Revolutionary Change in Management Styles
Authors: Murat Colak, Berkay Cakir Saridogan
Abstract:
Since the dawn of modern corporations, almost every employee has been working in the same loop, which contains three basic steps: going to work, providing what the work requires, and getting back home. Only a small number of people were able to break that standard and live outside the box. As the 2019 pandemic hit the Earth and most companies shut down their physical offices, that loop had to change for everyone. This means that the old management styles had to be significantly re-arranged around "work from home" business methods. These methods include online conferences and meetings, time and task tracking using algorithms, globalization of the work, and, most importantly, remote working. After the global epidemic started, even the tech giants were concerned. Now, it can be seen that those technology companies have had an incredible step-up in their shares compared to other companies because they know how to manage such situations better than every other industry. This study aims to take the old traditional management styles in big companies and compare them with the post-COVID methods (2019-2022). Based on this comparison of annual reports and shared statistics, the study aims to explain why the winners of this crisis are the technology companies.
Keywords: COVID-19, technology companies, business models, remote work.
3559 Seat Assignment Problem Optimization
Authors: Mohammed Salem Alzahrani
Abstract:
In this paper the optimality of the solution of an existing real-world assignment problem, known as the seat assignment problem, using the Seat Assignment Method (SAM) is discussed. SAM is a new method derived from three existing methods, the Hungarian Method, the Northwest Corner Method and the Least Cost Method, in a special way that combines the ease and fairness of all the methods that solve the seat assignment problem.
Keywords: Assignment problem, Hungarian Method, Least Cost Method, Northwest Corner Method, Seat Assignment Method (SAM), a real-world assignment problem.
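SAM itself is not detailed in the abstract, but the Hungarian Method it draws on is available off the shelf. The sketch below solves a tiny seat-assignment instance with SciPy's linear_sum_assignment; the cost matrix is an invented example, not data from the paper.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Rows = students, columns = seats; entries are illustrative preference
# costs (lower is better). Hungarian-style optimal assignment baseline.
cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
rows, cols = linear_sum_assignment(cost)
print(list(zip(rows, cols)), "total cost:", cost[rows, cols].sum())
```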
3558 Recursive Algorithms for Image Segmentation Based on a Discriminant Criterion
Authors: Bing-Fei Wu, Yen-Lin Chen, Chung-Cheng Chiu
Abstract:
In this study, a new criterion for determining the number of classes into which an image should be segmented is proposed. This criterion is based on discriminant analysis for measuring the separability among the segmented classes of pixels. Based on the new discriminant criterion, two algorithms for recursively segmenting the image into the determined number of classes are proposed. The proposed methods can automatically and correctly segment objects with various illuminations into separate images for further processing. Experiments on the extraction of text strings from complex document images demonstrate the effectiveness of the proposed methods.
Keywords: image segmentation, multilevel thresholding, clustering, discriminant analysis
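For context, the classic discriminant criterion for pixel-class separability is the between-class variance used in Otsu thresholding; the sketch below finds the single threshold that maximises it, and a recursive scheme could reapply the same criterion to each resulting class. This is a generic illustration in Python, not the authors' specific criterion.

```python
import numpy as np

def otsu_threshold(gray):
    """Return the threshold maximising between-class variance.
    `gray` is a 2-D uint8 image array."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability
    mu = np.cumsum(p * np.arange(256))      # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))
```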
3557 Quantum Modelling of AgHMoO4, CsHMoO4 and AgCsMoO4 Chemistry in the Field of Nuclear Power Plant Safety
Authors: Mohamad Saab, Sidi Souvi
Abstract:
In a major nuclear accident, the released fission products (FPs) and the structural materials are likely to influence the transport of iodine in the reactor coolant system (RCS) of a pressurized water reactor (PWR). So far, the thermodynamic data on cesium and silver species used to estimate the magnitude of FP release show some discrepancies; the data are scarce and not reliable. For this reason, it is crucial to review the thermodynamic values related to cesium and silver materials. To this end, we have used state-of-the-art quantum chemical methods to compute the formation enthalpies and entropies of AgHMoO₄, CsHMoO₄, and AgCsMoO₄ in the gas phase. Different quantum chemical methods have been investigated (DFT and CCSD(T)) in order to predict the geometrical parameters and the energetics, including the correlation energy. The geometries were optimized with the TPSSh-5%HF method, followed by a single-point calculation of the total electronic energies using the CCSD(T) wave function method. We thus propose standard enthalpies of formation of AgHMoO₄, CsHMoO₄, and AgCsMoO₄ with a final uncertainty of about 2 kJ·mol⁻¹.
Keywords: ASTEC, Accident Source Term Evaluation Code, quantum chemical methods, severe nuclear accident, thermochemical database.
3556 Recursive Filter for Coastal Displacement Estimation
Authors: Efstratios Doukakis, Nikolaos Petrelis
Abstract:
All climate models agree that the temperature in Greece will increase by 1° to 2°C by the year 2030, and the mean sea level in the Mediterranean is expected to rise at a rate of 5 cm/decade. The aim of the present paper is the estimation of the coastline displacement driven by climate change and sea level rise. In order to achieve that, all known statistical and non-statistical computational methods are employed on some Greek coastal areas. Furthermore, Kalman filtering techniques are for the first time introduced, formulated and tested. Based on all the above, shoreline change signals and noises are computed, and an inter-comparison between the different methods can be made to help evaluate which method is most promising for retrieving the shoreline change rate.
Keywords: Climate change, coastal displacement, Kalman filter.
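As a minimal sketch of how Kalman filtering might be formulated for shoreline change, the code below runs a constant-velocity filter over a series of shoreline positions and returns the filtered position and change rate. The constant-velocity model and the noise levels are illustrative assumptions, not the formulation used in the paper.

```python
import numpy as np

def shoreline_kalman(positions, dt=1.0, q=1e-3, r=1.0):
    """Constant-velocity Kalman filter; state = [position, rate]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])              # state transition
    H = np.array([[1.0, 0.0]])                         # observe position only
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    R = np.array([[r]])
    x, P = np.array([[positions[0]], [0.0]]), np.eye(2)
    out = []
    for z in positions:
        x, P = F @ x, F @ P @ F.T + Q                  # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)          # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x.ravel().copy())
    return np.array(out)                               # columns: position, rate
```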
3555 Incorporation of Long-Term Redundancy in ECG Time Domain Compression Methods through Curve Simplification and Block-Sorting
Authors: Bachir Boucheham, Youcef Ferdi, Mohamed Chaouki Batouche
Abstract:
We suggest a novel method to incorporate long-term redundancy (LTR) in signal time domain compression methods. The proposition is based on block-sorting and curve simplification. The proposition is illustrated on the ECG signal as a post-processor for the FAN method. Test applications of the new, so-obtained FAN+ method on the MIT-BIH database show substantial improvement of the compression ratio-distortion behavior for a higher quality reconstructed signal.
Keywords: ECG compression, long-term redundancy, block-sorting, curve simplification.
3554 Advanced Image Analysis Tools Development for the Early Stage Bronchial Cancer Detection
Authors: P. Bountris, E. Farantatos, N. Apostolou
Abstract:
Autofluorescence (AF) bronchoscopy is an established method to detect dysplasia and carcinoma in situ (CIS). For this reason the "Sotiria" Hospital uses the Karl Storz D-light system. However, in early tumor stages the visualization is not that obvious. With the help of a PC, we analyzed the color images we captured by developing certain tools in Matlab®. We used statistical methods based on texture analysis, signal processing methods based on Gabor models and conversion algorithms between device-dependent color spaces. Our belief is that we reduced the error made by the naked eye. The tools we implemented improve the quality of patients' lives.
Keywords: Bronchoscopy, digital image processing, lung cancer, texture analysis.
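A minimal sketch of Gabor-based texture features of the kind the abstract mentions, written in Python/OpenCV rather than Matlab; the kernel size, wavelength, and orientations are illustrative choices, not the authors' settings.

```python
import cv2
import numpy as np

def gabor_texture_features(gray, thetas=(0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    """Filter a grayscale frame with a small Gabor bank and return the
    mean and standard deviation of each response as texture descriptors."""
    feats = []
    for theta in thetas:
        kernel = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5, psi=0)
        resp = cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, kernel)
        feats.extend([resp.mean(), resp.std()])
    return np.array(feats)
```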
3553 Factors Affecting the e-Business Adoption among the Home-Based Businesses (HBBs) in Malaysia
Authors: S, Rosnafisah, M.S., Siti Salbiah., A, Mohd Sharifuddin
Abstract:
Research in e-Business has been growing tremendously, covering all related aspects such as adoption issues, e-Business models, strategies, etc. This research aims to explore the potential of adopting e-Business for a micro-sized business operating from home, called a home-based business (HBB). In Malaysia, the HBB industry started many years ago and was mostly dominated by women or housewives who managed it as a part-time job to support the family economy. Today, things have changed. The availability of Internet technology and the emergence of the e-Business concept promote the evolution of HBBs, which have been adopted as an alternative professional career for women without neglecting their family needs, especially the children. Although this study is confined to a limited sample size and a specific geographical area, the findings concur with previous large-scale studies. In this study, both qualitative and quantitative methods were used, and data were gathered using triangulation via interviews, direct observation, document analysis and survey questionnaires. This paper discusses the literature review, research methods and findings pertaining to the e-Business adoption factors that influence HBBs in Malaysia.
Keywords: e-Business, HBB, adoption factors, qualitative and quantitative methods.
3552 Exploring Counting Methods for the Vertices of Certain Polyhedra with Uncertainties
Authors: Sammani Danwawu Abdullahi
Abstract:
Vertex enumeration algorithms explore the methods and procedures of generating the vertices of general polyhedra formed by systems of equations or inequalities. These problems of enumerating the extreme points (vertices) of general polyhedra are shown to be NP-hard. This leads to exploring how to count the vertices of general polyhedra without listing them, which is also shown to be #P-complete. Some fully polynomial randomized approximation schemes (fpras) for counting the vertices of some special classes of polyhedra associated with down-sets, independent sets, 2-knapsack problems and 2 x n transportation problems are presented, together with some open problems discovered along the way.
Keywords: Approximation, counting with uncertainties, mathematical programming, optimization, vertex enumeration.
3551 Simulating Flow Transients in Conveying Pipeline Systems by Rigid Column and Full Elastic Methods: Pump Combined with Air Chamber
Authors: I. Abuiziah, A. Oulhaj, K. Sebari, D. Ouazar, A. A. Saber
Abstract:
In water pipeline systems, flow control is an integral part of the operation, for instance, opening and closing valves and starting and stopping pumps. When these operations are performed very quickly, they cause hydraulic transient phenomena, which may lead to pump and valve failures and catastrophic pipe ruptures. Fluid transient analysis is one of the more challenging and complicated flow problems in the design and operation of water pipeline systems. Transient control has become an essential requirement for ensuring safe operation, and an accurate analysis and suitable protection devices should be used to protect water pipeline systems. The fourth-order Runge-Kutta method has been used to solve the dynamic and continuity equations in the rigid column method, while the method of characteristics is used to solve these equations in the full elastic method. This paper presents the problem of modeling and simulating transient phenomena in conveying pipeline systems based on the rigid column and full elastic methods. It also examines the influence of protection devices in protecting pipeline systems from damage due to the pressure rise that occurs in the transient state. The results show that the model is an efficient tool for flow transient analysis and that the two methods give approximately identical results. Moreover, using a closed surge tank reduces the unfavorable effects of transients.
Keywords: Flow transient, Pipeline, Air chamber, Numerical model, Protection devices, Elastic method, Rigid column method.
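The fourth-order Runge-Kutta integrator named above has the standard form sketched below; the rigid-column dynamic and continuity equations themselves are not reproduced here, so f(t, y) stands for whatever right-hand side the pipeline model supplies.

```python
def rk4_step(f, t, y, dt):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + dt / 2, y + dt / 2 * k1)
    k3 = f(t + dt / 2, y + dt / 2 * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
```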
3550 Blind Identification of MA Models Using Cumulants
Authors: Mohamed Boulouird, Moha M'Rabet Hassani
Abstract:
In this paper, several techniques for blind identification of moving average (MA) processes are presented. These methods utilize third- and fourth-order cumulants of the noisy observations of the system output. The system is driven by an independent and identically distributed (i.i.d.) non-Gaussian sequence that is not observed. Two nonlinear optimization algorithms, namely the Gradient Descent and the Gauss-Newton algorithms, are presented. An algorithm based on the joint diagonalization of the fourth-order cumulant matrices (FOSI) is also considered, as well as an improved version of the classical C(q, 0, k) algorithm based on the choice of the best 1-D slice of fourth-order cumulants. To illustrate the effectiveness of our methods, various simulation examples are presented.
Keywords: Cumulants, Identification, MA models, Parameter estimation
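As a minimal sketch of the basic statistic these methods build on, the code below estimates the third-order cumulant c3(tau1, tau2) = E[x(n) x(n+tau1) x(n+tau2)] of a zero-mean sequence for non-negative lags; the identification algorithms themselves (Gradient Descent, Gauss-Newton, FOSI, C(q, 0, k)) are not reproduced.

```python
import numpy as np

def third_order_cumulant(x, tau1, tau2):
    """Sample estimate of c3(tau1, tau2) for non-negative lags."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    m = len(x) - max(tau1, tau2)
    return float(np.mean(x[:m] * x[tau1:tau1 + m] * x[tau2:tau2 + m]))
```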
3549 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data
Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu
Abstract:
Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the particle filter is developed, which accounts for the anomalous fluctuation scaling known as Taylor's law. This method is extended to handle sales data that are incomplete because of stock-outs by introducing maximum likelihood estimation for censored data. A way to determine optimal stock while pricing the cost of waste reduction is also proposed. This study focuses on examining the methods for large sales numbers, where Taylor's law is evident. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, pricing the cost of waste reduction reveals that a small profit loss realizes substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, a profit loss of around 1% halves disposal at a proportionality constant of 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially with large sales numbers.
Keywords: Food waste reduction, particle filter, point of sales, sustainable development goals, Taylor's Law, time series analysis.
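For reference, Taylor's law is the fluctuation-scaling relation Var ≈ a·Mean^b between per-item sales statistics; the sketch below fits it by least squares in log space. It is a generic fit, not the paper's particle-filter estimator or its censored-data likelihood.

```python
import numpy as np

def fit_taylors_law(means, variances):
    """Fit Var = a * Mean**b in log-log space; returns (a, b)."""
    b, log_a = np.polyfit(np.log(means), np.log(variances), 1)
    return float(np.exp(log_a)), float(b)
```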
3548 Detection of Salmonella in Egg Shell and Egg Content from Different Housing Systems for Laying Hens
Authors: Wiriya Loongyai, Kiettisak Promphet, Nilubol Kangsukul, Ratchawat Noppha
Abstract:
Polymerase chain reaction (PCR) assay and conventional microbiological methods were used to detect bacterial contamination of egg shells and egg content in two different commercial housing systems, the open house system and the evaporative cooling system. A PCR assay was developed for direct detection using a set of primers specific for the invasion A gene (invA) of Salmonella spp. PCR detected the presence of Salmonella in two samples of shell eggs from the evaporative cooling system, while conventional culture methods detected no Salmonella in the same samples.
Keywords: Egg content, egg shell, invA gene, PCR, Salmonella spp.
3547 The Effect of Styrene-Butadiene-Rubber (SBR) Polymer Modifier on Properties of Bitumen
Authors: Seyed Abbas Tabatabaei, Alireza Kiasat, Ferdows Karimi Alkouhi
Abstract:
In order to use bitumen in hot mix asphalt, it must have specific characteristics. There are several methods to reach these properties, and using polymer modifiers is one of the methods to modify bitumen properties. In this paper the effect of styrene-butadiene-rubber (SBR), one of the bitumen polymer modifiers, on the rheological properties of bitumen is studied. In this regard, the rheological properties of the base bitumen and of bitumen modified with 3, 4, and 5 percent of styrene-butadiene-rubber (SBR) were analysed. The results show that bitumen modified with 5 percent of SBR performs better than the other samples.
Keywords: Bitumen, polymer modifier, styrene-butadiene-rubber, rheological properties.
3546 An Investigation on Hot-Spot Temperature Calculation Methods of Power Transformers
Authors: Ahmet Y. Arabul, Ibrahim Senol, Fatma Keskin Arabul, Mustafa G. Aydeniz, Yasemin Oner, Gokhan Kalkan
Abstract:
In the standards IEC 60076-2 and IEC 60076-7, three different hot-spot temperature estimation methods are suggested. In this study, the algorithms used in hot-spot temperature calculations are analyzed by comparing them with the results of an experimental set-up based on a Transformer Monitoring System (TMS) in use. In the tested system, the TMS uses only the top-oil temperature and the load ratio for the hot-spot temperature calculation, along with constants taken from the standards' agreed-statements tables. The tests showed that the hot-spot temperature calculation method performs only a simple calculation and does not use all of the other significant variables that could affect the hot-spot temperature.
Keywords: Hot-spot temperature, monitoring system, power transformer, smart grid.
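A simplified steady-state estimate of the kind the abstract attributes to the monitored TMS (hot-spot temperature as top-oil temperature plus a load-dependent rise) might look like the sketch below. The constants Hgr and y would come from the standard's agreed-statements tables; the defaults here are illustrative only, not the tested unit's values.

```python
def hot_spot_temperature(top_oil_temp, load_ratio, hgr=23.0, y=1.3):
    """Hot-spot estimate: top-oil temperature + Hgr * K**y, K = load ratio.
    Illustrative constants; real values come from the standard's tables."""
    return top_oil_temp + hgr * load_ratio ** y
```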
3545 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images
Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn
Abstract:
The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. Although a number of open-source software tools and artificial intelligence (AI) methods are designed for analyzing mitochondrial images, the scarcity of the combined expertise in the medical field and AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV library, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using the fluorescence mitochondrial dataset. Ground truth labels generated using Labkit were also used to evaluate the performance of the detection and segmentation model using precision, recall and the Rand index. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
Keywords: 2D, Binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation.
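A minimal Python/OpenCV sketch mirroring the three stages named in the abstract: pre-processing (here CLAHE), binarization (here Otsu), and a coarse-to-fine pass (morphological opening plus connected components). All parameter values are illustrative assumptions, not the authors' settings.

```python
import cv2
import numpy as np

def segment_mitochondria(gray, min_area=20):
    """Return a binary mask of candidate mitochondria in a uint8 image."""
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(gray)                         # pre-processing
    _, binary = cv2.threshold(enhanced, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    opened = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(opened)
    mask = np.zeros_like(opened)
    for i in range(1, n):                                # skip background label 0
        if stats[i, cv2.CC_STAT_AREA] >= min_area:
            mask[labels == i] = 255
    return mask
```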
3544 Simulation Design of Separator for the Treatment of Emulsions
Authors: Irena Markovska, Dimitar Rusev, Nikolai Zaicev, Bogdan Bogdanov, Dimitar Georgiev, Yancho Hristov
Abstract:
A prototype model of an emulsion separator was designed and manufactured. Generally, it is a cylinder filled with different fractal modules. The emulsion was fed into the reactor by a peristaltic pump through an inlet placed at the boundary between the two phases. For the hydrodynamic design and sizing of the reactor, the assumptions of filtration theory were used and methods to describe the separation process were developed. Based on this methodology, and using numerical methods and Autodesk software, the process is simulated in different operating modes. The basic hydrodynamic characteristics (speed and performance) for different types of fractal systems, as well as decisions to optimize the design of the reactor, were also defined.
Keywords: Fractal systems, reactor, separation, emulsions.
3543 An Overview of Issues to Consider Before Introducing Performance-Based Road Maintenance Contracting
Authors: M. Sultana, A. Rahman, S. Chowdhury
Abstract:
Road authorities have confronted problems in maintaining the serviceability of road infrastructure systems using various traditional methods of contracting. As a solution to these problems, many road authorities have started contracting out road maintenance works to the private sector based on performance measures. This contracting method is named Performance-Based Maintenance Contracting (PBMC). It is considered more cost-effective than other traditional methods of contracting and has a substantial record of success in many developed and developing countries over the last two decades. This paper discusses and analyses the potential issues to be considered before the introduction of PBMC in a country.
Keywords: Contracting, performance-based maintenance, road infrastructure.
3542 A Functional Framework for Large Scale Application Software Systems
Authors: Han-hua Lu, Shun-yi Zhang, Yong Zheng, Ya-shi Wang, Li-juan Min
Abstract:
From the perspective of systems of systems (SoS) and emergent behaviors, this paper describes large scale application software systems and proposes framework methods to further depict systems' functional and non-functional characteristics. In addition, this paper specifically discusses some functional frameworks. Finally, the framework's applications in system disintegration, system architecture and stable intermediate forms are dealt with in the context of building, deploying and maintaining large scale software applications.
Keywords: Application software system, framework methods, system of systems, emergent behaviors.
3541 Improving Spatiotemporal Change Detection: A High Level Fusion Approach for Discovering Uncertain Knowledge from Satellite Image Database
Authors: Wadii Boulila, Imed Riadh Farah, Karim Saheb Ettabaa, Basel Solaiman, Henda Ben Ghezala
Abstract:
This paper investigates the problem of tracking spatiotemporal changes of a satellite image through the use of Knowledge Discovery in Databases (KDD). The purpose of this study is to help a given user effectively discover interesting knowledge and then build prediction and decision models. Unfortunately, the KDD process for spatiotemporal data is always marked by several types of imperfections. In our paper, we take these imperfections into consideration in order to provide more accurate decisions. To achieve this objective, different KDD methods are used to discover knowledge in satellite image databases. Each method presents a different point of view of spatiotemporal evolution of a query model (which represents an extracted object from a satellite image). In order to combine these methods, we use the evidence fusion theory which considerably improves the spatiotemporal knowledge discovery process and increases our belief in the spatiotemporal model change. Experimental results of satellite images representing the region of Auckland in New Zealand depict the improvement in the overall change detection as compared to using classical methods.
Keywords: Knowledge discovery in satellite databases, knowledge fusion, data imperfection, data mining, spatiotemporal change detection.
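The evidence fusion step described above is in the spirit of Dempster-Shafer theory; the sketch below implements Dempster's rule of combination for two mass functions over a frame of discernment, as a generic illustration rather than the authors' exact fusion scheme.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as dicts {frozenset: mass}."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb            # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# Illustrative use: two sources judging whether a region "changed" or is "stable".
m1 = {frozenset({"changed"}): 0.6, frozenset({"changed", "stable"}): 0.4}
m2 = {frozenset({"changed"}): 0.5, frozenset({"stable"}): 0.3,
      frozenset({"changed", "stable"}): 0.2}
print(dempster_combine(m1, m2))
```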
3540 Real-time 3D Feature Extraction without Explicit 3D Object Reconstruction
Authors: Kwangjin Hong, Chulhan Lee, Keechul Jung, Kyoungsu Oh
Abstract:
For communication between humans and computers in an interactive computing environment, gesture recognition has been studied vigorously, and many studies have proposed efficient recognition algorithms using images captured by 2D cameras. However, these methods have a limitation: the extracted features cannot fully represent the object in the real world. Although many studies have used 3D features instead of 2D features for more accurate gesture recognition, problems such as the processing time needed to generate 3D objects remain unsolved in related research. Therefore, we propose a method to extract 3D features combined with 3D object reconstruction. This method uses a modified GPU-based visual hull generation algorithm which disables unnecessary processes, such as texture calculation, to generate three kinds of 3D projection maps as the 3D features: the nearest boundary, the farthest boundary, and the thickness of the object projected on the base plane. In the experimental results section, we present results of the proposed method on eight human postures: T shape, both hands up, right hand up, left hand up, hands front, stand, sit and bend, and compare the computational time of the proposed method with that of previous methods.
Keywords: Fast 3D feature extraction, gesture recognition, computer vision.
3539 Geometric Operators in the Selection of Human Resources
Authors: José M. Merigó, Anna M. Gil-Lafuente
Abstract:
We study the possibility of using geometric operators in the selection of human resources. We develop three new methods that use the ordered weighted geometric (OWG) operator in different indexes used for the selection of human resources. The objective of these models is to manipulate the neutrality of the old methods so that the decision maker is able to select human resources according to his or her particular attitude. In order to develop these models, first a short review of the OWG operator is given. Second, we briefly explain the general process for the selection of human resources. Then, we develop the three new indexes, which use the OWG operator in the Hamming distance, in the adequacy coefficient and in the index of maximum and minimum level. Finally, an illustrative example of the new approach is given.
Keywords: OWG operator, decision making, human resources, Hamming distance.
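For reference, the OWG operator sorts its arguments in descending order and aggregates them as a weighted geometric mean; a minimal sketch is given below. How it is embedded in the Hamming-distance or adequacy-coefficient indexes (presumably applied to the per-criterion comparisons between a candidate and the ideal profile) is an assumption here, since the abstract does not spell out the construction.

```python
import numpy as np

def owg(values, weights):
    """Ordered weighted geometric operator: product of ordered args ** weights.
    `weights` must be non-negative and sum to 1; `values` must be positive."""
    b = np.sort(np.asarray(values, dtype=float))[::-1]   # descending order
    w = np.asarray(weights, dtype=float)
    return float(np.prod(b ** w))
```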
3538 Effect of Rubber Treatment on Compressive Strength and Modulus of Elasticity of Self-Compacting Rubberized Concrete
Authors: I. Miličević, M. Hadzima-Nyarko, R. Bušić, J. Simonović Radosavljević, M. Prokopijević, K. Vojisavljević
Abstract:
This paper investigates the effects of different treatment methods of rubber aggregates for self-compacting concrete (SCC) on compressive strength and modulus of elasticity. SCC mixtures with 10% replacement of fine aggregate with crumb rubber by total aggregate volume and with different aggregate treatment methods were investigated. The rubber aggregate was treated in three different ways: dry processing, water soaking, and NaOH treatment plus water soaking. Properties of SCC in the fresh and hardened states were tested and evaluated. Scanning electron microscope (SEM) analyses of three different SCC batches were made and discussed. It was observed that applying the proposed NaOH plus water soaking method resulted in the improvement of fresh and hardened concrete properties. It resulted in a more uniform distribution of rubber particles in the cement matrix, a better bond between rubber particles and the cement matrix, and higher compressive strength of SCC rubberized concrete.
Keywords: Compressive strength, modulus of elasticity, NaOH treatment, rubber aggregate, self-compacting rubberized concrete, scanning electron microscope analysis.
3537 Methods for Material and Process Monitoring by Characterization of (Second and Third Order) Elastic Properties with Lamb Waves
Abstract:
In accordance with the Industry 4.0 concept, manufacturing process steps as well as the materials themselves are going to be more and more digitalized within the next years. The "digital twin", representing the simulated and measured dataset of the (semi-finished) product, can be used to control and optimize the individual processing steps and helps to reduce costs and expenditure of time in product development, manufacturing, and recycling. In the present work, two material characterization methods based on Lamb waves were evaluated and compared. For demonstration purposes, both methods were applied to a standard industrial product, copper ribbons, often used in photovoltaic modules as well as in high-current microelectronic devices. By numerically fitting the Rayleigh-Lamb dispersion model to measured phase velocities, second order elastic constants (Young's modulus, Poisson's ratio) were determined. Furthermore, the effective third order elastic constants were evaluated by applying elastic, "non-destructive", mechanical stress to the samples. In this way, small microstructural variations due to mechanical preconditioning could be detected for the first time. Both methods were compared with respect to precision and inline application capabilities. The microstructure of the samples was systematically varied by mechanical loading and annealing, and changes in the elastic ultrasound transport properties were correlated with results from microstructural analysis and mechanical testing. In summary, monitoring the elastic material properties of plate-like structures using Lamb waves is valuable for inline, non-destructive material characterization and manufacturing process control. Second order elastic constant analysis is robust over a wide range of environmental and sample conditions, whereas the effective third order elastic constants greatly increase the sensitivity to small microstructural changes. Both Lamb wave based characterization methods fit well into the Industry 4.0 concept.
Keywords: Lamb waves, industry 4.0, process control, elasticity, acoustoelasticity.
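For reference, the Rayleigh-Lamb dispersion relations that the phase-velocity fit approximates are the standard plate-wave equations below (plate thickness 2h, wavenumber k, angular frequency ω, bulk longitudinal and transverse velocities c_L and c_T); this is textbook material, not a result specific to the paper.

```latex
\frac{\tan(qh)}{\tan(ph)} = -\frac{4k^{2}pq}{(q^{2}-k^{2})^{2}} \;\; \text{(symmetric modes)},
\qquad
\frac{\tan(qh)}{\tan(ph)} = -\frac{(q^{2}-k^{2})^{2}}{4k^{2}pq} \;\; \text{(antisymmetric modes)},
\qquad
p^{2} = \frac{\omega^{2}}{c_{L}^{2}} - k^{2}, \quad
q^{2} = \frac{\omega^{2}}{c_{T}^{2}} - k^{2}.
```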
3536 Research of Linear Camera Calibration Based on Planar Pattern
Authors: Jin Sun, Hongbin Gu
Abstract:
An important step in three-dimensional reconstruction and computer vision is camera calibration, whose objective is to estimate the intrinsic and extrinsic parameters of each camera. In this paper, two linear methods based on different planes are given. In both methods, a general plane is used to replace the calibration object with very good precision. In the first method, after controlling the camera to undergo five translation movements and taking pictures of the orthogonal planes, a set of linear constraints on the camera intrinsic parameters is derived by means of the homography matrix. The second method obtains all camera parameters by taking only one picture of a circle of given radius. Experiments on simulated data and real images indicate that our methods are reasonable and are a good supplement to camera calibration.
Keywords: Camera calibration, 3D reconstruction, computer vision.
3535 On Discretization of Second-order Derivatives in Smoothed Particle Hydrodynamics
Authors: R. Fatehi, M.A. Fayazbakhsh, M.T. Manzari
Abstract:
Discretization of spatial derivatives is an important issue in meshfree methods, especially when the derivative terms contain non-linear coefficients. In this paper, various methods used for the discretization of second-order spatial derivatives are investigated in the context of Smoothed Particle Hydrodynamics. Three popular forms (i.e. "double summation", "second-order kernel derivation", and "difference scheme") are studied using the one-dimensional unsteady heat conduction equation. To assess these schemes, the transient response to a step function initial condition is considered. Due to the parabolic nature of the heat equation, one can expect smooth and monotone solutions. It is shown in this paper, however, that regardless of the type of kernel function used and the size of the smoothing radius, the double summation discretization form leads to non-physical oscillations which persist in the solution. Results also show that when a second-order kernel derivative is used, a high-order kernel function should be employed such that the distance of the inflection point from the origin of the kernel function is less than the nearest particle distance. Otherwise, solutions may exhibit oscillations near discontinuities, unlike the "difference scheme", which unconditionally produces monotone results.
Keywords: Heat conduction, meshfree methods, Smoothed Particle Hydrodynamics (SPH), second-order derivatives.
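For reference, one widely used form of the "difference scheme" for the Laplacian in SPH (the Brookshaw-type approximation) is

```latex
\nabla^{2} A_{i} \approx \sum_{j} \frac{2\,m_{j}}{\rho_{j}}\,(A_{i}-A_{j})\,
\frac{\mathbf{r}_{ij}\cdot\nabla_{i} W_{ij}}{\lvert\mathbf{r}_{ij}\rvert^{2}+\eta^{2}},
\qquad \mathbf{r}_{ij}=\mathbf{r}_{i}-\mathbf{r}_{j},
```

where W_ij is the smoothing kernel and η is a small regularisation constant. This is the standard textbook form, not necessarily the exact variant tested in the paper.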
3534 Seismic Vulnerability of Structures Designed in Accordance with the Allowable Stress Design and Load Resistant Factor Design Methods
Authors: Mohammadreza Vafaei, Amirali Moradi, Sophia C. Alih
Abstract:
The method selected for the design of structures can affect not only their seismic vulnerability but also their construction cost. For the design of steel structures, two distinct methods have been introduced by existing codes, namely allowable stress design (ASD) and load resistant factor design (LRFD). This study investigates the effect of using these design methods on the seismic vulnerability and construction cost of steel structures. Specifically, a 20-story building equipped with a special moment resisting frame and an eccentrically braced system was selected for this study. The building was designed for three different intensities of peak ground acceleration, 0.2 g, 0.25 g, and 0.3 g, using the ASD and LRFD methods. The required sizes of beams, columns, and braces were obtained using response spectrum analysis. Then, the designed frames were subjected to nine natural earthquake records which were scaled to the design response spectrum. For each frame, the base shear, story shears, and inter-story drifts were calculated and compared. Results indicated that the LRFD method led to a more economical design for the frames. In addition, the LRFD method resulted in lower base shears and larger inter-story drifts compared with the ASD method. It was concluded that the application of the LRFD method not only reduced the weights of structural elements but also provided a higher safety margin against seismic actions compared with the ASD method.
Keywords: Allowable stress design, load resistant factor design, nonlinear time history analysis, seismic vulnerability, steel structures.
3533 Overview of Operational Risk Management Methods
Authors: Milan Rippel, Petr Teplý
Abstract:
Operational risk has become one of the most discussed topics in the financial industry in recent years. The reasons for this attention can be attributed to higher investments in information systems and technology, the increasing wave of mergers and acquisitions, and the emergence of new financial instruments. In addition, the New Basel Capital Accord (known as Basel II) demands a capital requirement for operational risk and further motivates financial institutions to measure and manage this type of risk more precisely. The aim of this paper is to shed light on the main characteristics of operational risk management and commonly applied methods: scenario analysis, key risk indicators, risk control self-assessment and the loss distribution approach.
Keywords: Operational risk, economic capital, key risk indicators, loss distribution approach.
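A minimal sketch of the loss distribution approach mentioned above: simulate Poisson loss counts and lognormal severities, aggregate them per year, and read off a high quantile as a proxy for operational-risk economic capital. All distributional parameters below are illustrative assumptions.

```python
import numpy as np

def lda_capital(lam=30, mu=10.0, sigma=1.5, n_sims=50_000,
                quantile=0.999, seed=0):
    """Monte-Carlo LDA: high quantile (e.g. 99.9%) of the aggregate annual loss."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_sims)                     # loss frequency
    losses = np.array([rng.lognormal(mu, sigma, size=k).sum()  # loss severity
                       for k in counts])
    return float(np.quantile(losses, quantile))
```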
3532 Activity Recognition by Smartphone Accelerometer Data Using Ensemble Learning Methods
Authors: Eu Tteum Ha, Kwang Ryel Ryu
Abstract:
As smartphones are equipped with various sensors, there have been many studies focused on using these sensors to create valuable applications. Human activity recognition is one such application, motivated by various welfare uses such as support for the elderly, measurement of calorie consumption, and analyses of lifestyle and exercise patterns. One of the challenges faced when using smartphone sensors for activity recognition is that the number of sensors should be minimized to save battery power. In this paper, we show that a fairly accurate classifier can be built that can distinguish ten different activities by using data from only a single sensor, the smartphone accelerometer. The approach that we adopt to deal with this multi-class problem uses various methods. The features used for classifying these activities include not only the magnitude of the acceleration vector at each time point, but also the maximum, the minimum, and the standard deviation of the vector magnitude within a time window. The experiments compared the performance of four kinds of basic multi-class classifiers and of four kinds of ensemble learning methods based on three kinds of basic multi-class classifiers. The results show that the method with the highest accuracy is ECOC based on random forest.
Keywords: Ensemble learning, activity recognition, smartphone accelerometer.
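A minimal sketch of the windowed accelerometer features listed in the abstract (vector magnitude plus its maximum, minimum, and standard deviation per window); the window and step sizes are illustrative assumptions.

```python
import numpy as np

def window_features(acc_xyz, window=128, step=64):
    """`acc_xyz` is an (N, 3) array of accelerometer samples.
    Returns one feature row [mean, max, min, std] of the vector magnitude
    per sliding window."""
    mag = np.linalg.norm(acc_xyz, axis=1)
    feats = []
    for start in range(0, len(mag) - window + 1, step):
        w = mag[start:start + window]
        feats.append([w.mean(), w.max(), w.min(), w.std()])
    return np.array(feats)
```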