Search results for: Interpolation method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19003

18283 A Study on Weight-Reduction of Double Deck High-Speed Train Using Size Optimization Method

Authors: Jong-Yeon Kim, Kwang-Bok Shin, Tae-Hwan Ko

Abstract:

The purpose of this paper is to suggest a weight-reduction design method for the aluminum extrusion carbody structure of a double-deck high-speed train using a size optimization method. Size optimization was used to optimize the thicknesses of the skin and ribs of the aluminum extrusions for the carbody structure. The thicknesses of the 1st underframe, 2nd underframe, solebar, and roof frame were selected as design variables for the size optimization. The results of the size optimization analysis showed that the weight of the aluminum extrusion could be reduced by 0.61 tons (5.60%) compared to the weight of the original carbody structure.
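
The optimization loop described above can be sketched in miniature. Everything below is an invented stand-in: the section properties, the feasibility check (which replaces the paper's finite-element stress analysis), and the step size are illustrative, with only the four design-variable names taken from the abstract.

```python
# Illustrative sketch of a size-optimization pass: shave member thicknesses
# while a (stand-in) feasibility check holds. Member names follow the
# abstract; section properties, limits, and step size are invented.

MEMBERS = {  # name: (cross-section area per mm thickness [mm^2], length [mm])
    "1st_underframe": (400.0, 26000.0),
    "2nd_underframe": (350.0, 26000.0),
    "solebar":        (300.0, 26000.0),
    "roof_frame":     (250.0, 26000.0),
}
DENSITY = 2.7e-9  # aluminium, tonnes per mm^3

def mass(thk):
    """Total mass [tonnes] for a dict of member thicknesses [mm]."""
    return sum(DENSITY * a * length * thk[m]
               for m, (a, length) in MEMBERS.items())

def feasible(thk, t_min=2.0):
    """Stand-in for the FE stress check: thickness must stay above t_min."""
    return all(t >= t_min for t in thk.values())

def size_optimize(thk, step=0.5):
    """Greedy reduction: shrink each design variable while still feasible."""
    thk = dict(thk)
    for m in thk:
        while feasible({**thk, m: thk[m] - step}):
            thk[m] -= step
    return thk

initial = {m: 4.0 for m in MEMBERS}
optimized = size_optimize(initial)
saving = mass(initial) - mass(optimized)  # weight saved, tonnes
```

A real study would replace `feasible` with stress and stiffness constraints evaluated by the FE model at each candidate thickness.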

Keywords: double deck high-speed train, size optimization, weight-reduction, aluminum extrusion

Procedia PDF Downloads 290
18282 COVID–19 Impact on Passenger and Cargo Traffic: A Case Study

Authors: Maja Čović, Josipa Bojčić, Bruna Bacalja, Gorana Jelić Mrčelić

Abstract:

The appearance of the COVID-19 disease and its fast spreading brought a global pandemic and health crisis. In order to prevent further spreading of the virus, governments implemented mobility-restriction rules, which left a negative mark on the world economy. Although there is extensive research on the impact of COVID-19 on marine traffic around the world, the objective of this paper is to consider the impact of COVID-19 on passenger and cargo traffic in the Port of Split, in the Republic of Croatia. The methods used in the theoretical and research parts of the paper are the descriptive method, comparative method, compilation, inductive method, deductive method, and statistical method. The paper relies on data obtained via the Port of Split Authority and analyses trends in passenger and cargo traffic up to and including the year 2020, when the pandemic broke out. Significant reductions in income and disruptions in transportation, traffic, and other maritime services are shown in the paper. The article also observes a significant decline in passenger and cruise traffic and examines the dynamics of cargo traffic inside the Port of Split.

Keywords: COVID-19, pandemic, passenger traffic, ports, trends, cargo traffic

Procedia PDF Downloads 216
18281 Development of the Analysis and Pretreatment of Brown HT in Foods

Authors: Hee-Jae Suh, Mi-Na Hong, Min-Ji Kim, Yeon-Seong Jeong, Ok-Hwan Lee, Jae-Wook Shin, Hyang-Sook Chun, Chan Lee

Abstract:

Brown HT is a bis-azo dye that is permitted in the EU as a food colorant. So far, many studies have focused on HPLC with diode array detection (DAD) for the analysis of this food colorant with different columns and mobile phases. Even though these methods make it possible to detect Brown HT, low recovery, reproducibility, and linearity are still the major limitations for their application in foods. The purpose of this study was to compare various methods for the analysis of Brown HT and to develop an improved analytical method, including pretreatment. Among the tested methods, the best resolution of Brown HT was observed with the following eluent: solvent A of the mobile phase was 0.575 g NH4H2PO4 and 0.7 g Na2HPO4 in 500 mL water mixed with 500 mL methanol, with the pH adjusted to 6.9 using phosphoric acid; solvent B was methanol. The major peak for Brown HT appeared at the end of the separation, 13.4 min after injection. This method exhibited relatively high recovery and reproducibility compared with other methods. The LOD (0.284 ppm), LOQ (0.861 ppm), resolution (6.143), and selectivity (1.3) of this method were better than those of the most frequently used ammonium acetate solution method. Precision and accuracy were verified through inter-day and intra-day tests. Various sample pretreatment methods were developed for different foods, and relatively high recovery, over 80%, was observed in all cases. This method exhibited higher resolution and reproducibility for Brown HT than other previously reported official methods from the FSA and EU regulations.

Keywords: analytic method, Brown HT, food colorants, pretreatment method

Procedia PDF Downloads 478
18280 Photoluminescence in Cerium Doped Fluorides Prepared by Slow Precipitation Method

Authors: Aarti Muley, S. J. Dhoblae

Abstract:

CaF₂ and BaF₂ doped with cerium were prepared by a slow precipitation method with different molar concentrations and different cerium concentrations. Both samples were also prepared by a direct method for comparison. The XRD of BaF₂:Ce shows that it crystallizes in a BCC structure; the peaks match JCPDS file no. 4-0452. The XRD pattern of CaF₂:Ce matches well with JCPDS file number 75-0363 and is likewise crystallized in the BCC phase. In CaF₂, double-humped photoluminescence spectra were observed at 320 nm and 340 nm when the sample was prepared by the direct precipitation method, and the ratio between these peaks is unity. However, when the sample was prepared by the slow precipitation method, the double-humped emission spectra of CaF₂:Ce were observed at 323 nm and 340 nm. The ratio between these peaks is 0.58, and the optimum concentration is obtained for 0.1 molar CaF₂ with a Ce concentration of 1.5%. When the cerium concentration is increased to 2%, the peak at 323 nm vanishes and the emission is observed at 342 nm with a shoulder at 360 nm; in this case, the intensity reduces drastically. The excitation is observed at 305 nm with a small peak at 254 nm. One molar BaF₂ doped with 0.1% cerium synthesized by the direct precipitation method gives double-humped spectra at 308 nm and 320 nm; when it is prepared by the slow precipitation method with cerium concentrations of 0.05 m%, 0.1 m%, 0.15 m%, and 0.2 m%, a broad emission is observed around 325 nm with a shoulder at 350 nm. The excitation spectra are narrow and observed at 290 nm. As the percentage of cerium is increased further, a shift is again observed: the emission spectra are observed at 360 nm with a small peak at 330 nm. This shifting of the emission spectra at low cerium concentrations can be directly related to particle size, as has also been reported for nanomaterials.

Keywords: calcium fluoride, barium fluoride, photoluminescence, slow precipitation method

Procedia PDF Downloads 108
18279 Vibration Analysis of Pendulum in a Viscous Fluid by Analytical Methods

Authors: Arash Jafari, Mehdi Taghaddosi, Azin Parvin

Abstract:

In this study, the vibrational differential equation governing a swinging single-degree-of-freedom pendulum in a viscous fluid has been investigated. The damping process is characterized according to two different regimes: first, damping in a stationary viscous fluid; second, damping in a viscous fluid flowing with constant velocity. Our purpose is to enhance the ability to solve the mentioned nonlinear differential equation with a simple and innovative approach. Comparisons are made between the new method and a numerical method (RKF45). The results show that this method is very effective and simple and can be applied to other nonlinear problems.
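
As a point of reference for the kind of numerical comparison mentioned above, a damped pendulum can be integrated with a classic fixed-step RK4 scheme. This is a plain numerical baseline, not the paper's analytical method, and the damping coefficient and pendulum parameters are illustrative.

```python
import math

def rk4_pendulum(theta0, omega0, c=0.5, g=9.81, L=1.0, dt=0.001, steps=10000):
    """Integrate theta'' + c*theta' + (g/L)*sin(theta) = 0 with classic RK4."""
    def f(theta, omega):
        # returns (theta', omega') for the damped pendulum
        return omega, -c * omega - (g / L) * math.sin(theta)

    theta, omega = theta0, omega0
    trace = [theta]
    for _ in range(steps):
        k1t, k1w = f(theta, omega)
        k2t, k2w = f(theta + 0.5 * dt * k1t, omega + 0.5 * dt * k1w)
        k3t, k3w = f(theta + 0.5 * dt * k2t, omega + 0.5 * dt * k2w)
        k4t, k4w = f(theta + dt * k3t, omega + dt * k3w)
        theta += dt * (k1t + 2 * k2t + 2 * k3t + k4t) / 6
        omega += dt * (k1w + 2 * k2w + 2 * k3w + k4w) / 6
        trace.append(theta)  # swing angle at each step
    return trace

# a 0.5 rad release decays toward rest in the (stationary) viscous fluid
trace = rk4_pendulum(theta0=0.5, omega0=0.0)
```

The flowing-fluid regime would add a forcing term proportional to the relative flow velocity; an analytical approximation would then be judged against such a trace.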

Keywords: oscillating systems, angular frequency and damping ratio, pendulum in fluid, locus of maximum

Procedia PDF Downloads 337
18278 Investigation of the Effect of Teaching Thinking and Research Lesson by Cooperative and Traditional Methods on Creativity of Sixth Grade Students

Authors: Faroogh Khakzad, Marzieh Dehghani, Elahe Hejazi

Abstract:

The present study investigates the effect of teaching a Thinking and Research lesson by cooperative and traditional methods on the creativity of sixth-grade students in Piranshahr province. The statistical population includes all sixth-grade students of Piranshahr province. The sample of this study was selected by available sampling from among the male elementary schools of Piranshahr. The students were randomly assigned to two groups: a cooperative teaching method group and a traditional teaching method group. The design of the study is quasi-experimental with a control group. To assess students’ creativity, Abedi’s creativity questionnaire was used. Based on Cronbach’s alpha coefficient, the reliability of the flow factor was 0.74, innovation 0.61, flexibility 0.63, and expansion 0.68. To analyze the data, a t-test and univariate and multivariate analysis of covariance were used to evaluate the differences of means of the pretest and posttest scores. The findings showed that the cooperative teaching method does not significantly increase creativity (p > 0.05). Moreover, the cooperative teaching method was found to have a significant effect on the flow factor (p < 0.05), but no significant effect was observed on the innovation and expansion factors (p > 0.05).

Keywords: cooperative teaching method, traditional teaching method, creativity, flow, innovation, flexibility, expansion, thinking and research lesson

Procedia PDF Downloads 316
18277 The Effect of Goal Setting on Psychological Status and Freestyle Swimming Performance in Young Competitive Swimmers

Authors: Sofiene Amara, Mohamed Ali Bahri, Sabri Gaied Chortane

Abstract:

The purpose of this study was to examine the effect of personal goal setting on psychological parameters (cognitive anxiety, somatic anxiety, and self-confidence) and on 50 m freestyle performance. Thirty young swimmers participated in this investigation and were divided into three groups: the first group (G1, n = 10, 14 ± 0.7 years old) was prepared for the competition without a fixed target (method 1); the second group (G2, n = 10, 14 ± 0.9 years old) was oriented towards a vague goal, 'Do your best' (method 2); while the third group (G3, n = 10, 14 ± 0.5 years old) was invited to pursue a goal that is difficult to reach, set according to a goal-setting interval (GST) (method 3). According to the statistical data of the present investigation, the cognitive and somatic anxiety scores in G1 and G3 were higher than in G2 (G1-G2, G3-G2: cognitive anxiety, P = 0.000; somatic anxiety, P = 0.000, respectively). On the other hand, the self-confidence score was lower in G1 compared with the other two groups (G1-G2, G3-G2: P = 0.02, P = 0.03, respectively). Our assessment also shows that the 50 m freestyle time was improved more by method 3 (pre- and post-test: P = 0.006, -2.5 s, 7.83%) than by method 2 (pre- and post-test: P = 0.03, -1 s, 3.24%), while performance remained unchanged in G1 (P > 0.05). To conclude, setting a difficult goal by GST is more effective for improving chronometric performance in the 50 m freestyle, but at the same time it increased cognitive and somatic anxiety. Therefore, mental trainers and technical staff are invited to develop models of mental preparation associated with this goal-setting method to help swimmers psychologically.

Keywords: cognitive anxiety, goal setting, performance of swimming freestyle, self-confidence, somatic anxiety

Procedia PDF Downloads 129
18276 Exact Solutions for Steady Response of Nonlinear Systems under Non-White Excitation

Authors: Yaping Zhao

Abstract:

In the present study, the exact solutions for the steady response of quasi-linear systems under non-white wide-band random excitation are considered by means of the stochastic averaging method. The nonlinearity of the systems contains power-law damping and the cross-product term of the power-law damping and displacement. The drift and diffusion coefficients of the Fokker-Planck-Kolmogorov (FPK) equation after averaging are obtained by a succinct approach. After solving the averaged FPK equation, the joint probability density function and the marginal probability density function in the steady state are attained. In the solution process, the eigenvalue problem of the ordinary differential equation is handled by an integral equation method. Some new results are acquired, and a novel method to deal with problems in nonlinear random vibration is proposed.
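
For orientation only: for a one-dimensional averaged FPK equation with drift m(a) and diffusion σ²(a), the stationary density has the standard textbook form below. This is the generic result, not the paper's specific solution, whose coefficients come from averaging the quasi-linear system.

```latex
% One-dimensional averaged FPK equation for an amplitude-like variable a:
%   \partial_t p = -\partial_a [\, m(a)\, p \,]
%                  + \tfrac{1}{2}\,\partial_a^2 [\, \sigma^2(a)\, p \,]
% Setting \partial_t p = 0 with zero probability flux gives the stationary density
p_s(a) = \frac{C}{\sigma^{2}(a)}
         \exp\!\left( \int^{a} \frac{2\, m(u)}{\sigma^{2}(u)}\, \mathrm{d}u \right),
\qquad \int p_s(a)\, \mathrm{d}a = 1 \ \text{fixes } C .
```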

Keywords: random vibration, stochastic averaging method, FPK equation, transition probability density

Procedia PDF Downloads 503
18275 The Effect of Conservative Tillage on Physical Properties of Soil and Yield of Rainfed Wheat

Authors: Abolfazl Hedayatipoor, Mohammad Younesi Alamooti

Abstract:

In order to study the effect of conservative tillage on a number of physical properties of soil and the yield of rainfed wheat, an experiment in the form of a randomized complete block design (RCBD) with three replications was conducted in a field in Aliabad County, Iran. The study treatments included: T1) conventional method, T2) combined moldboard plow method, T3) chisel-packer method, and T4) direct planting method. During early October, the soil was prepared based on these treatments in a field that had been used for rainfed wheat farming in the previous year. The apparent specific gravity of the soil, the weighted mean diameter (WMD) of soil aggregates, soil mechanical resistance, and soil permeability were measured. Data were analyzed in MSTAT-C. Results showed that the tillage practice had no significant effect on grain yield (p > 0.05). Soil permeability was 10.9, 16.3, 15.7, and 17.9 mm/h for T1, T2, T3, and T4, respectively.

Keywords: rainfed agriculture, conservative tillage, energy consumption, wheat

Procedia PDF Downloads 206
18274 Quality by Design in the Optimization of a Fast HPLC Method for Quantification of Hydroxychloroquine Sulfate

Authors: Pedro J. Rolim-Neto, Leslie R. M. Ferraz, Fabiana L. A. Santos, Pablo A. Ferreira, Ricardo T. L. Maia-Jr., Magaly A. M. Lyra, Danilo A F. Fonte, Salvana P. M. Costa, Amanda C. Q. M. Vieira, Larissa A. Rolim

Abstract:

Initially developed as an antimalarial agent, hydroxychloroquine (HCQ) sulfate is often used as a slow-acting antirheumatic drug in the treatment of disorders of connective tissue. The United States Pharmacopeia (USP) 37 provides a reversed-phase HPLC method for the quantification of HCQ. However, this method was not reproducible, producing asymmetric peaks in a long analysis time. The asymmetry of the peak may cause an incorrect calculation of the concentration of the sample. Furthermore, the analysis time is unacceptable, especially regarding the routine of a pharmaceutical industry. The aim of this study was to develop a fast, easy, and efficient method for the quantification of HCQ sulfate by High Performance Liquid Chromatography (HPLC) based on the Quality by Design (QbD) methodology. The method was optimized in terms of peak symmetry using response surface graphics from a Design of Experiments (DoE), with the tailing factor (TF) as the indicator for the Design Space (DS). The reference method used was that described in USP 37 for the quantification of the drug. For the optimized method, a 3³ factorial design was proposed, based on QbD concepts. The DS was created with the TF in a range between 0.98 and 1.2 in order to identify the ideal analytical conditions. Changes were made in the composition of the USP mobile phase (USP-MP): USP-MP:methanol (90:10 v/v, 80:20 v/v, and 70:30 v/v), in the flow rate (0.8, 1.0, and 1.2 mL/min), and in the oven temperature (30, 35, and 40 ºC). The USP method quantified the drug only after a long run (40-50 minutes). In addition, that method uses a high flow rate (1.5 mL/min), which increases the consumption of expensive HPLC-grade solvents. The main problem observed was the TF value (1.8), which would be acceptable only if the drug were not a racemic mixture, since poor co-elution of the isomers can make peak integration unreliable. Therefore, optimization was carried out to reduce the analysis time and to obtain better peak resolution and TF. From the response surface plot, it was possible to confirm the ideal analytical conditions: 45 °C, 0.8 mL/min, and 80:20 USP-MP:methanol. The optimized HPLC method enabled the quantification of HCQ sulfate with a high-resolution peak and a TF value of 1.17. This ensures good co-elution of the HCQ isomers, allowing an accurate quantification of the raw material as a racemic mixture. The optimized method also proved to be approximately 18 times faster than the reference method, using a lower flow rate and thus further reducing solvent consumption and, consequently, the analysis cost. Thus, an analytical method for the quantification of HCQ sulfate was optimized using the QbD methodology. This method proved to be faster and more efficient than the USP method regarding the retention time and, especially, the peak resolution. The higher resolution of the chromatogram peaks supports the implementation of the method for the quantification of the drug as a racemic mixture, without requiring the separation of the isomers.
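
The factorial screening idea can be sketched as follows. The three factor levels are taken from the abstract, but the tailing-factor response function below is a made-up stand-in for the measured values, used only to show how runs inside the TF design space would be selected.

```python
from itertools import product

# Sketch of a 3^3 factorial screen over the three factors named in the
# abstract. tailing_factor() is a hypothetical response surface, NOT the
# measured data; in practice each of the 27 runs is an HPLC injection.

methanol_pct = [10, 20, 30]      # % methanol added to the USP mobile phase
flow_ml_min = [0.8, 1.0, 1.2]    # flow rate, mL/min
oven_temp_c = [30, 35, 40]       # oven temperature, degrees C

def tailing_factor(mp, flow, temp):
    """Hypothetical linear response surface -- replace with measured TF."""
    return 1.8 - 0.02 * mp - 0.3 * (1.2 - flow) - 0.004 * (temp - 30)

design = list(product(methanol_pct, flow_ml_min, oven_temp_c))
design_space = [run for run in design
                if 0.98 <= tailing_factor(*run) <= 1.2]  # TF acceptance band
```

A quadratic model fitted to the 27 responses would give the response surface plot from which the optimum condition is read.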

Keywords: analytical method, hydroxychloroquine sulfate, quality by design, surface area graphic

Procedia PDF Downloads 639
18273 Development of In Situ Permeability Test Using Constant Discharge Method for Sandy Soils

Authors: A. Rifa’i, Y. Takeshita, M. Komatsu

Abstract:

The post-rain puddles that occur in the first courtyard of Prambanan Temple often disturb visitor activity. A surface layer and a drainage system were built previously to avoid this problem, but puddles still did not stop appearing after rain. The permeability parameter needs to be determined by a simpler procedure to find an exact solution. An instrument model was proposed as part of the development of a field permeability testing instrument. This experiment used the proposed constant discharge method, in which a tube is supplied with a constant water flow. The procedure was carried out from unsaturated to saturated soil conditions. Volumetric water content (θ) was monitored by a soil moisture measurement device. The result was the relationship between k and θ, drawn by a numerical approach with the van Genuchten model. The optimum value of the parameter θr obtained from the test corresponded to very dry soil. The coefficient of permeability, at a density of 19.8 kN/m³ under unsaturated conditions, was in the range of 3 × 10⁻⁶ cm/sec (Sr = 68%) to 9.98 × 10⁻⁴ cm/sec (Sr = 82%). The equipment and testing procedure developed in this research were quite effective, simple, and easy to implement for determining the field soil permeability coefficient of sandy soil. Using the constant discharge method in the proposed permeability test, the value of the permeability coefficient under unsaturated conditions can be obtained without establishing the soil-water characteristic curve.
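
The k-θ relationship mentioned above can be sketched with the van Genuchten-Mualem closed form. The parameter values below (θr, θs, n) are illustrative, not the fitted Prambanan values; only the saturated permeability is borrowed from the abstract's reported upper bound.

```python
# Sketch of the van Genuchten-Mualem relation used to draw a k-theta curve.
# theta_r, theta_s, n are illustrative; k_s echoes the abstract's upper value.

def vg_mualem_k(theta, theta_r=0.05, theta_s=0.40, n=2.0, k_s=9.98e-4):
    """Unsaturated permeability k(theta) [cm/sec], van Genuchten-Mualem."""
    m = 1.0 - 1.0 / n
    se = (theta - theta_r) / (theta_s - theta_r)  # effective saturation
    se = min(max(se, 0.0), 1.0)
    return k_s * se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# k grows monotonically with water content, reaching k_s at saturation
curve = [vg_mualem_k(0.05 + 0.035 * i) for i in range(11)]
```

Fitting θr, θs, and n to the monitored (θ, k) pairs is what produces the field curve without measuring the full soil-water characteristic curve.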

Keywords: constant discharge method, in situ permeability test, sandy soil, unsaturated conditions

Procedia PDF Downloads 383
18272 Numerical Modelling of Dry Stone Masonry Structures Based on Finite-Discrete Element Method

Authors: Ž. Nikolić, H. Smoljanović, N. Živaljić

Abstract:

This paper presents a numerical model based on the finite-discrete element method for the analysis of the structural response of dry stone masonry structures under static and dynamic loads. More precisely, each discrete stone block is discretized by finite elements. Material non-linearity, including fracture and fragmentation of the discrete elements, as well as cyclic behavior during dynamic load, is considered through contact elements implemented within the finite element mesh. The model was applied to several examples of these structures. The performed analysis shows high accuracy of the numerical results in comparison with the experimental ones and demonstrates the potential of the finite-discrete element method for modelling the response of dry stone masonry structures.

Keywords: dry stone masonry structures, dynamic load, finite-discrete element method, static load

Procedia PDF Downloads 414
18271 Combining the Fictitious Stress Method and Displacement Discontinuity Method in Solving Crack Problems in Anisotropic Material

Authors: Bahatti̇n Ki̇mençe, Uğur Ki̇mençe

Abstract:

In this study, the influence functions of displacement discontinuities in an anisotropic elastic medium are obtained in order to produce the boundary element equations. A Displacement Discontinuity Method (DDM) formulation is presented with the aim of modeling two-dimensional elastic fracture problems. This formulation is found by analytical integration of the fundamental solution along a straight-line crack. For this purpose, Kelvin's fundamental solutions for anisotropic media on an infinite plane are used to form dipoles from singular loads, and various combinations of these dipoles are used to obtain the influence functions of displacement discontinuity. This study introduces a technique for coupling the Fictitious Stress Method (FSM) and the DDM, and the technique is applied to several examples to demonstrate the effectiveness of the proposed coupling. The displacement discontinuity equations are obtained using dipole solutions calculated from known singular force solutions in an anisotropic medium. The displacement discontinuity method obtained from the solutions of these equations is combined with the fictitious stress method and compared on various examples. Problems of one or more cracks with various geometries, in rectangular plates in finite and infinite regions under tensile stress, were examined with the coupled FSM and DDM in the anisotropic medium, and the effectiveness of the coupled method was demonstrated. Since crack problems can be modeled more easily with the DDM, its use has increased recently. In obtaining the displacement discontinuity equations, Papkovitch functions were used, as in Crouch, and harmonic functions were chosen to satisfy various boundary conditions. A comparison is made between the two indirect boundary element formulations, the DDM and an extension of the FSM, for solving problems involving cracks. Several numerical examples are presented, and the outcomes are contrasted with existing analytical or reference results.

Keywords: displacement discontinuity method, fictitious stress method, crack problems, anisotropic material

Procedia PDF Downloads 75
18270 A Novel Combination Method for Computing the Importance Map of Image

Authors: Ahmad Absetan, Mahdi Nooshyar

Abstract:

The importance map is an image-based measure and is a core part of image-resizing algorithms. Importance measures include image gradients, saliency, and entropy, as well as high-level cues such as face detectors, motion detectors, and more. In this work, we propose a new method to calculate the importance map: it is generated automatically using a novel combination of image edge density and Harel saliency measurement. Experiments on different types of images demonstrate that our method effectively detects prominent areas and can be used in image-resizing applications to preserve important areas while maintaining image quality.
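
The combination step can be sketched as a weighted blend of two normalized maps. The tiny 4×4 inputs and the equal blend weight are illustrative; a real implementation would feed in actual edge-density and Harel (GBVS) saliency maps.

```python
# Minimal sketch of combining two cue maps into one importance map.
# Inputs and the 0.5 blend weight are illustrative stand-ins.

def normalize(m):
    """Min-max normalize a 2-D list of numbers to [0, 1]."""
    lo = min(min(r) for r in m)
    hi = max(max(r) for r in m)
    span = (hi - lo) or 1.0
    return [[(v - lo) / span for v in r] for r in m]

def importance_map(edge_density, saliency, w=0.5):
    """Blend normalized edge density and saliency, pixel by pixel."""
    e, s = normalize(edge_density), normalize(saliency)
    return [[w * ev + (1 - w) * sv for ev, sv in zip(er, sr)]
            for er, sr in zip(e, s)]

edges = [[0, 1, 2, 1], [1, 4, 6, 2], [0, 3, 5, 1], [0, 1, 1, 0]]
sal   = [[0.1, 0.2, 0.3, 0.1], [0.2, 0.9, 0.8, 0.2],
         [0.1, 0.7, 0.6, 0.1], [0.0, 0.1, 0.1, 0.0]]
imp = importance_map(edges, sal)
```

In a seam-carving or warping resizer, `imp` is what the removal or distortion cost is read from.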

Keywords: content-aware image resizing, visual saliency, edge density, image warping

Procedia PDF Downloads 582
18269 Speedup Breadth-First Search by Graph Ordering

Authors: Qiuyi Lyu, Bin Gong

Abstract:

Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality, thus improving BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes’ visit frequency: nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this is proved to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overhead. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to the state-of-the-art methods while its graph ordering overhead is only about 1/15 of theirs.
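
The frequency-based idea can be sketched as follows: run BFS from a few sample roots, count node visits, and relabel high-frequency nodes with small consecutive IDs so they land close together in memory. The 7-node graph and the sampling scheme are illustrative, not the paper's heuristic.

```python
from collections import deque, Counter

def bfs(graph, root):
    """Standard BFS returning the visit order from root."""
    seen, order, q = {root}, [], deque([root])
    while q:
        u = q.popleft()
        order.append(u)
        for v in graph[u]:
            if v not in seen:
                seen.add(v)
                q.append(v)
    return order

def frequency_order(graph, sample_roots):
    """Relabel nodes so frequently visited ones get contiguous small IDs."""
    freq = Counter()
    for r in sample_roots:
        freq.update(bfs(graph, r))
    ranked = sorted(graph, key=lambda u: -freq[u])  # high frequency first
    return {old: new for new, old in enumerate(ranked)}

graph = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2, 5],
         4: [2, 5], 5: [3, 4, 6], 6: [5]}
relabel = frequency_order(graph, sample_roots=[0, 3, 6])
```

The relabeling would then be applied to the adjacency arrays themselves, so that sibling expansions touch nearby cache lines.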

Keywords: breadth-first search, BFS, graph ordering, graph algorithm

Procedia PDF Downloads 138
18268 Oil Extraction from Sunflower Seed Using Green Solvent 2-Methyltetrahydrofuran and Isoamyl Alcohol

Authors: Sergio S. De Jesus, Aline Santana, Rubens Maciel Filho

Abstract:

The objective of this study was to choose and characterize a green solvent system with extraction efficiencies similar to those of the traditional Bligh and Dyer method. Sunflower seed oil was extracted by the Bligh and Dyer procedure using 2-methyltetrahydrofuran and isoamyl alcohol at ratios of 1:1, 2:1, 3:1, 1:2, and 3:1. At the same time, comparative experiments were performed with chloroform and methanol at ratios of 1:1, 2:1, 3:1, 1:2, and 3:1. The comparison study was done using 5 replicates (n = 5). Statistical analysis was performed using Microsoft Office Excel (Microsoft, USA) to determine means, and Tukey’s Honestly Significant Difference test was used for comparison between treatments (α = 0.05). The results showed that the classic method with methanol and chloroform presented extraction oil yields of 31-44% (w/w), versus 36-45% (w/w) for the green-solvent extractions. Among the extraction systems, 2-methyltetrahydrofuran and isoamyl alcohol at a ratio of 2:1 provided the best result (45% w/w), while the classic method using chloroform and methanol at a ratio of 3:1 presented an extraction oil yield of 44% (w/w). It was concluded that the proposed extraction method using 2-methyltetrahydrofuran and isoamyl alcohol allowed the same efficiency level as chloroform and methanol.

Keywords: extraction, green solvent, lipids, sugarcane

Procedia PDF Downloads 380
18267 Urban Flood Resilience Comprehensive Assessment of "720" Rainstorm in Zhengzhou Based on Multiple Factors

Authors: Meiyan Gao, Zongmin Wang, Haibo Yang, Qiuhua Liang

Abstract:

Under the background of global climate change and the rapid development of modern urbanization, the frequency of climate disasters such as extreme precipitation in cities around the world is gradually increasing. In this paper, the Hi-PIMS model is used to simulate the "720" flood in Zhengzhou, the continuous stages of flood resilience are determined, and the urban flood stages are divided. Flood resilience curves under the influence of multiple factors were determined, and urban flood resilience was evaluated by combining the results of these curves. The flood resilience of each urban unit grid was evaluated based on economy, population, road network, hospital distribution, and land use type. First, rainfall data from meteorological stations near Zhengzhou and remote sensing rainfall data from July 17 to 22, 2021, were collected, and the Kriging interpolation method was used to extend the rainfall data over Zhengzhou. Based on these rainfall data, the flood processes generated by four rainfall events in Zhengzhou were reproduced. Based on the inundation range and inundation depth in different areas, the flood process was divided into four stages, absorption, resistance, overload, and recovery, according to the once-in-50-years rainfall standard. At the same time, based on slope, GDP, population, hospital service area, land use type, road network density, and other factors, the resilience curve was applied to evaluate the urban flood resilience of different regional units, and the differences among the flood processes of different precipitation events in the Zhengzhou "720" rainstorm were analyzed. Faced with a rainstorm exceeding the 1,000-year return period, most areas quickly enter the overload stage. The influence of each factor differs between areas: some areas with ramps or higher terrain have better resilience and restore normal social order faster, that is, their recovery stage needs a shorter time. Some low-lying areas or special terrain, such as tunnels, enter the overload stage faster in the case of heavy rainfall. As a result, higher levels of flood protection, water level warning systems, and faster emergency response are needed in areas with low resilience and high risk. The building density of built-up areas, the population of densely populated areas, and road network density all have a certain negative impact on urban flood resistance, while the positive impact of slope on flood resilience is also very obvious. While hospitals have positive effects on medical treatment, they also bring negative effects, such as population density and asset density, when floods occur; a separate comparison of the hospital unit grids shows that resilience within the hospital distribution range is low during floods. Therefore, in addition to improving flood resistance capacity, reasonable planning can also increase a city's flood response capacity. Changes in these influencing factors can further improve urban flood resilience, for example by raising design standards, providing temporary water storage areas when floods occur, training emergency personnel for faster response, and adjusting emergency support equipment.

Keywords: urban flood resilience, resilience assessment, hydrodynamic model, resilience curve

Procedia PDF Downloads 40
18266 A Character Detection Method for Ancient Yi Books Based on Connected Components and Regressive Character Segmentation

Authors: Xu Han, Shanxiong Chen, Shiyu Zhu, Xiaoyu Lin, Fujia Zhao, Dingwang Wang

Abstract:

Character detection is an important issue in character recognition of ancient Yi books; the accuracy of detection directly affects the recognition results. Considering the complex layout, the lack of standard typesetting, and the mixed arrangement of images and text, we propose a character detection method for ancient Yi books based on connected components and regressive character segmentation. First, the scanned images of ancient Yi books are preprocessed with non-local means filtering, and then a modified locally adaptive threshold binarization algorithm is used to obtain binary images that separate the foreground from the background. Second, the non-text areas are removed by a method based on connected components. Finally, the individual characters in the ancient Yi books are segmented by our regressive segmentation method. The experimental results show that the method can effectively separate text areas from non-text areas in ancient Yi books, achieves high accuracy and recall in the character detection experiments, and effectively solves the problem of character detection and segmentation in character recognition of ancient books.
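
The connected-component stage can be sketched with a plain flood-fill labeling that drops small components, a simplified stand-in for the non-text-area removal described above; the toy binary image and the size threshold are illustrative.

```python
from collections import deque

def label_components(img):
    """Label 4-connected foreground pixels; return labels and sizes."""
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    sizes, cur = {}, 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not labels[y][x]:
                cur += 1
                sizes[cur] = 0
                q = deque([(y, x)])
                labels[y][x] = cur
                while q:  # BFS flood fill of one component
                    cy, cx = q.popleft()
                    sizes[cur] += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] \
                                and not labels[ny][nx]:
                            labels[ny][nx] = cur
                            q.append((ny, nx))
    return labels, sizes

def keep_large(img, min_size=3):
    """Zero out components smaller than min_size (noise / non-text specks)."""
    labels, sizes = label_components(img)
    return [[1 if labels[y][x] and sizes[labels[y][x]] >= min_size else 0
             for x in range(len(img[0]))] for y in range(len(img))]

img = [[1, 1, 0, 0, 1],
       [1, 0, 0, 0, 0],
       [0, 0, 1, 1, 1]]
cleaned = keep_large(img)
```

In practice the filter would also use shape and density statistics of each component, not just its pixel count, to tell text strokes from illustrations.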

Keywords: CCS concepts, computing methodologies, interest point, salient region detections, image segmentation

Procedia PDF Downloads 132
18265 Analytical Investigation and Solution of Vibrating Structures (Arched Beams in Bridges) by the New Method “AGM”

Authors: M. R. Akbari, P. Soleimani, R. Khalili, Sara Akbari

Abstract:

Analyzing and modeling the vibrational behavior of arched bridges during an earthquake, in order to decrease the damage exerted on the structure, is a very hard task. In the present paper, this is done analytically for the first time. Given the importance of arched bridges as great structures in human civilization and their specifications, such as transferring vertical loads to their arcs and the absence of bending moments and shearing forces, this case study is devoted to this special issue. Here, the nonlinear vibration of arched bridges has been modeled and simulated by an arched beam with harmonic vertical loads, and its behavior has been investigated by analyzing a nonlinear partial differential equation governing the system. Notably, the procedure has been carried out analytically by AGM (Akbari-Ganji Method). Furthermore, comparisons have been made between the obtained results and those of a numerical method (RKF45) in order to assess the scientific validity.

Keywords: new method (AGM), arched beam bridges, angular frequency, harmonic loads

Procedia PDF Downloads 297
18264 An Accelerated Stochastic Gradient Method with Momentum

Authors: Liang Liu, Xiaopeng Luo

Abstract:

In this paper, we propose an accelerated stochastic gradient method with momentum. The momentum term is a weighted average of the generated gradients, with weights that decay inversely proportionally with the iteration count. Stochastic gradient descent with momentum (SGDM) instead uses weights that decay exponentially with the iteration count. Building on such exponential decay weights, variants of SGDM with complicated, hard-to-interpret forms have been proposed to achieve better performance; the momentum update rule of our method, by contrast, is as simple as that of SGDM. We provide theoretical convergence analyses showing that both the exponential decay weights and our inversely proportional decay weights confine the variance of the parameter update direction to a bounded region. Experimental results show that our method works well on many practical problems and outperforms SGDM.
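
The contrast between the two weighting schemes can be sketched as follows. The inverse-proportional update shown is one plausible reading of the abstract (a running average of the gradients), not necessarily the authors' exact rule, and the step sizes are illustrative:

```python
def sgdm_step(x, g, m, lr=0.1, beta=0.9):
    """One step of SGD with momentum: exponentially decaying gradient weights."""
    m = beta * m + (1.0 - beta) * g
    return x - lr * m, m

def inverse_decay_step(x, g, m, t, lr=0.1):
    """One step with the momentum as a running average of gradients, i.e.
    weights that decay inversely proportionally with the iteration count t
    (a plausible reading of the update described in the abstract)."""
    m = (t - 1.0) / t * m + (1.0 / t) * g
    return x - lr * m, m
```

Note that with a constant gradient the running-average momentum reproduces the gradient exactly, whereas the exponential scheme only approaches it geometrically.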

Keywords: exponential decay rate weight, gradient descent, inverse proportional decay rate weight, momentum

Procedia PDF Downloads 162
18263 Nonuniformity Correction Technique in Infrared Video Using Feedback Recursive Least Square Algorithm

Authors: Flavio O. Torres, Maria J. Castilla, Rodrigo A. Augsburger, Pedro I. Cachana, Katherine S. Reyes

Abstract:

In this paper, we present a scene-based nonuniformity correction method using a modified recursive least squares algorithm with a feedback system on the updates. The feedback is designed to remove the impulsive noise contamination produced by the recursive least squares algorithm by monitoring the output of the proposed algorithm. The key advantage of the method is its capacity to estimate the detector parameters and then compensate for impulsive noise contamination on a frame-by-frame basis. We define the algorithm and present several experimental results to demonstrate the efficacy of the proposed method in comparison with several previously published recursive-least-squares-based methods, and we show that the proposed method removes the impulsive noise contamination from the image.
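
A minimal scalar sketch of the idea, assuming a single-parameter detector model and a fixed rejection threshold (both illustrative assumptions, not the authors' algorithm), gates the RLS update on the residual size so that impulsive samples never corrupt the estimate:

```python
class FeedbackRLS:
    """Scalar recursive least squares with a feedback gate that skips
    updates whose residual looks impulsive."""

    def __init__(self, lam=0.99, p0=100.0, gate=3.0):
        self.w = 0.0      # estimated detector parameter
        self.p = p0       # inverse-covariance term
        self.lam = lam    # forgetting factor
        self.gate = gate  # residuals above this are treated as impulses

    def update(self, x, d):
        """One RLS step toward d ~= w * x, gated by the residual."""
        err = d - self.w * x
        if abs(err) > self.gate:
            return self.w  # feedback: reject the impulsive sample
        k = self.p * x / (self.lam + self.p * x * x)
        self.w += k * err
        self.p = (self.p - k * x * self.p) / self.lam
        return self.w
```

In an imaging context, one such estimator would run per detector element, with x the incoming pixel value and d the reference from the scene model.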

Keywords: infrared focal plane arrays, infrared imaging, least mean square, nonuniformity correction

Procedia PDF Downloads 143
18262 Failure Simulation of Small-Scale Walls with Chases Using the Lattice Discrete Element Method

Authors: Karina C. Azzolin, Luis E. Kosteski, Alisson S. Milani, Raquel C. Zydeck

Abstract:

This work aims to numerically reproduce tests developed experimentally on reduced-scale walls with horizontal and inclined chases, using the Lattice Discrete Element Method (LDEM) implemented in the Abaqus/Explicit environment. The chases were cut to depths of 20%, 30%, and 50% in walls subjected to centered and eccentric loading. The parameters used to evaluate the numerical model are its strength, the failure mode, and the in-plane and out-of-plane displacements.

Keywords: structural masonry, wall chases, small scale, numerical model, lattice discrete element method

Procedia PDF Downloads 177
18261 BTEX (Benzene, Toluene, Ethylbenzene and Xylene) Degradation by Cold Plasma

Authors: Anelise Leal Vieira Cubas, Marina de Medeiros Machado, Marília de Medeiros Machado

Abstract:

The volatile organic compounds BTEX (benzene, toluene, ethylbenzene, and xylene), petroleum derivatives, are highly toxic, with consequences for human health, biota, and the environment. This paper proposes a method for treating these compounds using corona discharge plasma technology. The efficiency of the method was tested by analyzing BTEX samples by gas chromatography after passage through a plasma reactor. The results show that the optimal residence time of the sample in the reactor was 8 minutes.

Keywords: BTEX, degradation, cold plasma, ecological sciences

Procedia PDF Downloads 317
18260 Determine the Optimal Path of Content Adaptation Services with Max Heap Tree

Authors: Shilan Rahmani Azr, Siavash Emtiyaz

Abstract:

Recent developments in computing and communication technologies make mobile access to information much easier. Users can access information in different places using various devices with a wide range of capabilities, while the format and detail of electronic documents change every day. In such cases, a mismatch arises between the content and the client's capabilities. Recently, service-oriented content adaptation has been developed, in which the adaptation tasks are delegated to external services. In this approach, the main problem is choosing the most appropriate service among the accessible, distributed services. In this paper, a method for determining the optimal path through the best services, based on quality-of-service parameters and user preferences, is proposed using a max heap tree. Its efficiency, in contrast to previous content adaptation methods, lies in determining the optimal path through the best services as they are measured. The results show the advantages of this method over the others.
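
A minimal sketch of the selection step, assuming a greedy stage-by-stage choice over pre-computed QoS scores (the staging, scoring, and service names are illustrative assumptions, not the paper's model), can use Python's heapq with negated scores as a max heap:

```python
import heapq

def best_adaptation_path(stages):
    """Pick the highest-scoring service at each adaptation stage.

    `stages` is a list of lists of (score, service_name) pairs, where the
    score stands in for the QoS-and-preference weighting described above.
    heapq is a min-heap, so scores are negated to obtain max-heap behavior.
    """
    path, total = [], 0.0
    for services in stages:
        heap = [(-score, name) for score, name in services]
        heapq.heapify(heap)
        neg_score, name = heap[0]   # root of the max heap = best service
        path.append(name)
        total += -neg_score
    return path, total
```

A fuller treatment would score whole paths rather than stages, but the heap-based extraction of the current best candidate is the same.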

Keywords: service-oriented content adaption, QoS, max heap tree, web services

Procedia PDF Downloads 259
18259 Investigation of the Effect of Excavation Step in NATM on Surface Settlement by Finite Element Method

Authors: Seyed Mehrdad Gholami

Abstract:

Nowadays, the use of rail transport systems (metros) is increasing in most cities of the world, so the need for safe and economical ways of building tunnels and subway stations is felt more and more. One of the most commonly used methods for constructing underground structures in urban areas is NATM (the New Austrian Tunneling Method). In this method, key parameters such as the excavation step and cross-sectional area have a significant effect on surface settlement, which is a very important control factor for safe excavation. In this paper, the Finite Element Method is applied using Abaqus. The R6 station of Tehran Metro Line 6, built by NATM, is studied and analyzed. Considering the outcomes of the numerical modeling and their comparison with the field instrumentation and monitoring results, an excavation step of 1 meter and a longitudinal distance of 14 meters between side drifts are finally suggested to achieve safe tunneling with allowable settlement.

Keywords: excavation step, NATM, numerical modeling, settlement

Procedia PDF Downloads 139
18258 Solving SPDEs by Least Squares Method

Authors: Hassan Manouzi

Abstract:

We present in this paper a useful strategy for solving stochastic partial differential equations (SPDEs) involving stochastic coefficients. Using the Wick product of higher order and the Wiener-Itô chaos expansion, the SPDE is reformulated as a large system of deterministic partial differential equations. To reduce the computational complexity of this system, we use a decomposition-coordination method. To obtain the chaos coefficients in the corresponding deterministic equations, we use a least squares formulation. Once this approximation is performed, the statistics of the numerical solution can be easily evaluated.
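
As a toy illustration of fitting chaos coefficients by least squares (a one-dimensional probabilists' Hermite basis and sample-point regression, not the paper's full Wick/Wiener-Itô machinery), one might write:

```python
import numpy as np

def hermite_basis(xi, order=2):
    """Probabilists' Hermite polynomials H0, H1, H2 evaluated at sample points xi."""
    columns = [np.ones_like(xi), xi, xi**2 - 1.0]
    return np.column_stack(columns[: order + 1])

def chaos_coefficients(xi, u):
    """Least-squares fit of coefficients c with u(xi) ~= sum_k c_k H_k(xi)."""
    A = hermite_basis(xi)
    c, *_ = np.linalg.lstsq(A, u, rcond=None)
    return c
```

For example, since xi^2 = H0(xi) + H2(xi), a quadratic response is recovered with coefficients (1, 0, 1); in the SPDE setting the same regression would run per spatial degree of freedom.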

Keywords: least squares, Wick product, SPDEs, finite element, Wiener chaos expansion, gradient method

Procedia PDF Downloads 419
18257 Ground Deformation Module for the New Laboratory Methods

Authors: O. Giorgishvili

Abstract:

One of the important characteristics for the calculation of foundations is the modulus of deformation (E0). The main goal of calculating building foundations for deformation is to keep base settlement, and the difference between settlements, within limits that do not cause cracks or changes in design levels that would endanger the normal operation of the buildings and their individual structures. As is known from the literature and from practical application, the modulus of deformation is determined by two basic methods: the laboratory method, a soil compression test without lateral expansion, and soil testing in field conditions. The modulus of deformation determined by the field method is closer to the actual modulus of the soil, but the complexity and cost of the tests often rule the field method out, so the modulus of deformation is usually determined by the compression method without lateral expansion. In this regard, we introduce a new laboratory procedure for determining the modulus of deformation that allows lateral expansion. The tests and their results showed that the modulus of deformation obtained by the proposed method is closer to the values obtained in the field, and thus reflects the real behavior of the foundation more accurately than the standard compression test.

Keywords: build, deformation modulus, foundations, ground, laboratory research

Procedia PDF Downloads 368
18256 On the Accuracy of Basic Modal Displacement Method Considering Various Earthquakes

Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar

Abstract:

Time history seismic analysis is supposed to be the most accurate method for predicting the seismic demand of structures; on the other hand, its main deficiency is the computational time required to obtain results. When applied in an optimization process, in which the structure must be analyzed thousands of times, reducing the computational time of seismic analysis makes the optimization algorithms far more practical. Approximate methods inevitably produce some error in comparison with exact time history analysis, but methods such as the Complete Quadratic Combination (CQC) and the Square Root of the Sum of Squares (SRSS) drastically reduce the computational time by combining the peak responses of each mode. In the present research, the Basic Modal Displacement (BMD) method is introduced and applied to estimate the seismic demand of a main structure. The seismic demand of the sampled structure is estimated from the modal displacements of a basic structure (for which the modal displacements have been calculated). Shear steel structures are selected as case studies. The error of the introduced method is calculated by comparing the estimated seismic demands with exact time history dynamic analysis. The efficiency of the proposed method is demonstrated by applying three types of earthquakes (distinguished by the time of peak ground acceleration).
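
The SRSS and CQC combination rules mentioned above can be sketched directly. In this sketch the modal correlation coefficients are taken as given inputs rather than computed from damping ratios and frequency ratios, as a full implementation would do:

```python
import math

def srss(peaks):
    """Square Root of the Sum of Squares combination of peak modal responses."""
    return math.sqrt(sum(p * p for p in peaks))

def cqc(peaks, rho):
    """Complete Quadratic Combination of peak modal responses.

    rho[i][j] are modal correlation coefficients with rho[i][i] == 1;
    with an identity correlation matrix CQC reduces to SRSS.
    """
    n = len(peaks)
    total = sum(peaks[i] * rho[i][j] * peaks[j]
                for i in range(n) for j in range(n))
    return math.sqrt(total)
```

Both rules replace a full time history of the combined response with a single estimate built from per-mode peaks, which is the source of the speedup discussed above.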

Keywords: time history dynamic analysis, basic modal displacement, earthquake-induced demands, shear steel structures

Procedia PDF Downloads 355
18255 Descent Algorithms for Optimization Algorithms Using q-Derivative

Authors: Geetanjali Panda, Suvrakanti Chakraborty

Abstract:

In this paper, Newton-like descent methods are proposed for unconstrained optimization problems; they use q-derivatives of the gradient of the objective function. First, a local scheme is developed with an alternative sufficient optimality condition, and the method is then extended to a global scheme. Moreover, a variant of the practical Newton scheme is developed by introducing a real sequence. Global convergence of these schemes is proved under mild conditions. Numerical experiments and graphical illustrations are provided. Finally, performance profiles on a test set show that the proposed schemes are competitive with existing first-order schemes for optimization problems.
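
A minimal one-dimensional sketch of such a scheme, using the Jackson q-derivative of the gradient in place of the second derivative (a plausible reading of the abstract, with illustrative parameter choices rather than the authors' exact algorithm), is:

```python
def q_derivative(f, x, q=0.9):
    """Jackson q-derivative: D_q f(x) = (f(q*x) - f(x)) / ((q - 1) * x)."""
    if x == 0:
        raise ValueError("the q-derivative at x = 0 is defined via a limit")
    return (f(q * x) - f(x)) / ((q - 1.0) * x)

def q_newton(grad, x0, q=0.9, tol=1e-10, max_iter=100):
    """Newton-like iteration: x <- x - grad(x) / D_q grad(x)."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        x = x - g / q_derivative(grad, x, q)
    return x
```

For a quadratic objective the q-derivative of the linear gradient is exact, so the iteration reaches the minimizer in a single step, mirroring classical Newton behavior.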

Keywords: descent algorithm, line search method, q-calculus, quasi-Newton method

Procedia PDF Downloads 398
18254 End-to-End Pyramid Based Method for Magnetic Resonance Imaging Reconstruction

Authors: Omer Cahana, Ofer Levi, Maya Herman

Abstract:

Magnetic Resonance Imaging (MRI) is a lengthy medical scan owing to its long acquisition time, which is mainly dictated by the traditional sampling theorem's lower bound on sampling. However, the scan can be accelerated by other approaches such as Compressed Sensing (CS) or Parallel Imaging (PI), and these two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While rapid advances in Deep Learning (DL) have brought tremendous successes in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, a top-down architecture with lateral connections developed for building high-level refinement at all scales. Our method achieves state-of-the-art results on the fastMRI benchmark, the largest and most diverse benchmark for MRI reconstruction.
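
As a hedged illustration of the undersampling setting the abstract describes, the sketch below retains a random (incoherent) subset of k-space samples and reconstructs by zero-filling, the naive baseline that CS and DL reconstruction methods improve upon; the mask density and seed are illustrative assumptions:

```python
import numpy as np

def zero_filled_recon(image, keep_fraction=0.3, seed=0):
    """Undersample k-space with a random mask, then reconstruct by zero-filling.

    Dropping samples in the Fourier domain models the accelerated scan;
    the zero-filled inverse transform is the baseline that nonlinear
    (CS or learned) reconstruction is meant to beat.
    """
    k_space = np.fft.fft2(image)                       # forward to k-space
    rng = np.random.default_rng(seed)
    mask = rng.random(k_space.shape) < keep_fraction   # incoherent sampling mask
    return np.abs(np.fft.ifft2(k_space * mask))        # zero-filled reconstruction
```

With full sampling the reconstruction is exact; as the kept fraction shrinks, incoherent aliasing artifacts appear, which is precisely the degradation a pyramid-style network would be trained to remove.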

Keywords: magnetic resonance imaging, image reconstruction, pyramid network, deep learning

Procedia PDF Downloads 91