Search results for: least squares method
18284 Vibration Analysis of Pendulum in a Viscous Fluid by Analytical Methods
Authors: Arash Jafari, Mehdi Taghaddosi, Azin Parvin
Abstract:
In this study, the vibrational differential equation governing a swinging single-degree-of-freedom pendulum in a viscous fluid has been investigated. The damping process is characterized according to two different regimes: first, damping in a stationary viscous fluid; second, damping in a viscous fluid flowing with constant velocity. Our purpose is to enhance the ability to solve the mentioned nonlinear differential equation with a simple and innovative approach. Comparisons are made between the new method and the numerical method (RKF45). The results show that this method is very effective and simple and can be applied to other nonlinear problems.
Keywords: oscillating systems, angular frequency and damping ratio, pendulum at fluid, locus of maximum
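For illustration only (not taken from the paper), the following minimal Python sketch integrates a pendulum with an assumed linear viscous damping term using an adaptive Runge-Kutta scheme, the kind of reference solution (rkf45) the abstract compares against; the damping coefficient and initial conditions are placeholders.

```python
# Minimal sketch: free vibration of a pendulum with assumed linear viscous
# damping, integrated with an adaptive Runge-Kutta scheme as a stand-in for
# the rkf45 reference solution mentioned in the abstract.
import numpy as np
from scipy.integrate import solve_ivp

g, L, c = 9.81, 1.0, 0.3   # gravity, pendulum length, assumed damping coefficient

def pendulum(t, y):
    theta, omega = y
    return [omega, -c * omega - (g / L) * np.sin(theta)]  # nonlinear restoring term

sol = solve_ivp(pendulum, (0.0, 20.0), [0.5, 0.0], method="RK45",
                t_eval=np.linspace(0.0, 20.0, 500))
print(f"theta(t=20 s) = {sol.y[0, -1]:.4f} rad")
```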
Procedia PDF Downloads 337
18283 Investigation of the Effect of Teaching Thinking and Research Lesson by Cooperative and Traditional Methods on Creativity of Sixth Grade Students
Authors: Faroogh Khakzad, Marzieh Dehghani, Elahe Hejazi
Abstract:
The present study investigates the effect of teaching a Thinking and Research lesson by cooperative and traditional methods on the creativity of sixth-grade students in Piranshahr province. The statistical population includes all the sixth-grade students of Piranshahr province. The sample of this study was selected by available sampling from among male elementary schools of Piranshahr. They were randomly assigned into two groups of cooperative teaching method and traditional teaching method. The design of the study is quasi-experimental with a control group. In this study, to assess students’ creativity, Abedi’s creativity questionnaire was used. Based on Cronbach’s alpha coefficient, the reliability of the flow factor was 0.74, innovation was 0.61, flexibility was 0.63, and expansion was 0.68. To analyze the data, t-test and univariate and multivariate analysis of covariance were used to evaluate the difference of means of the pretest and posttest scores. The findings of the research showed that the cooperative teaching method does not significantly increase creativity (p > 0.05). Moreover, the cooperative teaching method was found to have a significant effect on the flow factor (p < 0.05), but no significant effect was observed for the innovation and expansion factors (p > 0.05).
Keywords: cooperative teaching method, traditional teaching method, creativity, flow, innovation, flexibility, expansion, thinking and research lesson
Procedia PDF Downloads 316
18282 The Effect of Goal Setting on Psychological Status and Freestyle Swimming Performance in Young Competitive Swimmers
Authors: Sofiene Amara, Mohamed Ali Bahri, Sabri Gaied Chortane
Abstract:
The purpose of this study was to examine the effect of personal goal setting on psychological parameters (cognitive anxiety, somatic anxiety, and self-confidence) and 50m freestyle performance. Thirty young swimmers participated in this investigation and were divided into three groups: the first group (G1, n = 10, 14 ± 0.7 years old) was prepared for the competition without a fixed target (method 1), the second group (G2, n = 10, 14 ± 0.9 years old) was oriented towards a vague goal, 'Do your best' (method 2), while the third group (G3, n = 10, 14 ± 0.5 years old) was assigned a goal that is difficult to reach according to a goal-setting interval (GST) (method 3). According to the statistical data of the present investigation, the cognitive and somatic anxiety scores in G1 and G3 were higher than in G2 (G1-G2, G3-G2: cognitive anxiety, P = 0.000; somatic anxiety, P = 0.000, respectively). On the other hand, the self-confidence score was lower in G1 compared with the other two groups (G1-G2, G3-G2: P = 0.02, P = 0.03, respectively). Our assessment also shows that the 50m freestyle time performance was improved more by method 3 (pre- and post-test: P = 0.006, -2.5 s, 7.83%) than by method 2 (pre- and post-test: P = 0.03, -1 s, 3.24%), while performance remained unchanged in G1 (P > 0.05). To conclude, setting a difficult goal by GST is more effective for improving chronometric performance in the 50m freestyle, but at the same time it increased cognitive and somatic anxiety. For this reason, mental trainers and technical staff are invited to develop models of mental preparation associated with this goal-setting method to help swimmers on the psychological level.
Keywords: cognitive anxiety, goal setting, performance of swimming freestyle, self-confidence, somatic anxiety
Procedia PDF Downloads 129
18281 Exact Solutions for Steady Response of Nonlinear Systems under Non-White Excitation
Authors: Yaping Zhao
Abstract:
In the present study, the exact solutions for the steady response of quasi-linear systems under non-white wide-band random excitation are considered by means of the stochastic averaging method. The nonlinearity of the systems contains power-law damping and the cross-product term of the power-law damping and displacement. The drift and diffusion coefficients of the Fokker-Planck-Kolmogorov (FPK) equation after averaging are obtained by a succinct approach. After solving the averaged FPK equation, the joint probability density function and the marginal probability density function in the steady state are attained. In the solution process, the eigenvalue problem of the ordinary differential equation is handled by the integral equation method. Some new results are acquired, and a novel method to deal with problems in nonlinear random vibration is proposed.
Keywords: random vibration, stochastic averaging method, FPK equation, transition probability density
Procedia PDF Downloads 503
18280 The Effect of Conservative Tillage on Physical Properties of Soil and Yield of Rainfed Wheat
Authors: Abolfazl Hedayatipoor, Mohammad Younesi Alamooti
Abstract:
In order to study the effect of conservative tillage on a number of physical properties of soil and the yield of rainfed wheat, an experiment in the form of a randomized complete block design (RCBD) with three replications was conducted in a field in Aliabad County, Iran. The study treatments included: T1) conventional method, T2) combined moldboard plow method, T3) chisel-packer method, and T4) direct planting method. During early October, the study soil was prepared based on these treatments in a field which had been used for rainfed wheat farming in the previous year. The apparent specific gravity of soil, weighted mean diameter (WMD) of soil aggregates, soil mechanical resistance, and soil permeability were measured. Data were analyzed in MSTAT-C. Results showed that the tillage practice had no significant effect on grain yield (p > 0.05). Soil permeability was 10.9, 16.3, 15.7 and 17.9 mm/h for T1, T2, T3 and T4, respectively.
Keywords: rainfed agriculture, conservative tillage, energy consumption, wheat
Procedia PDF Downloads 206
18279 Quality by Design in the Optimization of a Fast HPLC Method for Quantification of Hydroxychloroquine Sulfate
Authors: Pedro J. Rolim-Neto, Leslie R. M. Ferraz, Fabiana L. A. Santos, Pablo A. Ferreira, Ricardo T. L. Maia-Jr., Magaly A. M. Lyra, Danilo A F. Fonte, Salvana P. M. Costa, Amanda C. Q. M. Vieira, Larissa A. Rolim
Abstract:
Initially developed as an antimalarial agent, hydroxychloroquine (HCQ) sulfate is often used as a slow-acting antirheumatic drug in the treatment of disorders of connective tissue. The United States Pharmacopeia (USP) 37 provides a reversed-phase HPLC method for quantification of HCQ. However, this method was not reproducible, producing asymmetric peaks in a long analysis time. The asymmetry of the peak may cause an incorrect calculation of the concentration of the sample. Furthermore, the analysis time is unacceptable, especially regarding the routine of a pharmaceutical industry. The aim of this study was to develop a fast, easy and efficient method for quantification of HCQ sulfate by High Performance Liquid Chromatography (HPLC) based on the Quality by Design (QbD) methodology. This method was optimized in terms of peak symmetry using the surface area graphic as the Design of Experiments (DoE) and the tailing factor (TF) as an indicator of the Design Space (DS). The reference method used was that described in USP 37 for the quantification of the drug. For the optimized method, a 3³ factorial design was proposed, based on the QbD concepts. The DS was created with the TF (in a range between 0.98 and 1.2) in order to demonstrate the ideal analytical conditions. Changes were made in the composition of the USP mobile phase (USP-MP): USP-MP:methanol (90:10 v/v, 80:20 v/v and 70:30 v/v), in the flow rate (0.8, 1.0 and 1.2 mL/min) and in the oven temperature (30, 35, and 40 ºC). The USP method allowed the quantification of the drug only in a long time (40-50 minutes). In addition, the method uses a high flow rate (1.5 mL/min), which increases the consumption of expensive HPLC-grade solvents. The main problem observed was the TF value (1.8), which would be acceptable only if the drug were not a racemic mixture, since co-elution of the isomers can lead to unreliable peak integration. Therefore, optimization was suggested in order to reduce the analysis time, aiming at a better peak resolution and TF. For the optimized method, analysis of the surface-response plot made it possible to confirm the ideal analytical condition: 45 ºC, 0.8 mL/min and 80:20 USP-MP:methanol. The optimized HPLC method enabled the quantification of HCQ sulfate with a peak of high resolution, showing a TF value of 1.17. This promotes good co-elution of the isomers of HCQ, ensuring an accurate quantification of the raw material as a racemic mixture. This method also proved to be approximately 18 times faster than the reference method, using a lower flow rate, reducing even more the consumption of the solvents and, consequently, the analysis cost. Thus, an analytical method for the quantification of HCQ sulfate was optimized using the QbD methodology. This method proved to be faster and more efficient than the USP method regarding the retention time and, especially, the peak resolution. The higher resolution of the chromatogram peaks supports the implementation of the method for quantification of the drug as a racemic mixture, not requiring the separation of isomers.
Keywords: analytical method, hydroxychloroquine sulfate, quality by design, surface area graphic
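For illustration only (not from the study), a minimal Python sketch of enumerating the 3³ factorial grid over the three factors named in the abstract and screening it against the tailing-factor window; the tailing_factor function below is a hypothetical placeholder, not the study's measured response surface.

```python
# Sketch of the 3^3 factorial grid (mobile-phase ratio, flow rate, oven
# temperature) screened against the design-space window 0.98 <= TF <= 1.2.
from itertools import product

ratios = ["90:10", "80:20", "70:30"]      # USP mobile phase : methanol (v/v)
flows = [0.8, 1.0, 1.2]                   # mL/min
temps = [30, 35, 40]                      # degrees C

def tailing_factor(ratio, flow, temp):
    # hypothetical response used only to illustrate the screening step
    organic = {"90:10": 10, "80:20": 20, "70:30": 30}[ratio]
    return 1.8 - 0.02 * organic - 0.01 * (temp - 30) + 0.2 * (flow - 0.8)

design_space = [(r, f, t) for r, f, t in product(ratios, flows, temps)
                if 0.98 <= tailing_factor(r, f, t) <= 1.2]
print(f"{len(design_space)} of {3**3} runs fall inside the TF window")
```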
Procedia PDF Downloads 639
18278 Development of In Situ Permeability Test Using Constant Discharge Method for Sandy Soils
Authors: A. Rifa’i, Y. Takeshita, M. Komatsu
Abstract:
The post-rain puddles that occur in the first yard of Prambanan Temple often disturb visitor activity. A drainage layer and a drainage system have been built before to avoid such a problem, but puddles still appeared after rain. The permeability parameter needs to be determined by a simpler procedure to find an exact method of solution. The instrument modelling proposed here follows the development of a field permeability testing instrument. This experiment used the proposed constant discharge method, in which a tube is supplied with a constant water flow. The procedure was carried out from unsaturated to saturated soil conditions. Volumetric water content (θ) was monitored by a soil moisture measurement device. The results give the relationship between k and θ, drawn by the numerical approach of the van Genuchten model. The optimum value of the parameter θr obtained from the test corresponds to very dry soil. The coefficient of permeability at a unit weight of 19.8 kN/m³ for unsaturated conditions was in the range of 3 x 10⁻⁶ cm/sec (Sr = 68%) to 9.98 x 10⁻⁴ cm/sec (Sr = 82%). The equipment and testing procedure developed in this research were quite effective, simple and easy to implement for determining the field permeability coefficient of sandy soil. Using the constant discharge method in the proposed permeability test, the value of the permeability coefficient under unsaturated conditions can be obtained without establishing the soil-water characteristic curve.
Keywords: constant discharge method, in situ permeability test, sandy soil, unsaturated conditions
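For illustration only, a minimal Python sketch of the van Genuchten-Mualem relative permeability k(θ) relationship of the kind used to relate permeability and volumetric water content; all parameter values here are assumed, not those fitted in the study.

```python
# Sketch of a van Genuchten-Mualem k(theta) curve with illustrative parameters.
import numpy as np

theta_r, theta_s = 0.05, 0.40        # residual / saturated water content (assumed)
Ks = 9.98e-4                          # saturated permeability, cm/s (value quoted in the abstract)
n = 2.0
m = 1.0 - 1.0 / n

def k_unsat(theta):
    Se = np.clip((theta - theta_r) / (theta_s - theta_r), 1e-9, 1.0)  # effective saturation
    return Ks * np.sqrt(Se) * (1.0 - (1.0 - Se**(1.0 / m))**m) ** 2

for theta in (0.15, 0.25, 0.35, 0.40):
    print(f"theta = {theta:.2f}  ->  k = {k_unsat(theta):.2e} cm/s")
```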
Procedia PDF Downloads 384
18277 Numerical Modelling of Dry Stone Masonry Structures Based on Finite-Discrete Element Method
Authors: Ž. Nikolić, H. Smoljanović, N. Živaljić
Abstract:
This paper presents a numerical model based on the finite-discrete element method for analysis of the structural response of dry stone masonry structures under static and dynamic loads. More precisely, each discrete stone block is discretized by finite elements. Material non-linearity, including fracture and fragmentation of discrete elements as well as cyclic behavior during dynamic load, is considered through contact elements which are implemented within a finite element mesh. The application of the model was conducted on several examples of these structures. The performed analysis shows high accuracy of the numerical results in comparison with the experimental ones and demonstrates the potential of the finite-discrete element method for modelling the response of dry stone masonry structures.
Keywords: dry stone masonry structures, dynamic load, finite-discrete element method, static load
Procedia PDF Downloads 414
18276 Combining the Fictitious Stress Method and Displacement Discontinuity Method in Solving Crack Problems in Anisotropic Material
Authors: Bahatti̇n Ki̇mençe, Uğur Ki̇mençe
Abstract:
In this study, the influence functions of the displacement discontinuity in an anisotropic elastic medium are obtained in order to produce the boundary element equations. A displacement discontinuity method (DDM) formulation is presented with the aim of modeling two-dimensional elastic fracture problems. This formulation is found by analytical integration of the fundamental solution along a straight-line crack. For this purpose, Kelvin's fundamental solutions for anisotropic media on an infinite plane are used to form dipoles from singular loads, and various combinations of these dipoles are used to obtain the influence functions of the displacement discontinuity. This study introduces a technique for coupling the fictitious stress method (FSM) and DDM; the reason for applying this technique to some examples is to demonstrate the effectiveness of the proposed coupling method. In this study, displacement discontinuity equations are obtained by using dipole solutions calculated from known singular force solutions in an anisotropic medium. The displacement discontinuity method obtained from the solutions of these equations and the fictitious stress method are combined and compared on various examples. One or more crack problems with various geometries in rectangular plates, in finite and infinite regions, under tensile stress were examined with the coupled FSM and DDM in the anisotropic environment, and the effectiveness of the coupled method was demonstrated. Since crack problems can be modeled more easily with DDM, its use has increased recently. In obtaining the displacement discontinuity equations, Papkovich functions were used, as in Crouch, and harmonic functions were chosen to satisfy various boundary conditions. A comparison is made between two indirect boundary element formulations, DDM and an extension of FSM, for solving problems involving cracks. Several numerical examples are presented, and the outcomes are compared to existing analytical or reference results.
Keywords: displacement discontinuity method, fictitious stress method, crack problems, anisotropic material
Procedia PDF Downloads 75
18275 A Novel Combination Method for Computing the Importance Map of Image
Authors: Ahmad Absetan, Mahdi Nooshyar
Abstract:
The importance map is an image-based measure and is a core part of the resizing algorithm. Importance measures include image gradients, saliency and entropy, as well as high-level cues such as face detectors, motion detectors and more. In this work we propose a new method to calculate the importance map: the importance map is generated automatically using a novel combination of image edge density and Harel saliency measurement. Experiments on different types of images demonstrate that our method effectively detects prominent areas and can be used in image resizing applications to protect important areas while preserving image quality.
Keywords: content-aware image resizing, visual saliency, edge density, image warping
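For illustration only, a minimal Python sketch of combining an edge-density map with a saliency map into an importance map; the Harel (graph-based) saliency itself is not implemented here, so a placeholder saliency array is assumed, and the weighting is arbitrary.

```python
# Sketch: importance map as a weighted combination of local edge density
# (gradient magnitude, box-averaged) and a precomputed saliency map.
import numpy as np
from scipy import ndimage

def importance_map(gray, saliency, alpha=0.5):
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    edges = ndimage.uniform_filter(np.hypot(gx, gy), size=9)  # local edge density
    edges /= edges.max() + 1e-12
    saliency = saliency / (saliency.max() + 1e-12)
    return alpha * edges + (1.0 - alpha) * saliency           # weighted combination

gray = np.random.rand(120, 160)          # stand-in for a grayscale image
saliency = np.random.rand(120, 160)      # stand-in for a Harel saliency map
imp = importance_map(gray, saliency)
print(imp.shape, float(imp.min()), float(imp.max()))
```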
Procedia PDF Downloads 582
18274 Speedup Breadth-First Search by Graph Ordering
Abstract:
Breadth-First Search (BFS) is a core graph algorithm that is widely used for graph analysis. As it is frequently used in many graph applications, improving BFS performance is essential. In this paper, we present a graph ordering method that reorders the graph nodes to achieve better data locality, thus improving BFS performance. Our method is based on the observation that sibling relationships dominate the cache access pattern during BFS traversal. Therefore, we propose a frequency-based model to construct the graph order. First, we optimize the graph order according to the nodes’ visit frequency: nodes with high visit frequency are processed with priority. Second, we try to maximize the overlap of child nodes layer by layer. As this is proved to be NP-hard, we propose a heuristic method that greatly reduces the preprocessing overheads. We conduct extensive experiments on 16 real-world datasets. The results show that our method achieves performance comparable to the state-of-the-art methods while the graph ordering overheads are only about 1/15.
Keywords: breadth-first search, BFS, graph ordering, graph algorithm
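For illustration only, a minimal Python sketch of BFS run on a reordered graph; the paper's frequency-based ordering is approximated here by sorting nodes by degree (a common locality-friendly proxy), not by the exact visit-frequency model or child-overlap heuristic described in the abstract.

```python
# Sketch: relabel nodes so that high-degree nodes get small IDs, then run BFS.
from collections import deque

def reorder_by_degree(adj):
    order = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
    new_id = {v: i for i, v in enumerate(order)}
    return {new_id[v]: sorted(new_id[u] for u in adj[v]) for v in adj}

def bfs(adj, source=0):
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in dist:
                dist[u] = dist[v] + 1
                queue.append(u)
    return dist

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3, 4], 3: [1, 2], 4: [2]}
print(bfs(reorder_by_degree(adj)))   # distances from the new highest-degree node
```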
Procedia PDF Downloads 138
18273 Oil Extraction from Sunflower Seed Using Green Solvent 2-Methyltetrahydrofuran and Isoamyl Alcohol
Authors: Sergio S. De Jesus, Aline Santana, Rubens Maciel Filho
Abstract:
The objective of this study was to choose and determine a green solvent system with extraction efficiencies similar to the traditional Bligh and Dyer method. Sunflower seed oil was extracted using the Bligh and Dyer method with 2-methyltetrahydrofuran and isoamyl alcohol at ratios of 1:1, 2:1, 3:1, 1:2 and 3:1. At the same time, comparative experiments were performed with chloroform and methanol at ratios of 1:1, 2:1, 3:1, 1:2 and 3:1. The comparison study was done using 5 replicates (n = 5). Statistical analysis was performed using Microsoft Office Excel (Microsoft, USA) to determine means, and Tukey’s Honestly Significant Difference test was used for comparison between treatments (α = 0.05). The results showed that the classic method with methanol and chloroform gave an oil extraction yield of 31-44% (w/w), versus 36-45% (w/w) using the green solvents. Among the two extraction methods, 2-methyltetrahydrofuran and isoamyl alcohol at a ratio of 2:1 provided the best results (45% w/w), while the classic method using chloroform and methanol at a ratio of 3:1 presented an oil extraction yield of 44% (w/w). It was concluded that the proposed extraction method using 2-methyltetrahydrofuran and isoamyl alcohol allowed the same efficiency level as chloroform and methanol.
Keywords: extraction, green solvent, lipids, sugarcane
Procedia PDF Downloads 381
18272 Relationship of Entrepreneurial Ecosystem Factors and Entrepreneurial Cognition: An Exploratory Study Applied to Regional and Metropolitan Ecosystems in New South Wales, Australia
Authors: Sumedha Weerasekara, Morgan Miles, Mark Morrison, Branka Krivokapic-Skoko
Abstract:
This paper is aimed at exploring the interrelationships among entrepreneurial ecosystem factors and entrepreneurial cognition in regional and metropolitan ecosystems. Entrepreneurial ecosystem factors examined include: culture, infrastructure, access to finance, informal networks, support services, access to universities, and the depth and breadth of the talent pool. Using a multivariate approach, we explore the impact of these ecosystem factors or elements on entrepreneurial cognition. In doing so, the existing bodies of knowledge from the literature on entrepreneurial ecosystems and cognition have been blended to explore the relationship between entrepreneurial ecosystem factors and cognition in a way not hitherto investigated. The concept of the entrepreneurial ecosystem has received increased attention as governments, universities and communities have started to recognize the potential of integrated policies, structures, programs and processes that foster entrepreneurship activities by supporting innovation, productivity and employment growth. The notion of entrepreneurial ecosystems has evolved and grown with the advancement of theoretical research and empirical studies. Incorporating external factors like culture, the political environment, and the economic environment within a single framework enhances the capacity to examine the whole system's functionality and to better understand the interaction of the entrepreneurial actors and factors within that framework. The literature on clusters underplays the role of entrepreneurs and entrepreneurial management in creating and co-creating organizations, markets, and supporting ecosystems. Entrepreneurs are only one actor, following a limited set of roles and dependent upon many other factors to thrive. As a consequence, entrepreneurs and relevant authorities should be aware of the other actors and factors with which they engage and on which they rely, and make strategic choices to achieve both their own and collective objectives. The study uses a stratified random sampling method to collect survey data from 12 different regions in regional and metropolitan areas of NSW, Australia. A questionnaire was administered online among 512 small and medium enterprise owners operating their businesses in the selected 12 regions of NSW, Australia. Data were analyzed using descriptive techniques and partial least squares structural equation modeling (PLS-SEM). The findings show that even though there are significant relationships among the entrepreneurial ecosystem factors themselves, there is a weak relationship between most entrepreneurial ecosystem factors and entrepreneurial cognition. In the metropolitan context, the availability of finance and informal networks have the largest impact on entrepreneurial cognition, while culture, infrastructure, and support services have the smallest impact, and the talent pool and universities have a moderate impact on entrepreneurial cognition. Interestingly, in a regional context, culture, availability of finance, and the talent pool have the highest impact on entrepreneurial cognition, while informal networks have the smallest impact and the remaining factors (infrastructure, universities, and support services) have a moderate impact on entrepreneurial cognition. These findings suggest the need for a location-specific strategy for supporting the development of entrepreneurial cognition.
Keywords: academic achievement, colour response card, feedback
Procedia PDF Downloads 143
18271 A Character Detection Method for Ancient Yi Books Based on Connected Components and Regressive Character Segmentation
Authors: Xu Han, Shanxiong Chen, Shiyu Zhu, Xiaoyu Lin, Fujia Zhao, Dingwang Wang
Abstract:
Character detection is an important issue for character recognition of ancient Yi books. The accuracy of detection directly affects the recognition effect for ancient Yi books. Considering the complex layout, the lack of standard typesetting and the mixed arrangement of images and texts, we propose a character detection method for ancient Yi books based on connected components and regressive character segmentation. First, the scanned images of ancient Yi books are preprocessed with non-local means filtering, and then a modified local adaptive threshold binarization algorithm is used to obtain binary images that segment the foreground and background. Second, the non-text areas are removed by a method based on connected components. Finally, the single characters in the ancient Yi books are segmented by our method. The experimental results show that the method can effectively separate the text areas and non-text areas of ancient Yi books and achieve higher accuracy and recall in the character detection experiment, effectively solving the problem of character detection and segmentation in character recognition of ancient books.
Keywords: CCS concepts, computing methodologies, interest point, salient region detections, image segmentation
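For illustration only, a minimal Python sketch of the connected-component filtering step: binarize, label components, and drop regions whose pixel count falls outside an assumed character-size range. The paper's non-local means filtering and modified local adaptive thresholding are replaced here by a crude global threshold, and the size limits are placeholders.

```python
# Sketch: keep connected components whose size is plausible for a character.
import numpy as np
from scipy import ndimage

def keep_text_components(gray, min_size=20, max_size=5000):
    binary = gray < gray.mean()                 # crude binarization (dark ink on light paper)
    labels, n = ndimage.label(binary)           # connected-component labelling
    sizes = np.bincount(labels.ravel())
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = (sizes[1:] >= min_size) & (sizes[1:] <= max_size)
    return keep[labels]                          # mask of likely character regions

page = np.random.rand(200, 150)                  # stand-in for a scanned page
mask = keep_text_components(page)
print("kept pixels:", int(mask.sum()))
```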
Procedia PDF Downloads 132
18270 Investigate and Solving Analytically at Vibrational structures (In Arched Beam to Bridges) by New Method “AGM”
Authors: M. R. Akbari, P. Soleimani, R. Khalili, Sara Akbari
Abstract:
Analyzing and modeling the vibrational behavior of arched bridges during an earthquake in order to decrease the damage exerted on the structure is a very hard task. This has been done analytically in the present paper for the first time. Due to the importance of arched bridges as great structures in human civilization, and their characteristics such as transferring vertical loads to their arches and the lack of bending moments and shearing forces, this case study is devoted to this special issue. Here, the nonlinear vibration of arched bridges has been modeled and simulated by an arched beam with harmonic vertical loads, and its behavior has been investigated by analyzing a nonlinear partial differential equation governing the system. It is notable that the procedure has been done analytically by AGM (Akbari-Ganji Method). Furthermore, comparisons have been made between the results obtained by the numerical method (RKF45) and AGM in order to assess the scientific validity.
Keywords: new method (AGM), arched beam bridges, angular frequency, harmonic loads
Procedia PDF Downloads 297
18269 An Accelerated Stochastic Gradient Method with Momentum
Authors: Liang Liu, Xiaopeng Luo
Abstract:
In this paper, we propose an accelerated stochastic gradient method with momentum. The momentum term is a weighted average of the generated gradients, and the weights decay inversely proportionally with the iteration count. Stochastic gradient descent with momentum (SGDM) uses weights that decay exponentially with the iteration count to generate the momentum term. Using exponential decay weights, variants of SGDM with inexplicable and complicated formats have been proposed to achieve better performance. However, the momentum update rules of our method are as simple as those of SGDM. We provide theoretical convergence analyses, which show that both the exponential decay weights and our inverse proportional decay weights can limit the variance of the parameter movement directly to a region. Experimental results show that our method works well on many practical problems and outperforms SGDM.
Keywords: exponential decay rate weight, gradient descent, inverse proportional decay rate weight, momentum
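For illustration only, a minimal Python sketch of one plausible reading of the update (the paper's exact weighting is not reproduced here): the momentum term is maintained as a running average in which the newest gradient receives a weight that decays inversely with the iteration count, in contrast to SGDM's exponentially decaying weights.

```python
# Sketch: momentum as a running average with inverse-proportional decay weights.
import numpy as np

def sgd_inverse_momentum(grad, x0, lr=0.1, iters=100):
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)
    for t in range(1, iters + 1):
        g = grad(x)
        m = (1.0 - 1.0 / t) * m + (1.0 / t) * g   # weight on new gradient decays like 1/t
        x = x - lr * m
    return x

# usage on a simple quadratic f(x) = 0.5 * ||x||^2, whose gradient is x
print(sgd_inverse_momentum(lambda x: x, [2.0, -1.5]))
```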
Procedia PDF Downloads 162
18268 Nonuniformity Correction Technique in Infrared Video Using Feedback Recursive Least Square Algorithm
Authors: Flavio O. Torres, Maria J. Castilla, Rodrigo A. Augsburger, Pedro I. Cachana, Katherine S. Reyes
Abstract:
In this paper, we present a scene-based nonuniformity correction method using a modified recursive least squares algorithm with a feedback system on the updates. The feedback is designed to remove the impulsive noise contamination produced by a recursive least squares algorithm by measuring the output of the proposed algorithm. The key advantage of the method is its capacity to estimate detector parameters and then compensate for impulsive noise contamination of the image on a frame-by-frame basis. We define the algorithm and present several experimental results to demonstrate the efficacy of the proposed method in comparison to several previously published recursive least squares-based methods. We show that the proposed method removes impulsive noise contamination from the image.
Keywords: infrared focal plane arrays, infrared imaging, least mean square, nonuniformity correction
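For illustration only, a minimal Python sketch of scene-based nonuniformity correction with a feedback gate: each pixel's gain and offset are adapted frame by frame toward a local spatial mean, and updates producing large, impulse-like errors are rejected. An LMS-style update stands in for the paper's modified recursive least squares; the window size, step size and gate threshold are assumptions.

```python
# Sketch: per-pixel gain/offset adaptation with an impulse-rejecting feedback gate.
import numpy as np
from scipy import ndimage

def nuc_step(frame, gain, offset, mu=1e-3, gate=3.0):
    corrected = gain * frame + offset
    target = ndimage.uniform_filter(corrected, size=5)   # local spatial mean as desired output
    err = corrected - target
    ok = np.abs(err) < gate * (err.std() + 1e-12)        # feedback: reject impulsive updates
    gain = gain - mu * err * frame * ok
    offset = offset - mu * err * ok
    return corrected, gain, offset

h, w = 64, 64
gain, offset = np.ones((h, w)), np.zeros((h, w))
for _ in range(10):                                      # stand-in for an infrared video sequence
    frame = np.random.rand(h, w) + 0.1 * np.random.randn(h, w)
    corrected, gain, offset = nuc_step(frame, gain, offset)
print(float(gain.mean()), float(offset.mean()))
```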
Procedia PDF Downloads 143
18267 Failure Simulation of Small-scale Walls with Chases Using the Lattice Discrete Element Method
Authors: Karina C. Azzolin, Luis E. Kosteski, Alisson S. Milani, Raquel C. Zydeck
Abstract:
This work aims to numerically reproduce tests developed experimentally on reduced-scale walls with horizontal and inclined chases, using the Lattice Discrete Element Method (LDEM) implemented in the Abaqus/Explicit environment. The cuts were made at depths of 20%, 30%, and 50% on walls subjected to centered and eccentric loading. The parameters used to evaluate the numerical model are its strength, the failure mode, and the in-plane and out-of-plane displacements.
Keywords: structural masonry, wall chases, small scale, numerical model, lattice discrete element method
Procedia PDF Downloads 177
18266 BTEX (Benzene, Toluene, Ethylbenzene and Xylene) Degradation by Cold Plasma
Authors: Anelise Leal Vieira Cubas, Marina de Medeiros Machado, Marília de Medeiros Machado
Abstract:
The volatile organic compounds BTEX (benzene, toluene, ethylbenzene, and xylene), petroleum derivatives, have high toxicity, which may carry consequences for human health, biota and the environment. In this direction, this paper proposes a method for treating these compounds using corona discharge plasma technology. The efficiency of the method was tested by analyzing samples of BTEX by gas chromatography after they passed through a plasma reactor. The results show that the optimal residence time of the sample in the reactor was 8 minutes.
Keywords: BTEX, degradation, cold plasma, ecological sciences
Procedia PDF Downloads 317
18265 Determine the Optimal Path of Content Adaptation Services with Max Heap Tree
Authors: Shilan Rahmani Azr, Siavash Emtiyaz
Abstract:
Recent developments in computing and communication technologies lead to much easier mobile access to information. Users can access information in different places using various devices with a wide variety of capabilities. Meanwhile, the format and details of electronic documents are changing each day. In these cases, a mismatch is created between the content and the client’s abilities. Recently, service-oriented content adaptation has been developed, in which the adaptation tasks are dedicated to some extended services. In this method, the main problem is to choose the most appropriate service among accessible and distributed services. In this paper, a method for determining the optimal path to the best services, based on quality control parameters and user preferences, is proposed using a max heap tree. The efficiency of this method, in contrast to previous content adaptation methods, lies in determining the optimal path of the best services, which is measured. The results show the advantages and progress of this method in comparison with the others.
Keywords: service-oriented content adaption, QoS, max heap tree, web services
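For illustration only, a minimal Python sketch of ranking adaptation services with a heap, scoring each service by QoS parameters weighted by user preferences; Python's heapq is a min-heap, so scores are negated to obtain max-heap behaviour. The service names, QoS values and weights are hypothetical.

```python
# Sketch: select the best adaptation service from a heap ordered by weighted QoS score.
import heapq

services = {
    "transcode-A": {"latency": 0.8, "cost": 0.3, "quality": 0.9},
    "transcode-B": {"latency": 0.5, "cost": 0.6, "quality": 0.7},
    "resize-C":    {"latency": 0.9, "cost": 0.4, "quality": 0.6},
}
preferences = {"latency": 0.5, "cost": 0.2, "quality": 0.3}   # user preference weights

heap = []
for name, qos in services.items():
    score = sum(preferences[k] * qos[k] for k in preferences)
    heapq.heappush(heap, (-score, name))                      # negate for max-heap behaviour

best_score, best_service = heap[0]
print(best_service, -best_score)
```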
Procedia PDF Downloads 259
18264 Investigation of the Effect of Excavation Step in NATM on Surface Settlement by Finite Element Method
Authors: Seyed Mehrdad Gholami
Abstract:
Nowadays, the use of rail transport systems (metro) is increasing in most cities of the world, so the need for safe and economical ways of building tunnels and subway stations is felt more and more. One of the most commonly used methods for constructing underground structures in urban areas is NATM (New Austrian Tunneling Method). In this method, there are some key parameters, such as the excavation step and cross-sectional area, that have a significant effect on the surface settlement. Settlement is a very important control factor related to safe excavation. In this paper, the finite element method is used through Abaqus. The R6 station of Tehran Metro Line 6 was built by NATM, and its construction is studied and analyzed. Considering the outcomes obtained from numerical modeling and comparison with the results of field instrumentation and monitoring, an excavation step of 1 meter and a longitudinal distance of 14 meters between side drifts are suggested to achieve safe tunneling with allowable settlement.
Keywords: excavation step, NATM, numerical modeling, settlement
Procedia PDF Downloads 139
18263 Ground Deformation Module for the New Laboratory Methods
Authors: O. Giorgishvili
Abstract:
One of the important characteristics for the calculation of foundations is the modulus of deformation (E0). As is well known, the main goal of calculating building foundations for deformation is to keep the base settlement and the differences in settlement within limits that do not cause cracks or changes in design levels that would be dangerous to the normal operation of the buildings and their individual structures. As is known from the literature and from practical application, the modulus of deformation is determined by two basic methods: the laboratory method, a soil compression test (without lateral widening), and soil testing in field conditions. The deformation modulus of soil determined by the field method is closer to the actual deformation modulus of the soil, but the complexity of the tests to be carried out and financial concerns often do not allow determination of the ground deformation modulus by the field method. Therefore, the ground modulus of deformation is usually determined by the compression method without lateral widening. Concerning this, we introduce a new laboratory procedure for determining the ground modulus of deformation that allows lateral widening and therefore reflects the actual modulus of deformation more accurately, closer to the modulus of deformation determined by the field method. In this regard, we present a new approach to the laboratory determination of the ground deformation modulus, which is performed with lateral widening. The tests and the results showed that the proposed method gives a ground deformation modulus closer to the results obtained in the field and thus reflects the foundation's behavior in real terms more accurately than the compression method without lateral widening.
Keywords: build, deformation modulus, foundations, ground, laboratory research
Procedia PDF Downloads 368
18262 Descent Algorithms for Optimization Algorithms Using q-Derivative
Authors: Geetanjali Panda, Suvrakanti Chakraborty
Abstract:
In this paper, Newton-like descent methods are proposed for unconstrained optimization problems, which use q-derivatives of the gradient of an objective function. First, a local scheme is developed with an alternative sufficient optimality condition, and then the method is extended to a global scheme. Moreover, a variant of the practical Newton scheme is also developed by introducing a real sequence. Global convergence of these schemes is proved under some mild conditions. Numerical experiments and graphical illustrations are provided. Finally, the performance profiles on a test set show that the proposed schemes are competitive with the existing first-order schemes for optimization problems.
Keywords: descent algorithm, line search method, q-calculus, quasi-Newton method
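For illustration only, a minimal Python sketch of a Newton-like step in which the second derivative is replaced by the Jackson q-derivative of the gradient, D_q g(x) = (g(qx) - g(x)) / ((q - 1) x); this is not the paper's full local/global scheme, and the test function, q value and safeguards are assumptions.

```python
# Sketch: 1-D Newton-like descent using the q-derivative of the gradient.
def q_derivative(g, x, q=0.9):
    return (g(q * x) - g(x)) / ((q - 1.0) * x)

def q_newton(grad, x0, q=0.9, iters=20):
    x = x0
    for _ in range(iters):
        if abs(x) < 1e-12:                 # q-derivative is undefined at x = 0
            break
        dq = q_derivative(grad, x, q)
        if abs(dq) < 1e-12:
            break
        x = x - grad(x) / dq               # Newton-like step with D_q in place of g'
    return x

# minimize f(x) = (x - 3)^2 + 1, whose gradient is 2(x - 3)
print(q_newton(lambda x: 2.0 * (x - 3.0), x0=10.0))
```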
Procedia PDF Downloads 398
18261 End-to-End Pyramid Based Method for Magnetic Resonance Imaging Reconstruction
Authors: Omer Cahana, Ofer Levi, Maya Herman
Abstract:
Magnetic Resonance Imaging (MRI) is a lengthy medical scan, which stems from a long acquisition time. Its length is mainly due to the traditional sampling theorem, which defines a lower bound for sampling. However, it is still possible to accelerate the scan by using a different approach such as Compressed Sensing (CS) or Parallel Imaging (PI). These two complementary methods can be combined to achieve a faster scan with high-fidelity imaging. To achieve that, two conditions must be satisfied: i) the signal must be sparse under a known transform domain, and ii) the sampling method must be incoherent. In addition, a nonlinear reconstruction algorithm must be applied to recover the signal. While the rapid advances in Deep Learning (DL) have had tremendous success in various computer vision tasks, the field of MRI reconstruction is still in its early stages. In this paper, we present an end-to-end method for MRI reconstruction from k-space to image. Our method contains two parts. The first is sensitivity map estimation (SME), which is a small yet effective network that can easily be extended to a variable number of coils. The second is reconstruction, which is a top-down architecture with lateral connections developed for building high-level refinement at all scales. Our method achieves state-of-the-art results on the fastMRI benchmark, which is the largest, most diverse benchmark for MRI reconstruction.
Keywords: magnetic resonance imaging, image reconstruction, pyramid network, deep learning
Procedia PDF Downloads 91
18260 An Event Relationship Extraction Method Incorporating Deep Feedback Recurrent Neural Network and Bidirectional Long Short-Term Memory
Authors: Yin Yuanling
Abstract:
A Deep Feedback Recurrent Neural Network (DFRNN) and a Bidirectional Long Short-Term Memory (BiLSTM) network are designed to address the problem of the low accuracy of traditional relationship extraction models. The method combines a deep feedback-based recurrent neural network (DFRNN) with a bidirectional long short-term memory (BiLSTM) approach: DFRNN, which extracts local features of the text based on a deep feedback recurrent mechanism; BiLSTM, which better extracts global features of the text; and self-attention, which extracts semantic information. Experiments show that the method achieves an F1 value of 76.69% on the CEC dataset, which is 0.0652 higher than the BiLSTM+Self-ATT model, thus optimizing the performance of the deep learning method on the event relationship extraction task.
Keywords: event relations, deep learning, DFRNN models, bi-directional long and short-term memory networks
Procedia PDF Downloads 144
18259 Bi-Directional Evolutionary Topology Optimization Based on Critical Fatigue Constraint
Authors: Khodamorad Nabaki, Jianhu Shen, Xiaodong Huang
Abstract:
This paper develops a method for considering the critical fatigue stress as a constraint in the Bi-directional Evolutionary Structural Optimization (BESO) method. Our aim is to reach an optimal design in which high-cycle fatigue failure does not occur for a specific lifetime. The critical fatigue stress is calculated based on the modified Goodman criterion and used as a stress constraint in our topology optimization problem. Since fatigue generally does not occur under compressive stresses, we use the p-norm approach to stress measurement, which considers the highest tensile principal stress at each point as the stress measure, to calculate the sensitivity numbers. The BESO method has been extended to minimize the volume of an object subjected to the critical fatigue stress constraint. The optimization results are compared with the results from the compliance minimization problem, which clearly shows the merits of our newly developed approach.
Keywords: topology optimization, BESO method, p-norm, fatigue constraint
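For illustration only, a minimal Python sketch of the two ingredients named in the abstract: a p-norm aggregation of tensile principal stresses and a modified-Goodman allowable stress; the stress values, material data and p exponent are illustrative, and the full BESO sensitivity analysis is not reproduced.

```python
# Sketch: p-norm stress aggregate checked against a modified-Goodman allowable.
import numpy as np

def goodman_allowable(sigma_m, sigma_e=200.0, sigma_u=500.0):
    # permissible alternating stress for a given mean stress (MPa, assumed data)
    return sigma_e * (1.0 - sigma_m / sigma_u)

def p_norm_stress(sigma, p=8.0):
    sigma_t = np.maximum(sigma, 0.0)          # fatigue driven by tensile stresses only
    return np.sum(sigma_t ** p) ** (1.0 / p)

sigma_amp = np.array([120.0, 90.0, 150.0, 60.0])   # element alternating principal stresses
sigma_mean = 50.0
aggregate = p_norm_stress(sigma_amp)
print("p-norm stress:", round(aggregate, 1),
      "allowable:", round(goodman_allowable(sigma_mean), 1),
      "feasible:", bool(aggregate <= goodman_allowable(sigma_mean)))
```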
Procedia PDF Downloads 295
18258 Study of Natural Convection Heat Transfer of Plate-Fin Heat Sink
Authors: Han-Taw Chen, Tzu-Hsiang Lin, Chung-Hou Lai
Abstract:
This study applies the inverse method and three-dimensional commercial CFD software in conjunction with experimental temperature data to investigate the heat transfer and fluid flow characteristics of a plate-fin heat sink in a rectangular closed enclosure. The inverse method with the finite difference method and the experimental temperature data is applied to determine the approximate heat transfer coefficient. Later, based on the obtained results, the zero-equation turbulence model is used to obtain the heat transfer and fluid flow characteristics between two fins. To validate the accuracy of the results obtained, a comparison of the heat transfer coefficients is made. The temperature obtained at selected measurement locations on the fin is also compared with experimental data. The effect of the height of the rectangular enclosure on the obtained results is discussed.
Keywords: inverse method, Fluent, heat transfer characteristics, plate-fin heat sink
Procedia PDF Downloads 389
18257 The Role of Two Macrophyte Species in Mineral Nutrient Cycling in Human-Impacted Water Reservoirs
Authors: Ludmila Polechonska, Agnieszka Klink
Abstract:
Biogeochemical studies of macrophytes shed light on element bioavailability, transfer through food webs and possible effects on the biota, and provide a basis for their practical application in aquatic monitoring and remediation. Measuring the accumulation of elements in plants can provide time-integrated information about the presence of chemicals in aquatic ecosystems. The aim of the study was to determine and compare the contents of micro- and macroelements in two cosmopolitan macrophytes, submerged Ceratophyllum demersum (hornwort) and free-floating Hydrocharis morsus-ranae (European frog-bit), in order to assess their bioaccumulation potential, the element stock accumulated in each plant and their role in nutrient cycling in small water reservoirs. Sampling sites were designated in 25 oxbow lakes in urban areas in Lower Silesia (SW Poland). At each sampling site, fresh whole plants of C. demersum and H. morsus-ranae were collected from squares of 1 x 1 meter where the species coexisted. European frog-bit was separated into leaves, stems and roots. For biomass measurement, all plants growing on 1 square meter were collected, dried and weighed. At the same time, water samples were collected from each reservoir and their pH and EC were determined. Water samples were filtered and acidified, and plant samples were digested in concentrated nitric acid. Next, the content of Ca, Cu, Fe, K, Mg, Mn, Ni and Zn was determined using atomic absorption spectrometry (AAS). Statistical analysis showed that C. demersum and the organs of H. morsus-ranae differed significantly in respect of metal content (Kruskal-Wallis ANOVA, p < 0.05). Contents of Cu, Mn, Ni and Zn were higher in hornwort, while European frog-bit contained more Ca, Fe, K and Mg. Bioaccumulation Factors (BCF = content in plant / concentration in water) showed a similar pattern of metal bioaccumulation: microelements were more intensively accumulated by hornwort and macroelements by frog-bit. Based on the BCF values, both species may be positively evaluated as good accumulators of Cu, Fe, Mn, Ni and Zn. However, the distribution of metals in H. morsus-ranae was uneven: the majority of the studied elements were retained in the roots, which may indicate the existence of physiological barriers developed for dealing with toxicity. Some of the Ca and K was actively transported to the stems, but only Mg was transported to the leaves. Although the biomass of C. demersum was two times greater than the biomass of H. morsus-ranae, the element off-take was greater only for Cu, Mn, Ni and Zn. Nevertheless, it can be stated that despite a relatively small biomass compared to other macrophytes, both species may have an influence on the removal of trace elements from aquatic ecosystems and, as they serve as food for some animals, also on the incorporation of toxic elements into food chains. There was a significant positive correlation between the content of Mn and Fe in water and in the roots of H. morsus-ranae (R = 0.51 and R = 0.60, respectively), as well as between the Cu concentration in water and in C. demersum (R = 0.41) (Spearman rank correlation, p < 0.05). High bioaccumulation rates and the correlations between plant and water element concentrations point to their possible use as passive biomonitors of aquatic pollution.
Keywords: aquatic plants, bioaccumulation, biomonitoring, macroelements, phytoremediation, trace metals
Procedia PDF Downloads 189
18256 Bioanalytical Method Development and Validation of Aminophylline in Rat Plasma Using Reverse Phase High Performance Liquid Chromatography: An Application to Preclinical Pharmacokinetics
Authors: S. G. Vasantharaju, Viswanath Guptha, Raghavendra Shetty
Abstract:
Introduction: Aminophylline is a methylxanthine derivative belonging to the bronchodilator class. A literature survey reveals that the reported methods rely on solid-phase extraction and liquid-liquid extraction, which are highly variable, time-consuming, costly and laborious. The present work aims to develop a simple, highly sensitive, precise and accurate high-performance liquid chromatography method for the quantification of aminophylline in rat plasma samples which can be utilized for preclinical studies. Method: Reverse-phase high-performance liquid chromatography. Results: Selectivity: Aminophylline and the internal standard were well separated from the co-eluted components, and there was no interference from endogenous material at the retention times of the analyte and the internal standard. The LLOQ measurable with acceptable accuracy and precision for the analyte was 0.5 µg/mL. Linearity: The developed and validated method is linear over the range of 0.5-40.0 µg/mL. The coefficient of determination was found to be greater than 0.9967, indicating the linearity of this method. Accuracy and precision: The accuracy and precision values for intra- and inter-day studies at low, medium and high quality control concentrations of aminophylline in plasma were within the acceptable limits. Extraction recovery: The method produced consistent extraction recovery at all 3 QC levels. The mean extraction recovery of aminophylline was 93.57 ± 1.28%, while that of the internal standard was 90.70 ± 1.30%. Stability: The results show that aminophylline is stable in rat plasma under the studied stability conditions and that it is also stable for about 30 days when stored at -80 ˚C. Pharmacokinetic studies: The method was successfully applied to the quantitative estimation of aminophylline in rat plasma following its oral administration to rats. Discussion: Preclinical studies require a rapid and sensitive method for estimating the drug concentration in rat plasma. The method described in our article includes a simple protein precipitation extraction technique with ultraviolet detection for quantification. The present method is simple and robust for fast, high-throughput sample analysis with a lower analysis cost for analyzing aminophylline in biological samples. In this proposed method, no interfering peaks were observed at the elution times of aminophylline and the internal standard. The method also had sufficient selectivity, specificity, precision and accuracy over the concentration range of 0.5-40.0 µg/mL. An isocratic separation technique was used, underlining the simplicity of the presented method.
Keywords: aminophylline, preclinical pharmacokinetics, rat plasma, RP-HPLC
Procedia PDF Downloads 222
18255 Design Systems and the Need for a Usability Method: Assessing the Fitness of Components and Interaction Patterns in Design Systems Using Atmosphere Methodology
Authors: Patrik Johansson, Selina Mardh
Abstract:
The present study proposes a usability test method, Atmosphere, to assess the fitness of components and interaction patterns of design systems. The method covers the user’s perception of the components of the system, the efficiency of the logic of the interaction patterns, perceived ease of use, as well as the user’s understanding of the intended outcome of interactions. These aspects are assessed by combining measures of first impression, visual affordance and expectancy. The method was applied to a design system developed for the design of an electronic health record system. The study was conducted involving 15 healthcare personnel. It could be concluded that the Atmosphere method provides tangible data that enable human-computer interaction practitioners to analyze and categorize components and patterns based on perceived usability, the success rate of identifying interactive components, and the success rate of understanding the intended outcome of components and interaction patterns.
Keywords: atomic design, atmosphere methodology, design system, expectancy testing, first impression testing, usability testing, visual affordance testing
Procedia PDF Downloads 180