Search results for: interval analysis method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 40348

39148 Study on Preparation and Storage of Jam Incorporating Carrots (Daucus carota), Banana (Musa acuminata) and Lime (Citrus aurantiifolia)

Authors: K. Premakumar, D. S. Rushani, H. N. Hettiarachchi

Abstract:

The production and consumption of preserved foods have gained much importance due to globalization, and they provide health benefits beyond their basic nutritional functions. Therefore, a study was conducted to develop a jam incorporating carrot, banana, and lime. Considering the findings of several preliminary studies, five formulations of the jam were prepared by blending different percentages of carrot and banana, including a control (where only carrot was added). The freshly prepared formulations were subjected to physicochemical and sensory analysis. Physicochemical parameters such as pH, TSS, titratable acidity, ascorbic acid content, total sugar and non-reducing sugar, organoleptic qualities such as colour, aroma, taste, spreadability and overall acceptability, and microbial quality (total plate count) were analyzed after formulation. Physicochemical analysis of the freshly prepared carrot-banana blend jam showed increasing trends in titratable acidity (from 0.8 to 0.96, as % of citric acid), ascorbic acid content (from 0.83 to 11.465 mg/100 ml) and reducing sugar (from 15.64 to 20.553%), and a change in TSS (from 70.05 to 67.5 °Brix), with an increase in carrot pulp from 50 to 100%. pH, total sugar, and non-reducing sugar were also reduced as carrot concentration increased. A five-point hedonic scale was used to evaluate the organoleptic characters. According to Duncan's multiple range test, the mean scores for all the assessed sensory characters varied significantly (p<0.05) among the freshly made carrot-banana blend jam formulations. Based on the physicochemical and sensory analysis, the most preferred carrot:banana combinations of 50:50, 100:0 and 80:20 (T1, T2, and T5) were selected for storage studies. The formulations were stored at 30 °C room temperature and 70-75% RH for 12 weeks. The physicochemical characteristics were measured at two-week intervals during storage.
Decreasing trends in pH and ascorbic acid and increasing trends in TSS, titratable acidity, total sugar, reducing sugar and non-reducing sugar were noted with the advancement of the 12-week storage period. The results of the chemical analysis showed significant differences (p<0.05) between the tested formulations. Sensory evaluation of the carrot-banana blend jams was done after 12 weeks by a panel of 16 semi-trained panelists. The sensory analysis showed significant differences (p<0.05) in organoleptic characters between the carrot-banana blend jam formulations. The highest overall acceptability was observed in the formulation with 80% carrot and 20% banana pulp. Microbiological analysis was carried out on the day of preparation and at 1, 2 and 3 months after preparation. No bacterial growth was observed in the freshly made carrot-banana blend jam. There were no counts of yeasts, moulds or coliforms in any treatment after the heat treatment or during the storage period. Bacterial counts (total plate counts) were observed only after three months of storage and remained below the critical level, so all formulations were microbiologically safe for consumption. Based on the physicochemical characteristics, sensory attributes, and microbial tests, the carrot-banana blend jam with 80% carrot and 20% banana (T2) was selected as the best formulation and could be stored for up to 12 weeks without any significant change in quality characteristics.

Keywords: formulations, physicochemical parameters, microbiological analysis, sensory evaluation

Procedia PDF Downloads 202
39147 Sound Analysis of Young Broilers Reared under Different Stocking Densities in Intensive Poultry Farming

Authors: Xiaoyang Zhao, Kaiying Wang

Abstract:

The choice of stocking density in poultry farming is a potential determinant of poultry welfare. However, it is difficult to assess stocking densities in poultry farming because of many variables such as species, age and weight, feeding method, house structure and geographical location across broiler houses. This paper proposes a method to measure the differences among young broilers reared under different stocking densities by sound analysis. Vocalisations of broilers were recorded and analysed under different stocking densities to identify the relationship between sounds and stocking density. Recordings were made continuously for three-week-old chickens in order to evaluate the variation of sounds emitted by the animals from the beginning. The experimental trial was carried out in an indoor broiler farm; the audio recording procedure lasted 5 days. Broilers were divided into 5 groups with stocking density treatments of 8/m², 10/m², 12/m² (96 birds/pen), 14/m² and 16/m²; all conditions, including ventilation and feeding, were kept the same across groups except for stocking density. The recording and analysis of the chickens' sounds were made noninvasively. Sound recordings were manually analysed and labelled using sound analysis software (GoldWave Digital Audio Editor). After the sound acquisition process, Mel Frequency Cepstrum Coefficients (MFCC) were extracted from the sound data, and a Support Vector Machine (SVM) was used as an early detector and classifier. This preliminary study, conducted in an indoor broiler farm, shows that the method can classify the sounds of chickens under different densities economically (only a cheap microphone and recorder are needed), with a classification accuracy of 85.7%. The method could help predict the optimum stocking density of broilers when complemented by animal welfare indicators, productivity indicators and so on.
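The feature-extraction step described above can be sketched as follows. This is a generic, minimal MFCC computation in Python/NumPy, not the authors' implementation; the frame length, sample rate and filter counts are illustrative assumptions. A classifier such as an SVM would then be trained on the resulting coefficient vectors (omitted here).

```python
import numpy as np

def mel(f_hz):
    """Convert frequency in Hz to the mel scale."""
    return 2595.0 * np.log10(1.0 + f_hz / 700.0)

def mel_inv(m):
    """Convert mel-scale values back to Hz."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(frame, sr=8000, n_fft=256, n_mels=20, n_ceps=12):
    """MFCC of one windowed frame: power spectrum -> triangular
    mel filterbank -> log -> DCT-II (first n_ceps coefficients)."""
    spec = np.abs(np.fft.rfft(frame[:n_fft] * np.hamming(n_fft))) ** 2
    freqs = np.fft.rfftfreq(n_fft, 1.0 / sr)
    # filterbank edge frequencies, equally spaced on the mel scale
    pts = mel_inv(np.linspace(mel(0.0), mel(sr / 2.0), n_mels + 2))
    fbank = np.zeros((n_mels, freqs.size))
    for i in range(n_mels):
        lo, c, hi = pts[i], pts[i + 1], pts[i + 2]
        fbank[i] = np.clip(np.minimum((freqs - lo) / (c - lo),
                                      (hi - freqs) / (hi - c)), 0.0, None)
    log_energy = np.log(fbank @ spec + 1e-10)
    # DCT-II decorrelates the log filterbank energies
    k = np.arange(n_ceps)[:, None]
    n = np.arange(n_mels)[None, :]
    return np.cos(np.pi * k * (n + 0.5) / n_mels) @ log_energy
```

In practice one would compute these coefficients for many overlapping frames per recording and feed the frame-level (or aggregated) vectors to the SVM.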

Keywords: broiler, stocking density, poultry farming, sound monitoring, Mel Frequency Cepstrum Coefficients (MFCC), Support Vector Machine (SVM)

Procedia PDF Downloads 159
39146 A Study on the Safety Evaluation of Pier According to the Water Level Change by the Monte-Carlo Method

Authors: Minho Kwon, Jeonghee Lim, Yeongseok Jeong, Donghoon Shin, Kiyoung Kim

Abstract:

Recently, global warming has led to natural disasters caused by global environmental change, and due to abnormal weather events, the frequency and intensity of heavy rainstorms and typhoons are increasing. It is therefore imperative to prepare for future heavy rainstorms and typhoons. This study selects an arbitrary target bridge and performs numerical analysis to evaluate the safety of its piers as the water level changes. The numerical model is based on two-dimensional surface elements. Actual reinforced concrete was simulated by modeling the concrete to include reinforcement, and a contact boundary model was applied between the ground and the concrete. The water level applied to the piers was considered at 18 levels between 7.5 m and 16.1 m. The elastic modulus, compressive strength, tensile strength, and yield strength of the reinforced concrete were sampled in 250 random combinations, and numerical analysis was carried out for each water level. In the results, the bridge exceeded the stated limit at a water level of 15.0 m. At the maximum water level of 16.1 m, the concrete's failure probability was 35.2%, while the probability that the reinforcement would fail was 61.2%.
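The probabilistic part of the study can be illustrated with a minimal Monte-Carlo sketch. The distributions and the fixed demand values below are hypothetical stand-ins: in the paper, each of the 250 sampled parameter combinations is run through a finite-element analysis per water level rather than compared with a constant demand.

```python
import random

def failure_probability(n_trials=250, seed=1):
    """Monte-Carlo sketch: sample concrete compressive strength and steel
    yield strength, and count how often each falls below an assumed stress
    demand at a high water level (all values hypothetical)."""
    random.seed(seed)
    fails_conc = fails_steel = 0
    for _ in range(n_trials):
        fc = random.gauss(27.0, 4.0)    # MPa, assumed concrete strength
        fy = random.gauss(400.0, 40.0)  # MPa, assumed steel yield strength
        stress_c = 24.0                 # MPa, assumed concrete stress demand
        stress_s = 410.0                # MPa, assumed reinforcement demand
        if stress_c > fc:
            fails_conc += 1
        if stress_s > fy:
            fails_steel += 1
    return fails_conc / n_trials, fails_steel / n_trials
```

The ratio of failing samples to total samples is the estimated failure probability, mirroring how the 35.2% and 61.2% figures would be obtained from the 250 runs.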

Keywords: Monte-Carlo method, pier, water level change, limit state

Procedia PDF Downloads 285
39145 Globalization and Women's Social Identity in Iran: A Case Study of Educated Women in the 'World City' of Yazd

Authors: Mohammad Tefagh

Abstract:

The process of globalization has transformed many social and cultural phenomena and has ushered the world into a new era and arena. This phenomenon has introduced new methods, ideas, and identity interactions to human beings and has caused great changes in individual and social identity. Women have also been affected by globalization, which has made their presence more and more effective and has caused changes in their identity and its dimensions. The purpose of this study is to investigate the impact of the globalization of culture on changes in the social identity of educated women in the 'world city' of Yazd. The study discusses identity change and identity reconstruction due to globalization. The method of this study is qualitative, and the research data were obtained through in-depth interviews with 15 Yazdi educated women at the Ph.D. level. The method of data analysis is thematic analysis. Findings show that educated Yazdi women have changed their identity due to new communication processes and globalization, including faster, easier, and cheaper communication with other women around the world, near and far. Women's social identity has also changed in the face of elements of globalization in various dimensions such as national, gender, religious, and group identity. The analysis of the interviews revealed confronting elements such as the use of new cultural goods and communication technologies, membership in social networks, and increasing awareness of environmental change.

Keywords: globalization, social identity, educated women, Yazd

Procedia PDF Downloads 331
39144 Genetic Structure Analysis through Pedigree Information in a Closed Herd of the New Zealand White Rabbits

Authors: M. Sakthivel, A. Devaki, D. Balasubramanyam, P. Kumarasamy, A. Raja, R. Anilkumar, H. Gopi

Abstract:

The New Zealand White is one of the most commonly used and well-adapted exotic rabbit breeds in India. Earlier studies were limited to analyzing the environmental factors affecting growth and reproductive performance. In the present study, a closed herd of New Zealand White rabbits was evaluated for its genetic structure. Pedigree information (n=2508) covering 18 years (1995-2012) was utilized. Pedigree analysis and estimation of population genetic parameters based on gene-origin probabilities were performed with the software ENDOG (version 4.8). The analysis revealed mean values of generation interval, coefficient of inbreeding and equivalent inbreeding of 1.489 years, 13.233 percent and 17.585 percent, respectively. The proportion of the population inbred was 100 percent. The estimated mean values of average relatedness and individual increase in inbreeding were 22.727 and 3.004 percent, respectively. The percent increase in inbreeding over generations was 1.94, 3.06 and 3.98 when estimated through maximum generations, equivalent generations, and complete generations, respectively. The number of ancestors contributing 50% of the genes (fₐ₅₀) to the gene pool of the reference population was 4, which might have led to the reduction in genetic variability and the increased amount of inbreeding. The extent of genetic bottleneck, assessed through the effective number of founders (fₑ) and the effective number of ancestors (fₐ) expressed as the fₑ/fₐ ratio, was 1.1, indicative of the absence of stringent bottlenecks. Up to the 5th generation, 71.29 percent of the pedigree was complete, reflecting well-maintained pedigree records. The maximum number of known generations was 15, with an average of 7.9, and the average number of equivalent generations traced was 5.6, indicating fairly good pedigree depth.
The realized effective population size was 14.93, which is critically low, and with the increasing trend of inbreeding the situation is expected to worsen in the future. The proportion of animals with a genetic conservation index (GCI) greater than 9 was 39.10 percent; animals with higher GCI can be used to maintain balanced contributions from the founders. The study showed that the herd was completely inbred, with a very high inbreeding coefficient, and that the effective population size was critical. Recommendations were made to reduce the probability of deleterious effects of inbreeding and to improve the genetic variability of the herd. The present study can serve as a template for similar studies aimed at meeting the demand for animal protein in developing countries.
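The link between the individual increase in inbreeding and the realized effective population size is the standard relation Ne = 1/(2ΔF). A one-line sketch (note that plugging in the reported ΔF = 3.004% gives ≈ 16.6, close to but not exactly the reported 14.93, since ENDOG estimates ΔF on a per-equivalent-generation basis):

```python
def realized_ne(delta_f):
    """Realized effective population size from the mean individual
    increase in inbreeding, as a fraction: Ne = 1 / (2 * delta_f)."""
    return 1.0 / (2.0 * delta_f)

ne = realized_ne(0.03004)  # delta_f = 3.004% from the pedigree analysis
```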

Keywords: effective population size, genetic structure, pedigree analysis, rabbit genetics

Procedia PDF Downloads 292
39143 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis

Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu

Abstract:

Google Trends data has gained increasing popularity in applications of behavioral finance, decision science and risk management. Because of Google's wide range of use, Trends statistics provide significant information about investor sentiment and intention, which can serve as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational methods because of the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method successfully classifies various anomalies across a wide variety of data.
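A minimal sketch of the idea (not the authors' convex-optimization formulation): track the baseline mean and noise power with exponential forgetting, and flag samples that deviate by more than k adaptive standard deviations. The forgetting factor and k below are illustrative assumptions.

```python
def online_anomaly_flags(xs, alpha=0.05, k=3.0):
    """Online sketch: maintain an exponentially forgetting estimate of the
    baseline mean and variance; flag samples beyond k adaptive std devs."""
    mean, var = xs[0], 1.0
    flags = []
    for x in xs:
        std = var ** 0.5
        flags.append(abs(x - mean) > k * std)   # adaptive threshold test
        mean = (1 - alpha) * mean + alpha * x   # update baseline estimate
        var = (1 - alpha) * var + alpha * (x - mean) ** 2
    return flags
```

Because the threshold follows the estimated noise power, a slow drift in the baseline does not trigger false alarms, while a sudden query spike does.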

Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding

Procedia PDF Downloads 166
39142 Environmental Radioactivity Analysis by a Sequential Approach

Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab

Abstract:

Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and to assess the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help get around this difficulty, but unfortunately new issues are raised by gamma-ray attenuation and self-absorption. Recently, a new method was suggested to detect and identify, without quantification and in a short time, the gamma rays of a low-count source. This method does not require a pulse-height spectrum acquisition, as usually adopted in gamma spectrometry measurements. It is based on a chronological record of each detected photon through simultaneous measurement of its energy ε and its arrival time τ at the detector, the pair of parameters [ε, τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for appropriate health oversight of the public and of the workers concerned, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. As an illustration, we consider the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by an HPGe semiconductor detector, of the 63 keV gamma rays emitted by 234Th (progeny of 238U).
The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach to environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated drawbacks. The work is still in progress.
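The sequential aspect can be illustrated with a toy Poisson counting model: after each counting interval, the posterior probability that a source contributes an extra rate on top of the background is updated from the observed counts. The rates and counts below are illustrative, not the simulated 63 keV data.

```python
import math

def sequential_detection(counts_per_bin, bg_rate, src_rate, prior=0.5):
    """Sequential Bayesian sketch: update P(source present | data) after
    each counting interval, comparing Poisson(bg + src) vs Poisson(bg)."""
    def pois(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)
    p = prior
    trace = []
    for k in counts_per_bin:
        l1 = pois(k, bg_rate + src_rate)   # likelihood: source present
        l0 = pois(k, bg_rate)              # likelihood: background only
        p = p * l1 / (p * l1 + (1 - p) * l0)
        trace.append(p)
    return trace
```

A decision can be taken as soon as the posterior crosses a confidence threshold, which is what allows the measurement time to be shortened relative to accumulating a full spectrum.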

Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method

Procedia PDF Downloads 495
39141 A Static and Dynamic Slope Stability Analysis of Sonapur

Authors: Rupam Saikia, Ashim Kanti Dey

Abstract:

Sonapur is an intensely hilly region on the border of Assam and Meghalaya in North-East India and lies very near a seismic fault named Dauki, which makes the region seismically active. Moreover, two recent earthquakes of magnitude 6.7 and 6.9 struck North-East India in January and April 2016. The slope considered in this study is adjacent to NH 44, which has long been the sole important link to the states of Manipur and Mizoram along with some parts of Assam, and has thus been a cause of considerable loss of life and property over past decades, with several recorded incidents of landslides, road blocks, etc., mostly during the rainy season. Motivated by this, this paper reports a static and dynamic slope stability analysis of Sonapur carried out in MIDAS GTS NX. The slope being highly unreachable due to terrain and thick vegetation, in-situ testing was not feasible within the current scope, so disturbed soil samples were collected from the site for the determination of strength parameters. The strength parameters were determined for varying relative density, with further variation in water content. The slopes were analyzed under plane-strain conditions for three slope heights of 5 m, 10 m and 20 m, each further categorized by slope angles of 30°, 40°, 50°, 60°, and 70° to cover the possible extent of steepness. Initially, static analysis under dry conditions was performed; then, considering the worst case that can develop during the rainy season, the slopes were analyzed for the fully saturated condition along with partial degrees of saturation with a rising waterfront. Furthermore, dynamic analysis was performed, using the El Centro earthquake record (magnitude 6.7, peak ground acceleration 0.3569 g at 2.14 s), for the slopes found to be safe in the static analysis under both dry and fully saturated conditions.
Some of the conclusions were: slopes with inclinations of 40° and above were found to be highly vulnerable for slope heights of 10 m and above, even under dry static conditions. Maximum horizontal displacement showed an exponential increase with an increase in inclination from 30° to 70°. The vulnerability of the slopes was seen to increase further during the rainy season, as even a slope of minimal steepness (30°) and 20 m height was on the verge of failure. Also, slopes safe under static analysis were found to be highly vulnerable under dynamic analysis. Lastly, as part of the study, a comparison between the Strength Reduction Method (SRM) and the Limit Equilibrium Method (LEM) was carried out, and some advantages and disadvantages of each were identified.
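Although the paper uses a full finite-element model in MIDAS GTS NX, the qualitative effect of saturation on stability can be illustrated with the classical infinite-slope factor of safety, where pore pressure enters through the ratio r_u = u/(γz). All parameter values in the usage lines are illustrative, not the Sonapur soil data.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, depth, beta_deg, ru=0.0):
    """Infinite-slope factor of safety sketch:
    FS = [c + (gamma*z*cos^2(beta) - u) * tan(phi)]
         / (gamma*z*sin(beta)*cos(beta)),  with u = ru * gamma * z."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    sigma_n = gamma * depth * math.cos(beta) ** 2   # normal stress, kPa
    u = ru * gamma * depth                          # pore pressure, kPa
    tau = gamma * depth * math.sin(beta) * math.cos(beta)  # shear demand
    return (c + (sigma_n - u) * math.tan(phi)) / tau

fs_dry = infinite_slope_fs(c=10.0, phi_deg=30.0, gamma=18.0,
                           depth=5.0, beta_deg=30.0)
fs_wet = infinite_slope_fs(c=10.0, phi_deg=30.0, gamma=18.0,
                           depth=5.0, beta_deg=30.0, ru=0.5)
```

Raising r_u cuts the effective normal stress and hence the mobilized friction, which is the mechanism behind the rainy-season vulnerability reported above.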

Keywords: dynamic analysis, factor of safety, slope stability, strength reduction method

Procedia PDF Downloads 259
39140 Numerical Modelling of Laminated Shells Made of Functionally Graded Elastic and Piezoelectric Materials

Authors: Gennady M. Kulikov, Svetlana V. Plotnikova

Abstract:

This paper focuses on implementation of the sampling surfaces (SaS) method for the three-dimensional (3D) stress analysis of functionally graded (FG) laminated elastic and piezoelectric shells. The SaS formulation is based on choosing, inside the nth layer, Iₙ not-equally-spaced SaS parallel to the middle surface of the shell, in order to introduce the electric potentials and displacements of these surfaces as basic shell variables. This choice of unknowns permits a very compact presentation of the proposed FG piezoelectric shell formulation. The SaS are located inside each layer at Chebyshev polynomial nodes, which improves the convergence of the SaS method significantly. As a result, the SaS formulation can be applied efficiently to obtain 3D solutions for FG piezoelectric laminated shells, which asymptotically approach the exact solutions of piezoelectricity as the number of SaS Iₙ goes to infinity.
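The placement of sampling surfaces at Chebyshev polynomial nodes can be sketched as follows, on the normalized thickness coordinate [-1, 1] of a single layer. This is only the node-placement step; in the full SaS formulation the outermost surfaces coincide with the layer interfaces, and the shell unknowns are interpolated through these nodes.

```python
import math

def chebyshev_sas(n_sas):
    """Chebyshev polynomial nodes x_k = cos((2k - 1)*pi / (2*n)) on [-1, 1],
    used to place the inner sampling surfaces through the layer thickness.
    Clustering near the faces is what improves convergence vs equal spacing."""
    return [math.cos(math.pi * (2 * k - 1) / (2 * n_sas))
            for k in range(1, n_sas + 1)]

nodes = chebyshev_sas(5)  # 5 sampling surfaces, ordered from +1 side to -1 side
```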

Keywords: electroelasticity, functionally graded material, laminated piezoelectric shell, sampling surfaces method

Procedia PDF Downloads 687
39139 Dynamic Analysis of Double Deck Tunnel

Authors: C. W. Kwak, I. J. Park, D. I. Jang

Abstract:

Cost-effective design and construction are becoming more important due to the surge in traffic volume in metropolitan cities. Accordingly, the need for tunnels with large cross-sections is becoming more critical, and the double deck tunnel can be one of the most appropriate solutions to this need. The dynamic stability of a double deck tunnel against seismic loads is essential, since it has a large cross-section and a connection between the perimeter lining and the interim slab. In this study, a 3-dimensional dynamic numerical analysis based on the Finite Difference Method was conducted to investigate the seismic behavior of a double deck tunnel. A seismic joint for dynamic stability and for mitigating the seismic impact on the lining was considered in the modeling and analysis. Consequently, the mitigation of acceleration, lining displacement and stress was verified successfully.

Keywords: double deck tunnel, interim slab, 3-dimensional dynamic numerical analysis, seismic joint

Procedia PDF Downloads 379
39138 Numerical Solution of Integral Equations by Using Discrete GHM Multiwavelet

Authors: Archit Yajnik, Rustam Ali

Abstract:

In this paper, a numerical method based on discrete GHM multiwavelets is presented for solving Fredholm integral equations of the second kind. There is hardly any article available in the literature in which integral equations are numerically solved using discrete GHM multiwavelets. A number of examples are demonstrated to justify the applicability of the method. In GHM multiwavelets, the values of the scaling and wavelet functions are calculated only at t = 0, 0.5 and 1. The numerical solution obtained by the present approach is compared with the traditional quadrature method. It is observed that the present approach is more accurate and computationally efficient than the quadrature method.
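For reference, the traditional quadrature (Nyström) method that the paper compares against can be sketched as follows for u(t) = f(t) + λ∫₀¹ K(t,s)u(s)ds, here with the trapezoidal rule and a naive dense linear solve. The kernel and right-hand side in the test are illustrative, not the paper's examples.

```python
def nystrom_fredholm(kernel, f, lam, n=50):
    """Nystrom sketch for u(t) = f(t) + lam * int_0^1 K(t,s) u(s) ds:
    discretize the integral with the trapezoidal rule, then solve the
    dense linear system (I - lam*K*W) u = f by Gauss-Jordan elimination."""
    h = 1.0 / n
    ts = [i * h for i in range(n + 1)]
    w = [h] * (n + 1)
    w[0] = w[-1] = h / 2               # trapezoidal weights
    m = n + 1
    A = [[(1.0 if i == j else 0.0) - lam * kernel(ts[i], ts[j]) * w[j]
          for j in range(m)] for i in range(m)]
    b = [f(t) for t in ts]
    for col in range(m):               # Gauss-Jordan; diagonally dominant here
        piv = A[col][col]
        for j in range(col, m):
            A[col][j] /= piv
        b[col] /= piv
        for r in range(m):
            if r != col and A[r][col]:
                factor = A[r][col]
                for j in range(col, m):
                    A[r][j] -= factor * A[col][j]
                b[r] -= factor * b[col]
    return ts, b
```

With K(t,s) = ts, λ = 1 and f(t) = 2t/3, the exact solution is u(t) = t, so the discrete solution should recover the identity to within the trapezoidal error.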

Keywords: GHM multiwavelet, Fredholm integral equations, quadrature method, function approximation

Procedia PDF Downloads 460
39137 The Analysis of Drill Bit Optimization by the Application of New Electric Impulse Technology in Shallow Water Absheron Peninsula

Authors: Ayshan Gurbanova

Abstract:

Although the drill bit, the smallest part of the bottom hole assembly, accounts for only 10% to 15% of total expenses, it is the first piece of equipment in contact with the formation itself. Hence, it is consequential to choose the appropriate type and dimension of drill bit, which prevents many problems by not demanding numerous tripping procedures. With advances in technology, it is now possible to gain benefits in terms of operating time, energy, expenditure, power and so forth. With the intention of applying the method in Azerbaijan, the Shallow Water Absheron Peninsula field has been suggested, whose mainland lies 15 km away from the wildcat well named "NKX01". It has a water depth of 22 m. In 2015 and 2016, 2D and 3D seismic survey analyses were conducted in the contract area as well as at onshore shallow-water locations. With the aim of clear elucidation, soil stability, possibly dangerous submersible scenarios, geohazards and bathymetry surveys were carried out as well. Based on the seismic analysis results, the exact locations of the exploration wells were determined, and along with this, measurement decisions were made to divide the land into three productive zones. As for the method itself, Electric Impulse Technology (EIT) is based on discharging electrical energy through the rock. Put simply, very high voltages can be generated within nanoseconds and sent into the rock through the bit's electrodes: a high-voltage powered electrode and a grounded electrode placed on the formation, which may be submerged in liquid. With this design, it is easier to drill a horizontal well owing to the loose contact with the formation.
There is also no wear, as no combustion or mechanical power is involved. In terms of energy, conventional drilling requires about 1000 J/cm³, whereas EIT requires between 100 and 200 J/cm³. Last but not least, test analyses have yielded ROP values of more than 2 m/hr over 15 days. Taking everything into consideration, the comparison of data analysis suggests that this method is highly applicable to the fields of Azerbaijan.

Keywords: drilling, drill bit cost, efficiency, cost

Procedia PDF Downloads 72
39136 Automatic Segmentation of Lung Pleura Based On Curvature Analysis

Authors: Sasidhar B., Bhaskar Rao N., Ramesh Babu D. R., Ravi Shankar M.

Abstract:

Segmentation of the lung pleura is a preprocessing step in Computer-Aided Diagnosis (CAD) which helps reduce false positives in the detection of lung cancer. Existing methods fail to extract lung regions when nodules lie at the pleura of the lungs. In this paper, a new method is proposed which segments lung regions, including nodules at the pleura, based on curvature analysis and morphological operators. The proposed algorithm was tested on a six-patient dataset consisting of 60 images from the Lung Image Database Consortium (LIDC), and the results are found to be satisfactory, with a 98.3% average overlap measure (AΩ).
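Two building blocks mentioned above can be sketched generically: a morphological closing (the kind of operator that can re-attach juxtapleural nodules which plain thresholding cuts off the lung mask) and the overlap measure AΩ used for scoring. This is generic image-processing code, not the authors' algorithm; note the toy dilation/erosion below wraps at image borders via np.roll, which a real implementation would avoid.

```python
import numpy as np

def binary_close(mask, r=1):
    """Morphological closing (dilation then erosion) with a 3x3 square
    structuring element, repeated r times each."""
    def dilate(m):
        out = m.copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out |= np.roll(np.roll(m, dy, 0), dx, 1)
        return out
    def erode(m):
        out = m.copy()
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out &= np.roll(np.roll(m, dy, 0), dx, 1)
        return out
    for _ in range(r):
        mask = dilate(mask)
    for _ in range(r):
        mask = erode(mask)
    return mask

def overlap(a, b):
    """Overlap measure A-omega = |A intersect B| / |A union B|."""
    return (a & b).sum() / (a | b).sum()
```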

Keywords: curvature analysis, image segmentation, morphological operators, thresholding

Procedia PDF Downloads 594
39135 Proposal of a Model Supporting Decision-Making Based on Multi-Objective Optimization Analysis on Information Security Risk Treatment

Authors: Ritsuko Kawasaki (Aiba), Takeshi Hiromatsu

Abstract:

Management is required to understand all information security risks within an organization and to decide which risks should be treated, to what level, and at what cost. However, such decision-making is not usually easy, because various measures for risk treatment must be selected with suitable application levels. In addition, some measures may have conflicting objectives, which also makes the selection difficult. Moreover, risks generally have trends, which should also be considered in risk treatment. Therefore, this paper provides an extension of the model proposed in our previous study. The original model supports the selection of measures by applying a combination of the weighted average method and the goal programming method for multi-objective analysis to find an optimal solution. The extended model adds weights to the risks, where a larger weight means a higher priority of the risk.
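The selection step can be illustrated with a small brute-force sketch: choose which measures to apply so that the weighted sum of residual risks is minimized within a budget. The measures, costs and risk weights below are hypothetical, and the paper combines weighted averaging with goal programming rather than enumerating choices; this only shows the shape of the decision problem.

```python
from itertools import product

def select_measures(measures, budget, risk_weights):
    """Brute-force sketch: pick an application level (0 = not applied,
    1 = applied) per measure, minimizing the weighted residual risk
    subject to a total cost budget."""
    best, best_score = None, float("inf")
    names = list(measures)
    for choice in product((0, 1), repeat=len(names)):
        cost = sum(measures[n]["cost"] * c for n, c in zip(names, choice))
        if cost > budget:
            continue
        residual = 0.0
        for rid, w in risk_weights.items():
            reduction = sum(measures[n]["reduces"].get(rid, 0.0) * c
                            for n, c in zip(names, choice))
            residual += w * max(0.0, 1.0 - reduction)
        if residual < best_score:
            best, best_score = choice, residual
    return dict(zip(names, best)), best_score

measures = {"firewall": {"cost": 3.0, "reduces": {"r1": 0.8}},
            "training": {"cost": 2.0, "reduces": {"r2": 0.6}}}
weights = {"r1": 2.0, "r2": 1.0}   # larger weight = higher-priority risk
```

With a tighter budget the high-weight risk "r1" wins the trade-off, which is exactly the prioritization effect the risk weights are meant to produce.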

Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization

Procedia PDF Downloads 461
39134 On the System of Split Equilibrium and Fixed Point Problems in Real Hilbert Spaces

Authors: Francis O. Nwawuru, Jeremiah N. Ezeora

Abstract:

In this paper, a new algorithm for solving a system of split equilibrium and fixed point problems in real Hilbert spaces is considered. The equilibrium bifunction involves a finite family of pseudo-monotone mappings, which is an improvement over monotone operators. Moreover, the solution is required to be a common fixed point of a finite family of nonexpansive mappings. The regularization parameters do not depend on Lipschitz constants. Also, the computation of the stepsize, which plays a crucial role in the convergence analysis of the proposed method, does not require prior knowledge of the norm of the involved bounded linear map. Furthermore, to speed up the rate of convergence, an inertial term is introduced in the proposed method. Under standard assumptions on the operators and the control sequences, using a modified Halpern iteration method, we establish strong convergence, a desired result in applications. Finally, the proposed scheme is applied to solve some optimization problems. The results obtained improve numerous results announced earlier in this direction.
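The Halpern scheme underlying the convergence result has the simple form x_{k+1} = a_k·u + (1 - a_k)·T(x_k) with a_k → 0, Σa_k = ∞. A finite-dimensional sketch follows, with a contraction standing in for a general nonexpansive map in the test, and without the inertial and equilibrium terms of the actual algorithm:

```python
def halpern(T, u, x0, n_iters=2000):
    """Halpern iteration sketch: x_{k+1} = a_k*u + (1 - a_k)*T(x_k),
    a_k = 1/(k+2). For nonexpansive T this converges strongly to the
    fixed point of T nearest the anchor u (in the projection sense)."""
    x = x0
    for k in range(n_iters):
        a = 1.0 / (k + 2)
        tx = T(x)
        x = tuple(a * ui + (1 - a) * ti for ui, ti in zip(u, tx))
    return x
```

The anchor term a_k·u is what upgrades the weak convergence of plain Picard-type iterations to the strong convergence claimed in the abstract.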

Keywords: equilibrium, Hilbert spaces, fixed point, nonexpansive mapping, extragradient method, regularized equilibrium

Procedia PDF Downloads 47
39133 Analysis of Splicing Methods for High Speed Automated Fibre Placement Applications

Authors: Phillip Kearney, Constantina Lekakou, Stephen Belcher, Alessandro Sordon

Abstract:

The focus in the automotive industry is to reduce human operator and machine interaction, so that manufacturing becomes more automated and safer. The aim is to lower part cost and construction time as well as defects in the parts, which sometimes occur due to the physical limitations of human operators. A move to automate the layup of reinforcement material in composites manufacturing has resulted in the use of tapes that are placed in position by a robotic deposition head, a process described as Automated Fibre Placement (AFP). AFP is limited by the finite amount of material that can be loaded into the machine at any one time. Joining two batches of tape material together involves a splice to secure the end of the finishing tape to the starting edge of the new tape. The splicing method of choice for the majority of prepreg applications is a hand-stitch method, which, as the name suggests, requires human input. This investigation explores three methods for automated splicing: adhesive, binding and stitching. The adhesive technique uses an additional adhesive placed on the tape ends to be joined. Binding uses the binding agent already impregnated into the tape, activated through the application of heat. The stitching method is used as a baseline against which to compare the new splicing methods. As the methods will be used within a High Speed Automated Fibre Placement (HSAFP) process, the splices have to meet certain specifications: (a) the splice must endure a load of 50 N in tension applied at a rate of 1 mm/s; (b) the splice must be created in less than 6 seconds, dictated by the capacity of the tape accumulator within the system. The samples for experimentation were manufactured with controlled overlaps, alignment and splicing parameters, and were then tested in tension using a tensile testing machine.
The initial analysis explored the use of the impregnated binding agent present on the tape, as in the binding splicing technique, examining the effect of temperature and overlap on the strength of the splice. It was found that the optimum splicing temperature was at the higher end of the activation range of the binding agent, 100 °C. The optimum overlap was found to be 25 mm; there was no improvement in bond strength from 25 mm to 30 mm overlap. The final analysis compared the different splicing methods against the baseline of a stitched bond. It was found that the addition of an adhesive was the best splicing method, achieving a maximum load of over 500 N, compared with the 26 N achieved by a stitched splice and 94 N by the binding method.

Keywords: analysis, automated fibre placement, high speed, splicing

Procedia PDF Downloads 153
39132 Burnback Analysis of Star Grain Using Level-Set Technique

Authors: Ali Yasin, Ali Kamran, Muhammad Safdar

Abstract:

To reduce the hefty cost, in terms of both time and money, of solid rocket motor design and development, there is a pressing need to develop and apply advanced numerical tools to the burn-back analysis problem. Several advanced numerical schemes have been developed in recent times, but their use in the design of propellant grain for solid rocket motors is very rare. In this paper, an advanced numerical technique, the level-set method, is utilized for the burn-back analysis of a star grain to study the effect of geometrical parameters on ballistic performance indicators such as solid loading, neutrality, and sliver percentage. In the level-set technique, simple finite difference methods may fail quickly and require more sophisticated non-oscillatory schemes for feasible long-time simulation. For internal ballistic calculations, a simplified equilibrium pressure method is utilized. Preliminary results of the operating conditions, over the entire combustion time, of star grain burn-back using the level-set technique are compared with published results obtained using a CAD technique to validate the developed numerical model.
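The burn-surface propagation at the heart of a level-set burn-back calculation can be sketched with a first-order Godunov upwind scheme (one of the simple schemes that, as the abstract notes, may require non-oscillatory upgrades for long simulations). The circular port, grid size, and unit burn rate below are illustrative choices, not parameters of the star grain studied in the paper:

```python
import numpy as np

def evolve_level_set(phi, rate, dx, dt, steps):
    """First-order Godunov upwind evolution of phi_t + rate*|grad phi| = 0.
    phi < 0 inside the burned region; the front moves outward at `rate`."""
    for _ in range(steps):
        dxm = (phi - np.roll(phi, 1, axis=0)) / dx   # backward difference, x
        dxp = (np.roll(phi, -1, axis=0) - phi) / dx  # forward difference, x
        dym = (phi - np.roll(phi, 1, axis=1)) / dx
        dyp = (np.roll(phi, -1, axis=1) - phi) / dx
        # Godunov upwind gradient magnitude for a positive burn rate
        grad = np.sqrt(np.maximum(dxm, 0.0)**2 + np.minimum(dxp, 0.0)**2 +
                       np.maximum(dym, 0.0)**2 + np.minimum(dyp, 0.0)**2)
        phi = phi - dt * rate * grad
    return phi

# Illustrative circular port of radius 0.2 in a unit-square cross-section
n = 201
dx = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt((X - 0.5)**2 + (Y - 0.5)**2) - 0.2  # signed distance to port wall

rate, dt, steps = 1.0, 0.4 * dx, 100              # total web regression 0.2
phi = evolve_level_set(phi, rate, dx, dt, steps)

# Recover the burned radius from the area enclosed by the zero level set;
# for a circle it should track the analytical value 0.2 + rate*dt*steps = 0.4
area = float((phi < 0.0).sum()) * dx * dx
r_est = float(np.sqrt(area / np.pi))
print(f"recovered burn radius ~ {r_est:.3f}")
```

For a star grain the same evolution applies; only the initial signed-distance field changes, which is exactly why the level-set approach handles the merging slivers that make CAD-based burn-back tedious.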

Keywords: solid rocket motor, internal ballistic, level-set technique, star grain

Procedia PDF Downloads 121
39131 Analysis of Bed Load Sediment Transport Mataram-Babarsari Irrigation Canal

Authors: Agatha Padma Laksitaningtyas, Sumiyati Gunawan

Abstract:

The Mataram Irrigation Canal, 31.2 km in length, is the main irrigation canal in the Special Region of Yogyakarta Province, connecting the Progo River on the west side with the Opak River on the east side. It plays an important role as the main water distribution carrier for purposes such as agriculture, fishery, and plantation, and should be kept free from sediment material. Bed load is the basic sediment that drives the sedimentation process in the irrigation canal. Sedimentation can deposit material on the canal bed and change the water surface elevation, affecting the availability of water for irrigation. Two methods are used to predict the amount of bed load sediment in the irrigation canal: the Meyer-Peter and Müller method, an energy-based approach, and the Einstein method, a probabilistic approach. Flow velocity was measured using the float method and current meters. The channel geometry was measured directly in the field, and the bed sediment was sampled at three different points. The results show that the Meyer-Peter and Müller formula gives 60.75799 kg/s, whereas Einstein's method gives 13.06461 kg/s. The results may serve as a reference for dredging the sediments in the channel so as not to disrupt the flow of water in the irrigation canal.
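For readers unfamiliar with the energy-approach calculation, the classical Meyer-Peter and Müller formula can be sketched as follows. The hydraulic inputs below (depth, slope, grain size, width) are hypothetical placeholders rather than the measured Mataram-Babarsari data, so the output does not reproduce the figure reported above:

```python
import math

def mpm_bedload(depth, slope, d50, width,
                rho_w=1000.0, rho_s=2650.0, g=9.81):
    """Meyer-Peter & Muller (1948) bed-load estimate, returned as a mass
    transport rate (kg/s) over the full channel width.
    depth: flow depth (m); slope: energy slope (-);
    d50: median grain size (m); width: channel width (m)."""
    tau = rho_w * g * depth * slope               # bed shear stress (Pa)
    theta = tau / ((rho_s - rho_w) * g * d50)     # Shields parameter
    theta_cr = 0.047                              # MPM critical Shields value
    if theta <= theta_cr:
        return 0.0                                # below threshold of motion
    s = rho_s / rho_w
    qb_star = 8.0 * (theta - theta_cr) ** 1.5     # dimensionless rate
    qb_vol = qb_star * math.sqrt((s - 1.0) * g * d50 ** 3)  # m^2/s per unit width
    return qb_vol * rho_s * width                 # kg/s over the full width

# Hypothetical canal reach (illustrative values only)
rate = mpm_bedload(depth=1.2, slope=0.0005, d50=0.0008, width=8.0)
print(round(rate, 3), "kg/s")
```

Einstein's probabilistic method replaces the excess-shear term with an integral over the probability of grain entrainment, which is why the two estimates can differ by a factor of several, as in the field results above.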

Keywords: bed load, sediment, irrigation, Mataram canal

Procedia PDF Downloads 226
39130 A New Family of Globally Convergent Conjugate Gradient Methods

Authors: B. Sellami, Y. Laskri, M. Belloufi

Abstract:

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have been much studied recently. In this paper, a new family of conjugate gradient methods is proposed for unconstrained optimization. This family includes two existing practical nonlinear conjugate gradient methods, produces a descent search direction at every iteration, and converges globally provided that the line search satisfies the Wolfe conditions. Numerical experiments testing the efficiency of the new method indicate that it is promising. In addition, the methods related to this family are discussed in a unified way.
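As a concrete illustration of the two ingredients the abstract names, a descent search direction at every iteration and a line search tied to the Wolfe conditions, here is a minimal nonlinear conjugate gradient sketch using the Polak-Ribière+ update on the Rosenbrock function. It is a generic textbook member of such method families, not the specific new family proposed in the paper:

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, amax=1.0):
    """Bracketing/bisection search aiming at the weak Wolfe conditions.
    A sketch, not a production implementation."""
    a, lo, hi = amax, 0.0, np.inf
    fx = f(x)
    slope = grad(x) @ d                      # directional derivative (< 0)
    for _ in range(50):
        if f(x + a * d) > fx + c1 * a * slope:    # Armijo fails: shrink
            hi = a
        elif grad(x + a * d) @ d < c2 * slope:    # curvature fails: grow
            lo = a
        else:
            return a
        a = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * a
    return a

def cg_pr_plus(f, grad, x0, tol=1e-6, max_iter=2000):
    """Polak-Ribiere+ nonlinear CG with a descent-direction safeguard."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a = wolfe_line_search(f, grad, x, d)
        x_new = x + a * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))  # PR+ restart rule
        d = -g_new + beta * d
        if d @ g_new >= 0.0:          # safeguard: fall back to steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Rosenbrock test problem, minimum at (1, 1)
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
x_opt = cg_pr_plus(f, grad, np.array([-1.2, 1.0]))
print(np.round(x_opt, 4))
```

The PR+ truncation `max(0, ...)` and the steepest-descent fallback are exactly the kind of mechanisms that make global convergence proofs under the Wolfe conditions go through.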

Keywords: conjugate gradient method, global convergence, line search, unconstrained optimization

Procedia PDF Downloads 408
39129 Life Cycle-Based Analysis of Meat Production: Ecosystem Impacts

Authors: Michelle Zeyuan Ma, Hermann Heilmeier

Abstract:

The ecosystem impacts of meat production have recently sparked intense discussion among researchers, and reducing these impacts is difficult given the demand for meat products. This calls for better management and control of ecosystem impacts at every stage of meat production. This article analyzes the ecosystem impacts of meat production based on the life cycle of meat products. The analysis shows that considerable ecosystem impacts are caused at different steps of meat production: the initial establishment phase, animal raising, slaughterhouse processing, meat consumption, and waste management. Based on this analysis, the impacts are summarized as: a leading factor in biodiversity loss; water waste, land use waste, and land degradation; greenhouse gas emissions; pollution of air, water, and soil; and related major diseases. The article also discusses a solution, a sustainable food system, which could help reduce ecosystem impacts. Because the analysis is carried out at the life cycle level, it provides a picture of the ecosystem impacts of the whole meat industry, and the results could be useful for managing or controlling these impacts from the investor, producer, and consumer sides.

Keywords: eutrophication, life cycle based analysis, sustainable food, waste management

Procedia PDF Downloads 218
39128 Performance Analysis of Encased Sand Columns in Different Clayey Soils Using 3D Numerical Method

Authors: Enayatallah Najari, Ali Noorzad, Mehdi Siavoshnia

Abstract:

One of the most effective and low-cost options for improving soft clayey soil is the use of stone columns to reduce settlement and increase bearing capacity, an approach applied in many projects under diverse conditions. In the current study, this improvement method is evaluated in four different weak soils with diverse properties, including specific gravity, permeability coefficient, over-consolidation ratio (OCR), Poisson's ratio, internal friction angle, and bulk modulus, using the ABAQUS 3D finite element software. The effects of increasing and decreasing each of these factors on the settlement and lateral displacement of weak soil beds are analyzed. In the analyzed models, the properties of the sand columns and the geosynthetic cover are assumed to be constant at their optimum values, and only the soft clayey soil parameters are varied. It is also demonstrated that the OCR value can play a determinant role in soil resistance.

Keywords: stone columns, geosynthetic, finite element, 3D analysis, soft soils

Procedia PDF Downloads 360
39127 A Comparison Study of Different Methods Used in the Detection of Giardia lamblia on Fecal Specimen of Children

Authors: Muhammad Farooq Baig

Abstract:

Objective: The purpose of this study was to compare results obtained using a single fecal specimen for O&P examination, direct immunofluorescence assay (DFA), and two conventional staining methods. Design: One hundred and fifty fecal specimens from children were collected and examined by each method. The O&P examination and the DFA were used as the reference methods. Setting: The study was performed at the laboratory of the Basic Medical Science Institute, JPMC Karachi. Patients or Other Participants: The fecal specimens were collected from children with a suspected Giardia lamblia infection. Main Outcome Measures: 1) The amount of agreement and disagreement between methods. 2) The presence of giardiasis in our population. 3) The sensitivity and specificity of each method. Results: There were 45 (30%) positive and 105 (70%) negative results on DFA, 41 (27.4%) positive and 109 (72.6%) negative with the iodine method, and 34 (22.6%) positive and 116 (77.4%) negative with the saline method. The sensitivity and specificity of DFA in comparison to the iodine method were 92.2% and 92.7%, respectively. The sensitivity and specificity of DFA in comparison to the saline method were 91.2% and 87.9%, respectively. The sensitivities of the iodine and saline methods in comparison to DFA were 82.2% and 68.8%, respectively. There is a marked difference in sensitivity between DFA and the conventional methods. Conclusion: The study supported the findings of other investigators who concluded that the DFA method has greater sensitivity. The immunologic methods were more efficient and quicker than the conventional O&P method.
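The sensitivity and specificity figures above come from 2×2 cross-tabulations of each method against a reference. A minimal sketch of that computation follows; the cell counts are hypothetical, since the abstract reports only the marginal totals, not the full contingency table:

```python
def diagnostic_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity and overall agreement of a test method
    against a reference method, from a 2x2 contingency table."""
    sensitivity = tp / (tp + fn)          # positives detected among reference-positives
    specificity = tn / (tn + fp)          # negatives confirmed among reference-negatives
    agreement = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, agreement

# Hypothetical table for 150 specimens, DFA vs. iodine as reference
# (illustrative only; chosen to be consistent with the marginal totals)
sens, spec, agree = diagnostic_agreement(tp=38, fp=7, fn=3, tn=102)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  agreement={agree:.1%}")
```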

Keywords: direct immunofluorescence assay (DFA), ova and parasite (O&P), Giardia lamblia, children, medical science

Procedia PDF Downloads 418
39126 The Effectiveness of Video Clips to Enhance Students’ Achievement and Motivation on History Learning and Facilitation

Authors: L. Bih Ni, D. Norizah Ag Kiflee, T. Choon Keong, R. Talip, S. Singh Bikar Singh, M. Noor Mad Japuni, R. Talin

Abstract:

The purpose of this study is to determine the effectiveness of video clips in enhancing students' achievement and motivation in the learning and facilitation of history. A narrative literature review is used to illustrate the current state of the art in the focused areas of inquiry, and an experimental method is employed. The experimental method is a systematic scientific research method in which the researchers manipulate one or more variables and control and measure any changes in other variables. For this purpose, two groups were formed, an experimental group and a control group, each consisting of 30 lower secondary students. The first group was taught using a computer presentation program incorporating video clips and is considered the experimental group, while the second group, from an equivalent class, was taught using traditional dialogue and discussion techniques and is considered the control group. Both groups took a pre-test and a post-test on the material covered in class. The findings show that the pre-test analysis did not reveal statistically significant differences, which confirms the equivalence of the two groups. Meanwhile, the post-test analysis shows a statistically significant difference between the experimental group and the control group at a significance level of 0.05, in favor of the experimental group.

Keywords: video clips, learning and facilitation, achievement, motivation

Procedia PDF Downloads 151
39125 Determination of Verapamil Hydrochloride in Tablets and Injection Solutions With the Verapamil-Selective Electrode and Possibilities of Application in Pharmaceutical Analysis

Authors: Faisal A. Salih

Abstract:

Verapamil hydrochloride (Ver) is a calcium channel blocker used in medicine for arrhythmia, angina, and hypertension. For the quantitative determination of Ver in dosage forms, the HPLC method is most often used. A convenient alternative to the chromatographic method is potentiometry using a Ver-selective electrode, which does not require expensive equipment, can be applied without separation from the matrix components (significantly reducing the analysis time), and does not use toxic organic solvents, making it a 'green', environmentally friendly technique. This study established that a rational choice of the membrane plasticizer and of the preconditioning and measurement algorithms, which prevent non-exchangeable extraction of Ver into the membrane phase, makes it possible to achieve excellent analytical characteristics of Ver-selective electrodes based on commercially available components. In particular, an electrode with the membrane composition PVC (32.8 wt %), ortho-nitrophenyl octyl ether (66.6 wt %), and tetrakis(4-chlorophenyl)borate (0.6 wt %, or 0.01 M) has a lower detection limit of 4 × 10−8 M and a potential reproducibility of 0.15–0.22 mV. Both direct potentiometry (DP) and potentiometric titration (PT) can be used for the determination of Ver in tablets and injection solutions. The masses of Ver per average tablet weight determined by DP and PT for the same set of 10 tablets were (80.4 ± 0.2 and 80.7 ± 0.2) mg, respectively. The masses of Ver in solutions for injection, determined by DP for two ampoules from one batch, were (5.00 ± 0.015 and 5.004 ± 0.006) mg. In all cases, good reproducibility and excellent agreement with the declared quantities were observed.
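Direct potentiometry reduces to fitting a calibration line E = E0 + S·log10(C) from standard solutions and inverting it for the sample. A minimal sketch with synthetic EMF readings; the near-Nernstian 59 mV/decade slope for a monovalent cation is our assumption for illustration, not a value from the paper:

```python
import math

def fit_calibration(concs, emfs):
    """Least-squares fit of E = E0 + S*log10(C) from standard solutions.
    Returns (E0, S) with S in mV per decade."""
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(emfs) / n
    S = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, emfs))
         / sum((x - xbar) ** 2 for x in xs))
    E0 = ybar - S * xbar
    return E0, S

def conc_from_emf(E, E0, S):
    """Invert the calibration line for an unknown sample."""
    return 10.0 ** ((E - E0) / S)

# Synthetic calibration data (illustrative, exactly 59 mV/decade)
standards = [1e-6, 1e-5, 1e-4, 1e-3]      # mol/L
emfs = [150.0, 209.0, 268.0, 327.0]       # mV
E0, S = fit_calibration(standards, emfs)
c = conc_from_emf(239.0, E0, S)           # hypothetical sample reading, mV
print(f"slope = {S:.1f} mV/decade, sample c = {c:.2e} M")
```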

Keywords: verapamil, potentiometry, ion-selective electrode, pharmaceutical analysis

Procedia PDF Downloads 86
39124 The Effects of Time and Cyclic Loading to the Axial Capacity for Offshore Pile in Shallow Gas

Authors: Christian H. Girsang, M. Razi B. Mansoor, Noorizal N. Huang

Abstract:

An offshore platform was installed in 1977 about 260 km offshore West Malaysia at a water depth of 73.6 m. Twelve (12) piles were installed, of which four (4) are skirt piles. The piles have an outside diameter of 1.219 m and a wall thickness of 31 mm and were driven to 109 m below the seabed. Deterministic analyses of the pile capacity under axial loading were conducted using the current API (American Petroleum Institute) method and four (4) CPT-based methods: the ICP (Imperial College Pile) method, the NGI (Norwegian Geotechnical Institute) method, the UWA (University of Western Australia) method, and the Fugro method. A statistical analysis of the model uncertainty associated with each pile capacity method was performed. Two (2) cases were analysed: Pile 1, the pile most affected by shallow gas problems, and the piles other than Pile 1. Using the mean estimate of soil properties, the five (5) methods used for deterministic estimation of axial pile capacity in compression predict an axial capacity from 28 to 42 MN for Pile 1 and 32 to 49 MN for the piles other than Pile 1. These values refer to the static capacity shortly after pile installation; they do not include the effects of cyclic loading during the design storm or of time after installation on the axial pile capacity. On average, the axial pile capacity is expected to have increased by about 40% because of ageing since the installation of the platform in 1977. On the other hand, the cyclic loading effects during the design storm may reduce the axial capacity of the piles by around 25%. The study concluded that all piles have a sufficient safety factor when pile ageing and cyclic loading effects are considered, as all safety factors are above 2.0 for maximum operating and storm loads.
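The capacity adjustment described above amounts to scaling the short-term static capacity by the ageing and cyclic factors before forming the safety factor. A sketch, where the 14 MN design load is a hypothetical placeholder (the actual loads are not given in the abstract):

```python
def adjusted_capacity(static_capacity, ageing_gain=0.40, cyclic_loss=0.25):
    """Axial pile capacity (MN) adjusted for ageing since installation
    and cyclic degradation during the design storm, using the rough
    percentage factors quoted in the abstract."""
    return static_capacity * (1.0 + ageing_gain) * (1.0 - cyclic_loss)

def safety_factor(capacity, load):
    return capacity / load

DESIGN_LOAD = 14.0  # MN -- hypothetical, for illustration only
for q_static in (28.0, 42.0):  # deterministic capacity range for Pile 1
    q_adj = adjusted_capacity(q_static)
    print(f"static {q_static:.0f} MN -> adjusted {q_adj:.1f} MN, "
          f"SF = {safety_factor(q_adj, DESIGN_LOAD):.2f}")
```

Note that the two effects act in opposite directions: a 40% gain followed by a 25% loss nets out to a modest 5% increase over the as-installed static capacity.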

Keywords: axial capacity, cyclic loading, pile ageing, shallow gas

Procedia PDF Downloads 342
39123 The Development of Liquid Chromatography Tandem Mass Spectrometry Method for Citrinin Determination in Dry-Fermented Meat Products

Authors: Ana Vulic, Tina Lesic, Nina Kudumija, Maja Kis, Manuela Zadravec, Nada Vahcic, Tomaz Polak, Jelka Pleadin

Abstract:

Mycotoxins are toxic secondary metabolites produced by numerous types of molds. They can contaminate both food and feed, so they represent a serious public health concern. Production of dry-fermented meat products involves ripening, during which molds can overgrow the product surface, produce mycotoxins, and consequently contaminate the final product. Citrinin is a mycotoxin produced mainly by Penicillium citrinum. Data on citrinin occurrence in both food and feed are limited; therefore, there is a need for research on its occurrence in these types of meat products. An LC-MS/MS method for citrinin determination was developed and validated. Sample preparation was performed using immunoaffinity columns, which resulted in clean sample extracts. Method validation included determination of the limit of detection (LOD), the limit of quantification (LOQ), recovery, linearity, and matrix effect in accordance with the latest validation guidance. The determined LOD and LOQ were 0.60 µg/kg and 1.98 µg/kg, respectively, showing good method sensitivity. The method was tested for linearity in the calibration range of 1 µg/L to 10 µg/L. The recovery was 100.9%, while the matrix effect was 0.7%. The method was employed in the analysis of 47 samples of dry-fermented sausages collected from local households. Citrinin was not detected in any of these samples, probably because of the short ripening period of the tested sausages, which lasts at most three months. The developed method shall be used to test other types of traditional dry-cured products, such as prosciuttos, whose surfaces are usually more heavily overgrown by molds due to the longer ripening period.
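The recovery and matrix-effect figures are simple ratio computations. A sketch of the usual definitions follows, with illustrative inputs chosen only to land near the reported values; the actual spiking levels and calibration slopes are not given in the abstract:

```python
def recovery_pct(measured, spiked):
    """Apparent recovery: measured concentration in a spiked sample
    relative to the nominal spiked concentration."""
    return 100.0 * measured / spiked

def matrix_effect_pct(slope_matrix, slope_solvent):
    """Signal suppression/enhancement: relative deviation of the
    matrix-matched calibration slope from the pure-solvent slope."""
    return 100.0 * (slope_matrix - slope_solvent) / slope_solvent

# Illustrative inputs (hypothetical spiking level and slopes)
rec = recovery_pct(measured=5.045, spiked=5.0)
me = matrix_effect_pct(slope_matrix=1.007, slope_solvent=1.0)
print(f"recovery = {rec:.1f} %, matrix effect = {me:.1f} %")
```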

Keywords: citrinin, dry-fermented meat products, LC-MS/MS, mycotoxins

Procedia PDF Downloads 118
39122 Prompt Design for Code Generation in Data Analysis Using Large Language Models

Authors: Lu Song Ma Li Zhi

Abstract:

With the rapid advancement of artificial intelligence technology, large language models (LLMs) have become a milestone in the field of natural language processing, demonstrating remarkable capabilities in semantic understanding, intelligent question answering, and text generation. These models are gradually penetrating various industries, particularly showcasing significant application potential in the data analysis domain. However, retraining or fine-tuning these models requires substantial computational resources and ample downstream task datasets, which poses a significant challenge for many enterprises and research institutions. Without modifying the internal parameters of the large models, prompt engineering techniques can rapidly adapt these models to new domains. This paper proposes a prompt design strategy aimed at leveraging the capabilities of large language models to automate the generation of data analysis code. By carefully designing prompts, data analysis requirements can be described in natural language, which the large language model can then understand and convert into executable data analysis code, thereby greatly enhancing the efficiency and convenience of data analysis. This strategy not only lowers the threshold for using large models but also significantly improves the accuracy and efficiency of data analysis. Our approach includes requirements for the precision of natural language descriptions, coverage of diverse data analysis needs, and mechanisms for immediate feedback and adjustment. Experimental results show that with this prompt design strategy, large language models perform exceptionally well in multiple data analysis tasks, generating high-quality code and significantly shortening the data analysis cycle. This method provides an efficient and convenient tool for the data analysis field and demonstrates the enormous potential of large language models in practical applications.
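One possible shape for such a code-generation prompt, with an explicit role, data schema, constraints, and output contract, is sketched below. The helper function and its wording are our own illustration of the general strategy, not the exact templates evaluated in the paper:

```python
def build_analysis_prompt(request, columns, constraints=()):
    """Assemble a natural-language prompt asking an LLM to generate
    data-analysis code for a pandas DataFrame named `df`."""
    lines = [
        "You are a data analyst. Write Python (pandas) code only.",
        f"DataFrame `df` has columns: {', '.join(columns)}.",
        f"Task: {request}",
    ]
    for c in constraints:                      # precision requirements
        lines.append(f"Constraint: {c}")
    # Output contract: eases automatic extraction and execution of the code
    lines.append("Return a single runnable code block, no explanations.")
    return "\n".join(lines)

prompt = build_analysis_prompt(
    "Compute monthly average sales per region and plot a bar chart.",
    columns=["date", "region", "sales"],
    constraints=["Parse `date` as datetime", "Sort regions alphabetically"],
)
print(prompt)
```

The generated code can then be executed, and any error message fed back into a follow-up prompt, which is one way to realize the immediate feedback-and-adjustment mechanism the abstract describes.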

Keywords: large language models, prompt design, data analysis, code generation

Procedia PDF Downloads 37
39121 A Study on the Optimum Shoulder Width in the Tunnel Considering Driving Safety

Authors: Somyoung Shin, Donghun Jeong, Yeoil Yun

Abstract:

South Korea has continuously constructed tunnels in consideration of safety and operational efficiency, and the number of installed tunnels has doubled over the past ten years. The tunnel cross-section is designed based on the guidelines, but drivers perceive the tunnel entrance as narrow due to dark adaptation and a feeling of pressure. In fact, around 13% of expressway traffic congestion in Japan occurs at tunnel entrances, leading to congestion and rear-end collisions. Therefore, this study aims to analyze the driving stability gained from widening the shoulder at the tunnel entrance, using a virtual reality driving simulator, in order to reduce the accidents that happen there. To compare driving stability for different right-shoulder widths under the same conditions, an experiment was conducted on 30 subjects in their 20s to 60s using a virtual reality driving simulator, and, to provide a more realistic virtual driving environment, the experiment map was designed based on actual roads. The right-shoulder widths were set to 2.5 m and 3.0 m based on the design guidelines for expressways and the road structure installation regulations, and the order of the experimental conditions was randomized. The average speed was 100.73 km/h with the 2.5 m shoulder and 101.69 km/h with the 3.0 m shoulder; a t-test gave a p-value above 0.05 at the 95% confidence level, so the difference was not statistically significant. Analyzing the speed deviation between the average driving speed over the analyzed interval and the average driving speed upon entering the tunnel gave 3.06 km/h with the 2.5 m shoulder and 1.87 km/h with the 3.0 m shoulder; here the t-test gave a p-value below 0.05 at the 95% confidence level, so the difference was statistically significant. This means that a 3.0 m shoulder offers better driving stability than a 2.5 m shoulder. Therefore, when new roads are constructed in Korea, it is recommended that the right shoulder width be set to 3.0 m to enhance driving stability.
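The speed-deviation comparison above is a two-sample t-test. A minimal sketch of Welch's t statistic (which does not assume equal variances) follows; the ten-driver samples are synthetic, constructed only so that their means match the reported 3.06 and 1.87 km/h, since the real 30-subject data are not reproduced in the abstract:

```python
import math

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic and approximate degrees of freedom
    (Welch-Satterthwaite), for unequal-variance group comparisons."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = sum(sample_a) / na, sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    se2 = va / na + vb / nb
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Synthetic speed-deviation samples (km/h), means 3.06 and 1.87
a = [3.1, 2.8, 3.4, 2.9, 3.2, 3.0, 2.7, 3.3, 3.1, 3.1]  # 2.5 m shoulder
b = [1.9, 1.7, 2.0, 1.8, 1.9, 1.8, 2.0, 1.9, 1.8, 1.9]  # 3.0 m shoulder
t, df = welch_t(a, b)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A |t| this large against a t-distribution with the computed degrees of freedom corresponds to p well below 0.05, matching the significant result reported for the speed-deviation measure.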

Keywords: driving stability, shoulder width, tunnel, virtual reality driving simulator

Procedia PDF Downloads 196
39120 Concept Analysis of Professionalism in Teachers and Faculty Members

Authors: Taiebe Shokri, Shahram Yazdani, Leila Afshar, Soleiman Ahmadi

Abstract:

Introduction: The importance of professionalism in higher education lies in the fact that it not only distinguishes appropriate from inappropriate behaviors and guides faculty members in carrying out their professional responsibilities, but also guarantees faculty members' adherence to professional principles and values, ensures the quality of teaching, facilitates the teaching-learning process in universities, and increases commitment to meeting the needs of students as well as to the development of a culture based on ethics. Therefore, considering the important role of medical education teachers in preparing future teachers and students, and the need to define the concept of the professional teacher and the characteristics of teacher professionalism, we have explicated the concept of professionalism in teachers in this study. Methods: The concept analysis method used in this study was that of Walker and Avant, which has eight steps. Walker and Avant state the purpose of concept analysis as the process of distinguishing between the defining features of a concept and its unrelated features. The process of concept analysis includes selecting a concept, determining the purpose of the analysis, identifying the uses of the concept, determining the defining attributes of the concept, identifying a model case, identifying borderline and contrary cases, identifying the antecedents and consequences of the concept, and defining empirical referents. Results: Professionalism, in its general sense, requires deep knowledge and insight, creating a healthy and safe environment, honesty and trust, impartiality, commitment to the profession and to continuous improvement, punctuality, openness to criticism, professional competence, responsibility, and individual accountability, especially in social interactions. Acquiring these characteristics is not easy and requires education, especially continuous learning. Professionalism is a set of values, behaviors, and relationships that underpins public trust in teachers.

Keywords: concept analysis, medical education, professionalism, faculty members

Procedia PDF Downloads 153
39119 An Introduction to Giulia Annalinda Neglia Viewpoint on Morphology of the Islamic City Using Written Content Analysis Approach

Authors: Mohammad Saber Eslamlou

Abstract:

The morphology of Islamic cities has been extensively studied by researchers of Islamic cities, and different theories can be found about it. In this regard, there are considerable differences in the methods of analysis, classification, recognition, and comparison used in urban morphology. The present paper aims to examine these previous methods, approaches, and insights, and how Dr. Giulia Annalinda Neglia dealt with the analysis of the morphology of Islamic cities. Neglia is an assistant professor at the University of Bari, Italy (UNIBA), who has published numerous papers and books on Islamic cities. I introduce her works in the field of the morphology of Islamic cities, and then her thoughts, insights, and research methodologies are presented and analyzed from a critical perspective. This is a qualitative study of her written works, which have been classified into three major categories. The first category consists mainly of her works on the morphology and physical shape of Islamic cities. The review of her works suggests that she has used Muratorian typology in investigating the morphology of Islamic cities. Moreover, the overall structure of the cities under investigation is often described as linear; however, she is against defining a single framework for the recognition of morphology in Islamic cities. She states that 'to understand the physical complexity and irregularities in Islamic cities, it is necessary to study the urban fabric by typology method, focusing on transformation processes of the buildings' form and their surrounding open spaces', and she believes that the fabric of each region in the city follows the principles of a specific period or urban pattern, in particular Hellenistic and Roman structures. Furthermore, she believes that it is impossible to understand the morphology of a city without taking into account the obvious and hidden developments associated with it, because the form of buildings and their surrounding open spaces are the written history of the city.

Keywords: city, Islamic city, Giulia Annalinda Neglia, morphology

Procedia PDF Downloads 95