Search results for: frequency analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29584

25114 A Sports-Specific Physiotherapy Center Treats Sports Injuries

Authors: Andrew Anis Fakhrey Mosaad

Abstract:

Introduction: Sports- and physical activity-related injuries may be more likely when there is a genetic predisposition, improper coaching and/or training, and no follow-up care from sports medicine. Goal: To evaluate the frequency of injuries among athletes receiving care at a sports-focused physical therapy clinic. Methods: Data were obtained from a survey of injuries recorded in athletes' treatment records over eight years of clinic activity. The data collected included the patient's characteristics, the sport, the type of injury, the injury's characteristics, and the body part injured. Results: The athletes were drawn from 1090 patient/athlete records, had an average age of 25, participated in 44 different sports, and 75% were men. Joint injuries were the most frequent type of injury, followed by damage to muscles and bones. The most prevalent injury presentation was chronic (47%), while the knee, ankle, and shoulder were the most frequently damaged body parts. Among all sports, the most injured athletes were seen in soccer, futsal, and track and field, respectively. Conclusion: The most popular sport among injured players was soccer, the most common injury type was joint damage, and the knee was the most often damaged body area. The majority of the injuries were chronic.

Keywords: sports injuries, athletes, joint injuries, injured players

Procedia PDF Downloads 54
25113 Measurements of Environmental Pollution in Chemical Fertilizer Industrial Area Using Magnetic Susceptibility Method

Authors: Ramadhani Yasyfi Cysela, Adinda Syifa Azhari, Eleonora Agustine

Abstract:

The World Health Organization (WHO) estimates that about a quarter of the diseases facing mankind today occur due to environmental pollution. Soil is a part of the environment with widespread contamination problems. Contaminated soil should no longer be used to grow food because the chemicals can leach into the food and harm people who eat it. The chemical fertilizer industry makes a specific contribution to soil pollution. To trace ammonia and urea emissions from the fertilizer industry, a physical characteristic of the soil, magnetic susceptibility, can be used. Rock magnetism serves as a proxy indicator of changes in physical properties. Magnetic susceptibilities of the samples were measured at low and high frequency with a Bartington MS2B magnetic susceptibility meter. Samples were taken both from an area located close to the pollution source and from an area far from it. The susceptibility values of the polluted topsoil samples were quite low, ranging from 187.1 to 494.8 ×10⁻⁸ m³ kg⁻¹, whereas the samples from the pollution-free area had high values (1188.7 to 2237.8 ×10⁻⁸ m³ kg⁻¹). This study shows that susceptibility values in the fertilizer industrial area are lower than in the pollution-free area.
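
The abstract reports dual-frequency (low/high) MS2B readings; a common way to exploit such pairs in environmental magnetism is the frequency-dependent susceptibility ratio χfd% = 100·(χlf − χhf)/χlf. The sketch below illustrates that calculation; the individual readings are hypothetical placeholders, not the study's measurements, and the ratio itself is a standard derived quantity rather than something the abstract explicitly computes.

```python
# Illustrative sketch: mass-specific susceptibility comparison and the standard
# frequency-dependent susceptibility ratio chi_fd (%).
# The sample readings below are hypothetical placeholders, not the paper's data.

def chi_fd_percent(chi_lf, chi_hf):
    """Frequency-dependent susceptibility in percent: 100*(chi_lf - chi_hf)/chi_lf."""
    return 100.0 * (chi_lf - chi_hf) / chi_lf

# readings in 1e-8 m^3/kg, (low-frequency, high-frequency) pairs
polluted_site = [(187.1, 180.3), (320.5, 310.9), (494.8, 480.2)]   # hypothetical
reference_site = [(1188.7, 1120.4), (2237.8, 2101.6)]              # hypothetical

for label, site in [("polluted", polluted_site), ("reference", reference_site)]:
    mean_lf = sum(lf for lf, _ in site) / len(site)
    mean_fd = sum(chi_fd_percent(lf, hf) for lf, hf in site) / len(site)
    print(f"{label:9s} mean chi_lf = {mean_lf:7.1f} x1e-8 m3/kg, mean chi_fd = {mean_fd:4.1f} %")
```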

Keywords: environmental, magnetic susceptibility, rock magnetism, soil pollution

Procedia PDF Downloads 335
25112 Detecting Circles in an Image Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrical objects in an image; here, the object is taken to be circular. Identification requires finding three characteristics: the number, size, and location of the objects. To achieve this goal, the paper presents an algorithm that combines statistical approaches with image analysis techniques. The algorithm was implemented to meet the major objectives of the paper, evaluated using simulated data with good results, and then applied to real data.
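
The abstract does not spell out the algorithm, but its keywords (median filter, threshold, segmentation) suggest a pipeline along the following lines. The sketch below is an illustrative reconstruction under those assumptions, not the authors' implementation; it estimates the number, location, and size of bright circular objects in a simulated image.

```python
import numpy as np
from scipy import ndimage

def detect_circles(image, threshold=0.5, median_size=3):
    """Return (count, [(row, col, radius), ...]) for bright circular blobs."""
    smoothed = ndimage.median_filter(image, size=median_size)   # suppress noise
    binary = smoothed > threshold                               # global threshold
    labels, count = ndimage.label(binary)                       # segmentation
    objects = []
    for idx in range(1, count + 1):
        mask = labels == idx
        area = mask.sum()
        r, c = ndimage.center_of_mass(mask)                     # location
        radius = np.sqrt(area / np.pi)                          # size estimated from area
        objects.append((r, c, radius))
    return count, objects

# simulated test image: one disc of radius 10 centered at (32, 32) plus noise
yy, xx = np.mgrid[:64, :64]
img = ((yy - 32) ** 2 + (xx - 32) ** 2 <= 10 ** 2).astype(float)
img += 0.1 * np.random.default_rng(0).standard_normal(img.shape)
print(detect_circles(img))
```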

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 415
25111 Solving the Dimensionality Problem and Finding Statistical Constructs in Latent Regression Models: A Novel Methodology with a Real Data Application

Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro

Abstract:

This paper presents a novel statistical methodology for measuring and finding constructs in latent regression analysis. The approach uses the qualities of factor analysis for binary data with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory and drawing together many ideas from IRT, we propose an algorithm that not only addresses the dimensionality problem (still an open discussion) but also opens a new line of research that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. Finally, the methodology is applied to a real data set, showing impressive results in coherence, speed, and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs'; both authors belong to the SICS Research Group at Universidad Nacional de Colombia.
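
The methodology itself is not detailed in the abstract. As a loosely related illustration only, the sketch below shows one common first screen for dimensionality in binary item-response data: inspecting the eigenvalues of the inter-item correlation matrix for a single dominant factor. The simulated responses and the Rasch-type generating model are assumptions made for the example, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical binary response matrix: 500 examinees x 20 items driven by one latent trait
theta = rng.standard_normal(500)                      # latent ability
difficulty = np.linspace(-2, 2, 20)                   # item difficulties
prob = 1.0 / (1.0 + np.exp(-(theta[:, None] - difficulty[None, :])))   # Rasch-type model
responses = (rng.random((500, 20)) < prob).astype(int)

# Eigenvalues of the inter-item (phi) correlation matrix as a rough dimensionality screen:
# one eigenvalue well above the rest is consistent with a single dominant dimension.
eigvals = np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False))[::-1]
print(np.round(eigvals[:5], 2))
```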

Keywords: item response theory, dimensionality, submodel theory, factorial analysis

Procedia PDF Downloads 352
25110 Rapid and Sensitive Detection: Biosensors as Innovative Analytical Tools

Authors: Sylwia Baluta, Joanna Cabaj, Karol Malecha

Abstract:

The evolution of biosensors has been driven by the need for faster and more versatile analytical methods, with minimal sample pretreatment, in important areas including clinical diagnostics, food analysis, and environmental monitoring. Rapid and sensitive detection of neurotransmitters is extremely important in modern medicine. These compounds occur mainly in the brain and central nervous system of mammals, and any change in their concentration may lead to diseases such as Parkinson's disease or schizophrenia. Classical techniques of chemical analysis, despite many advantages, do not provide immediate results or allow automation of measurements.

Keywords: adrenaline, biosensor, dopamine, laccase, tyrosinase

Procedia PDF Downloads 128
25109 Thermal Analysis and Optimization of a High-Speed Permanent Magnet Synchronous Motor with Toroidal Windings

Authors: Yuan Wan, Shumei Cui, Shaopeng Wu

Abstract:

Toroidal windings were used to reduce the axial length of the motor, so as to suit applications with severe restrictions on axial length. However, slotting the outer edge of the stator decreases the heat-dissipation capacity of the housing water cooling. In addition, the windings in the outer slots increase the copper loss, which further complicates heat dissipation from the motor. At present, carbon-fiber composite retaining sleeves are increasingly mounted over the magnets to ensure rotor strength at high speeds. Because of the poor thermal conductivity of the carbon-fiber sleeve, cooling of the rotor becomes very difficult, which may result in irreversible demagnetization of the magnets at excessively high temperature. It is therefore necessary to analyze the temperature rise of such a motor. This paper builds a computational fluid dynamics (CFD) model of a toroidal-winding high-speed permanent magnet synchronous motor (PMSM) with water cooling of the housing and forced air cooling of the rotor. Thermal analysis was carried out with the model, the factors that affect the temperature rise were investigated, and a thermal optimization of the prototype was achieved. Finally, a small-size prototype was manufactured and the thermal analysis results were verified.
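
As a rough complement to the CFD workflow described above, the back-of-the-envelope check below estimates the mean temperature rise across the water jacket from a single convective thermal resistance, ΔT = P_loss·R_th with R_th = 1/(hA). It is not the paper's CFD model, and every numerical value is a hypothetical placeholder.

```python
# Back-of-the-envelope steady-state check (not the CFD model): the average
# temperature rise across the water jacket is roughly P_loss * R_th with
# R_th = 1/(h*A). All numbers below are hypothetical placeholders.

h = 3000.0      # W/(m^2*K), assumed convection coefficient of the water jacket
A = 0.05        # m^2, assumed wetted housing area
P_loss = 800.0  # W, assumed total loss transferred through the housing

R_th = 1.0 / (h * A)      # K/W, convective thermal resistance
delta_T = P_loss * R_th   # K, mean housing-to-coolant temperature rise
print(f"R_th = {R_th:.4f} K/W, delta_T = {delta_T:.1f} K")
```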

Keywords: thermal analysis, temperature rise, toroidal windings, high-speed PMSM, CFD

Procedia PDF Downloads 479
25108 Reliability Analysis of Geometric Performance of Onboard Satellite Sensors: A Study on Location Accuracy

Authors: Ch. Sridevi, A. Chalapathi Rao, P. Srinivasulu

Abstract:

The location accuracy of data products is a critical parameter in assessing the geometric performance of satellite sensors. This study focuses on reliability analysis of onboard sensors to evaluate their location accuracy performance over time. The analysis utilizes field failure data and employs the Weibull distribution to determine the reliability and, in turn, to understand improvements or degradations over a period of time. The analysis begins by scrutinizing the location accuracy error, which is the root mean square (RMS) error of the differences between ground control point coordinates observed on the product and on the map, and identifying the failure data with reference to time. A significant challenge in this study is to thoroughly analyze the possibility of an infant mortality phase in the data. To address this, the Weibull distribution is utilized to determine whether the data exhibit an infant stage or have transitioned into the operational phase; the shape parameter beta plays a crucial role in identifying this stage. Determining the exact start of the operational phase and the end of the infant stage poses another challenge, as it is crucial to eliminate residual infant mortality or wear-out from the model, which can significantly increase the total failure rate. To address this, the well-established statistical Laplace test is applied to infer the behavior of the sensors, to accurately ascertain the duration of the different phases in the lifetime, and to estimate the time required for stabilization. This approach also helps in understanding whether the bathtub curve model, which accounts for the different phases in the lifetime of a product, is appropriate for the data, and whether the thresholds for the infant period and the wear-out phase are accurately estimated, by validating the data in the individual phases with Weibull distribution curve-fitting analysis. Once the operational phase is determined, reliability is assessed using Weibull analysis. This analysis not only provides insights into the reliability of individual sensors with regard to location accuracy over the required period of time, but also establishes a model that can be applied to automate similar analyses for various sensors and parameters using field failure data. Furthermore, the identification of the best-performing sensor through this analysis serves as a benchmark for future missions and designs, ensuring continuous improvement in sensor performance and reliability. Overall, this study provides a methodology to accurately determine the duration of the different phases in the life data of individual sensors. It enables an assessment of the time required for stabilization and provides insights into the reliability during the operational phase and the commencement of the wear-out phase. By employing this methodology, designers can make informed decisions regarding sensor location accuracy performance, contributing to enhanced accuracy in satellite-based applications.
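
A minimal sketch of the two statistical building blocks named above, the Laplace trend test and a two-parameter Weibull fit, is given below. The event times are hypothetical, and the time-truncated form of the Laplace statistic is assumed; the paper's actual failure data and thresholds are not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical failure (threshold-exceedance) times in days since launch.
event_times = np.array([40., 95., 160., 240., 330., 430., 545., 670., 810., 960.])
T = 1000.0  # observation window, days

# Laplace trend test (time-truncated form): U ~ N(0,1) under a constant failure rate.
# U < 0 suggests improving reliability (infant mortality fading), U > 0 suggests wear-out.
n = len(event_times)
U = np.sqrt(12 * n) * (event_times.mean() / T - 0.5)
print(f"Laplace statistic U = {U:.2f}")

# Once a stable operational phase is identified, fit a two-parameter Weibull to the
# times between failures of that phase; shape beta < 1, = 1, > 1 maps to the bathtub phases.
beta, _, eta = stats.weibull_min.fit(np.diff(np.r_[0.0, event_times]), floc=0)
print(f"Weibull shape beta = {beta:.2f}, scale eta = {eta:.1f} days")
```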

Keywords: bathtub curve, geometric performance, Laplace test, location accuracy, reliability analysis, Weibull analysis

Procedia PDF Downloads 58
25107 Re-Analyzing Energy-Conscious Design

Authors: Svetlana Pushkar, Oleg Verbitsky

Abstract:

An energy-conscious design for a classroom in a hot-humid climate is reanalyzed. The hypothesis of this study is that using photovoltaic (PV) electricity generation to cover building operational energy consumption will lead to a re-analysis of the energy-conscious design. The objective of this study is therefore to reanalyze the energy-conscious design by evaluating the environmental impact of operational energy with PV electricity generation. Using the hierarchical design structure of Eco-indicator 99, the alternatives for the energy-conscious variables are statistically evaluated by applying a two-stage nested (hierarchical) ANOVA. The recommended solutions for the glazing type, wall insulation, roof insulation, window size, roof mass, and window shading design alternatives changed (for example, the glazing recommendation changed from low-emissivity, green, and double-glazed windows to low-emissivity glazing only), whereas the recommendations for the lighting control system and infiltration did not change. Such an analysis of operational energy can be described as environment-conscious analysis.
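
A two-stage nested (hierarchical) ANOVA of the kind described can be expressed with a nested formula term, as in the hedged sketch below. The layout (glazing alternatives with simulation runs nested inside them) and all impact values are invented for illustration and do not reproduce the study's Eco-indicator 99 scores.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical two-stage nested layout: glazing alternative (stage 1) with simulation
# runs nested inside each alternative (stage 2); 'impact' stands in for the
# Eco-indicator 99 single score. All values are placeholders.
df = pd.DataFrame({
    "alternative": ["lowE"] * 6 + ["green"] * 6 + ["double"] * 6,
    "run":         [1, 1, 2, 2, 3, 3] * 3,
    "impact":      [10.2, 10.4, 10.1, 10.3, 10.5, 10.2,
                    11.0, 11.2, 10.9, 11.1, 11.3, 11.0,
                    11.4, 11.6, 11.5, 11.7, 11.3, 11.5],
})

# run is nested within alternative, written as C(run):C(alternative); the table gives
# the sums of squares for both stages (note: the stage-1 F-test should use the nested
# mean square, not the residual, as its denominator).
model = ols("impact ~ C(alternative) + C(run):C(alternative)", data=df).fit()
print(sm.stats.anova_lm(model, typ=1))
```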

Keywords: ANOVA, Eco-Indicator 99, energy-conscious design, hot–humid climate, photovoltaic

Procedia PDF Downloads 170
25106 Urban Regeneration of Historic Paths: A Case Study of Kom El Dekka Historic Path

Authors: Ahmed R. Ismail, Hatem A. El Tawil, Nevin G. Rezk

Abstract:

Historic paths in today's cities face the pressure of urban development due to rapid urban growth. Every new development tears the old urban fabric and the socio-economic character of the historic paths. Furthermore, in some cases historic paths suffer from negligence and decay. Kom El Dekka historic path was one of those deteriorated paths in the city of Alexandria, Egypt, in spite of its high heritage and socio-economic value. Therefore, there was a need to develop urban regeneration strategies, as part of a wider sustainable development vision, to handle the situation and revitalize the path as a livable space in the heart of the city. This study aims to develop a comprehensive assessment methodology to evaluate the different values of the path and to create a community-oriented and economic-based analysis methodology for its socio-economic values. These analyses and assessments provide strategies for any regeneration action plan for Kom El Dekka historic path.

Keywords: community-oriented, economic-based, syntactical analysis, urban regeneration

Procedia PDF Downloads 404
25105 Periodicity Analysis of Long-Term Water Quality Data Series of the Hungarian Section of the River Tisza Using Morlet Wavelet Spectrum Estimation

Authors: Péter Tanos, József Kovács, Angéla Anda, Gábor Várbíró, Sándor Molnár, István Gábor Hatvani

Abstract:

The River Tisza is the second largest river in Central Europe. In this study, Morlet wavelet spectrum (periodicity) analysis was applied to chemical, biological, and physical water quality data for the Hungarian section of the River Tisza. In the research, 15 water quality parameters measured at 14 sampling sites on the River Tisza and 4 sampling sites on its main artificial channels were assessed for the period 1993-2005. Results show that annual periodicity was not always found in the water quality parameters, at least at certain sampling sites. Periodicity was found to vary over space and time, but in general an increase was observed, accompanying the higher trophic states of the river heading downstream.
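
For readers unfamiliar with the technique, the sketch below runs a Morlet continuous wavelet transform on a synthetic monthly series with a built-in annual cycle and picks out the dominant period. It assumes the PyWavelets package and uses invented data; it is not the Tisza data set or the authors' exact spectrum-estimation settings.

```python
import numpy as np
import pywt

# Synthetic monthly water-quality series (13 years, 1993-2005) with an annual cycle
# plus noise -- illustrative only, not the Tisza data.
n_months = 13 * 12
t = np.arange(n_months)
series = np.sin(2 * np.pi * t / 12) + 0.5 * np.random.default_rng(0).standard_normal(n_months)

# Continuous wavelet transform with a Morlet wavelet (assumes PyWavelets is installed).
scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(series, scales, "morl", sampling_period=1.0)  # period in months = 1/freqs

power = np.abs(coeffs) ** 2
dominant_period = 1.0 / freqs[power.mean(axis=1).argmax()]
print(f"Dominant period = {dominant_period:.1f} months")  # expected near 12
```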

Keywords: annual periodicity, water quality, spatiotemporal variability of periodic behavior, Morlet wavelet spectrum analysis, River Tisza

Procedia PDF Downloads 331
25104 The Current Application of BIM - An Empirical Study Focusing on the BIM-Maturity Level

Authors: Matthias Stange

Abstract:

Building Information Modelling (BIM) is one of the most promising methods in the building design process and plays an important role in the digitalization of the Architectural, Engineering, and Construction (AEC) industry. The application of BIM is seen as the key enabler for increasing productivity in the construction industry. Model-based collaboration using the BIM method is intended to significantly reduce cost increases, schedule delays, and quality problems in the planning and construction of buildings. Numerous qualitative studies based on expert interviews support this theory and report perceived benefits from the use of BIM in terms of achieving project objectives related to cost, schedule, and quality. However, there is a large research gap in analysing quantitative data collected from real construction projects regarding the actual benefits of applying BIM, based on a representative sample size, different application regions, and different project typologies. In particular, the influence of the project-related BIM maturity level is completely unexplored. This research project examines primary data from 105 construction projects worldwide using quantitative research methods. Projects from the areas of residential, commercial, and industrial construction as well as infrastructure and hydraulic engineering were examined in the application regions North America, Australia, Europe, Asia, the MENA region, and South America. First, a descriptive analysis of six independent project variables (BIM maturity level, application region, project category, project type, project size, and BIM level) was carried out using statistical methods. With the help of statistical data analyses, the influence of the project-related BIM maturity level on six dependent project variables (deviation in planning time, deviation in construction time, number of planning collisions, frequency of rework, number of RFIs, and number of changes) was investigated. The study revealed that most of the benefits of using BIM perceived in numerous qualitative studies were not confirmed. The results of the examined sample show that the application of BIM did not have an improving influence on the dependent project variables, especially regarding the quality of the planning itself and adherence to schedule targets. The quantitative research suggests the conclusion that the BIM planning method in its current application has not (yet) led to a recognizable increase in productivity within the planning and construction process. The empirical findings indicate that this is due to the overall low level of BIM maturity in the projects of the examined sample. As a quintessence, the author suggests that the further implementation of BIM should primarily focus on an application-oriented and consistent development of the project-related BIM maturity level instead of implementing BIM for its own sake. Apparently, there are still significant difficulties in the interweaving of people, processes, and technology.
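
The kind of comparison described, the influence of the BIM maturity level on a dependent project variable, could be sketched as below; the records and the choice of a Kruskal-Wallis test are assumptions made for illustration only, not the study's data set or its exact statistical procedure.

```python
import pandas as pd
from scipy import stats

# Hypothetical project records: BIM maturity level vs. schedule deviation (%).
df = pd.DataFrame({
    "bim_maturity": [0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 3, 3],
    "schedule_deviation_pct": [18, 25, 12, 20, 15, 22, 9, 14, 8, 17, 6, 11],
})

# Descriptive summary per maturity level
print(df.groupby("bim_maturity")["schedule_deviation_pct"].describe()[["count", "mean", "std"]])

# Non-parametric test of whether the deviation differs across maturity levels
groups = [g["schedule_deviation_pct"].values for _, g in df.groupby("bim_maturity")]
print(stats.kruskal(*groups))
```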

Keywords: AEC-process, building information modeling, BIM maturity level, project results, productivity of the construction industry

Procedia PDF Downloads 61
25103 A Static and Dynamic Slope Stability Analysis of Sonapur

Authors: Rupam Saikia, Ashim Kanti Dey

Abstract:

Sonapur is an intensely hilly region on the border of Assam and Meghalaya in North-East India. It lies very near a seismic fault named Dauki, which makes the region seismically active; moreover, two earthquakes of magnitude 6.7 and 6.9 struck North-East India in January and April 2016. The slope concerned in this study is adjacent to NH 44, which has long been the sole important connecting link to the states of Manipur and Mizoram along with some parts of Assam, and has therefore been a cause of considerable loss of life and property over past decades, with several recorded incidents of landslides, road blocks, etc., mostly during the rainy season. Based on this issue, this paper reports a static and dynamic slope stability analysis of Sonapur carried out in MIDAS GTS NX. Since the slope is highly inaccessible due to terrain and thick vegetation, in-situ testing was not feasible within the current scope, so disturbed soil samples were collected from the site for the determination of strength parameters. The strength parameters were determined for varying relative density with further variation in water content. The slopes were analyzed under plane strain conditions for three slope heights of 5 m, 10 m, and 20 m, which were further categorized by slope angles of 30, 40, 50, 60, and 70 degrees, covering the possible range of steepness. Static analysis was first performed for the dry state; then, considering the worst case that can develop during the rainy season, the slopes were analyzed for the fully saturated condition along with partial degrees of saturation for a rising waterfront. Furthermore, dynamic analysis was performed, using the El Centro earthquake record (magnitude 6.7, peak ground acceleration of 0.3569g at 2.14 s), for the slopes found to be safe during static analysis under both dry and fully saturated conditions. Some of the conclusions were: slopes with inclinations of 40 degrees and above were found to be highly vulnerable for slope heights of 10 m and above, even under dry static conditions; the maximum horizontal displacement showed an exponential increase with an increase in inclination from 30 to 70 degrees; the vulnerability of the slopes increased further during the rainy season, as even slopes of minimal steepness (30 degrees) and 20 m height were seen to be on the verge of failure; and slopes safe under static analysis were found to be highly vulnerable under dynamic analysis. Lastly, as part of the study, a comparative study of the Strength Reduction Method (SRM) versus the Limit Equilibrium Method (LEM) was carried out, and some of the advantages and disadvantages were identified.
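
As a simple point of reference for the finite-element results discussed above, the sketch below evaluates the classical infinite-slope factor of safety for a dry and a fully saturated case. It is a textbook approximation with hypothetical soil parameters, not the MIDAS GTS NX analysis.

```python
import numpy as np

# Textbook infinite-slope factor of safety as a quick sanity check -- not the
# finite-element SRM/LEM analyses described above. All soil values are hypothetical.

def infinite_slope_fs(c_eff, phi_eff_deg, gamma, z, beta_deg, gamma_w=9.81, m=0.0):
    """FS for an infinite slope; m is the saturated fraction of the failure depth."""
    beta = np.radians(beta_deg)
    phi = np.radians(phi_eff_deg)
    u = m * gamma_w * z * np.cos(beta) ** 2          # pore pressure on the slip plane
    tau = gamma * z * np.sin(beta) * np.cos(beta)    # driving shear stress
    sigma_n = gamma * z * np.cos(beta) ** 2          # total normal stress
    return (c_eff + (sigma_n - u) * np.tan(phi)) / tau

# 10 m deep failure plane on a 40 degree slope, dry vs. fully saturated
print(infinite_slope_fs(c_eff=15.0, phi_eff_deg=30.0, gamma=18.0, z=10.0, beta_deg=40.0, m=0.0))
print(infinite_slope_fs(c_eff=15.0, phi_eff_deg=30.0, gamma=20.0, z=10.0, beta_deg=40.0, m=1.0))
```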

Keywords: dynamic analysis, factor of safety, slope stability, strength reduction method

Procedia PDF Downloads 248
25102 Empirical Green’s Function Technique for Accelerogram Synthesis: The Problem of the Use for Marine Seismic Hazard Assessment

Authors: Artem A. Krylov

Abstract:

Instrumental seismological research in offshore areas is complicated and expensive, which leads to a lack of strong-motion records in most offshore regions. At the same time, the number of offshore industrial infrastructure objects, such as oil rigs and subsea pipelines, is constantly increasing. The empirical Green's function technique has proved to be very effective for accelerogram synthesis under conditions where the seismic wave propagation medium is poorly described. However, the selection of a suitable small-earthquake record to serve as an empirical Green's function in offshore regions is a problem, because seafloor instrumental seismological investigations are usually short and yield only weak micro-earthquake recordings. An approach based on moving-average smoothing in the frequency domain is presented for the preliminary processing of weak micro-earthquake records before they are used as empirical Green's functions. The method results in a significant waveform correction for the modeled event. A case study of the 2009 L'Aquila earthquake is used to demonstrate the suitability of the method. This work was supported by the Russian Foundation for Basic Research (project № 18-35-00474 mol_a).
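
One plausible reading of the preprocessing step, moving-average smoothing of the amplitude spectrum while keeping the phase, is sketched below with NumPy; the synthetic record and the smoothing window length are assumptions, not the author's parameters.

```python
import numpy as np

def smooth_spectrum(waveform, window_pts=11):
    """Moving-average smoothing of the amplitude spectrum; the phase is kept unchanged."""
    spec = np.fft.rfft(waveform)
    amp, phase = np.abs(spec), np.angle(spec)
    kernel = np.ones(window_pts) / window_pts
    amp_smooth = np.convolve(amp, kernel, mode="same")   # running mean in the frequency domain
    return np.fft.irfft(amp_smooth * np.exp(1j * phase), n=len(waveform))

# Hypothetical noisy micro-earthquake record: 20 s sampled at 100 Hz
dt = 0.01
t = np.arange(0, 20, dt)
record = np.exp(-t) * np.sin(2 * np.pi * 5 * t) + 0.2 * np.random.default_rng(0).standard_normal(t.size)
cleaned = smooth_spectrum(record)
```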

Keywords: accelerogram synthesis, empirical Green's function, marine seismology, microearthquakes

Procedia PDF Downloads 308
25101 Electrostatic Cleaning System Integrated with Thunderon Brush for Lunar Dust Mitigation

Authors: Voss Harrigan, Korey Carter, Mohammad Reza Shaeri

Abstract:

Detrimental effects of lunar dust on space hardware, spacesuits, and astronauts' health were already identified during the Apollo missions. Developing effective dust mitigation technologies is critically important for successful space exploration and related NASA missions. In this study, an electrostatic cleaning system (ECS) integrated with a negatively ionized Thunderon brush was developed to mitigate small lunar dust particles with diameters ranging from 0.04 µm to 35 µm and mean and median sizes of 7 µm and 5 µm, respectively. It was found that the frequency pulses of the negative ion generator caused particles to stick to the Thunderon bristles and to be repelled between the pulses. The brush was used manually to ensure that particles were removed from areas where the ECS failed to mitigate the lunar simulant. The acquired data demonstrated that the developed system removed 91-96% of the lunar dust particles. The present study was performed as a proof of concept to enhance the cleaning performance of ECSs by integrating a brushing process. Suggestions were made to further improve the performance of the developed technology in future research.

Keywords: lunar dust mitigation, electrostatic cleaning system, brushing, Thunderon brush, cleaning rate

Procedia PDF Downloads 229
25100 Study of the Dynamic Behavior of Irregular Buildings by the Accelerogram Analysis Method

Authors: Beciri Mohamed Walid

Abstract:

Architectural requirements often lead to building shapes with an irregular distribution of masses, rigidities, and resistances. The main object of the present study is to estimate the influence of irregularity, both in plan and in elevation, on the dynamic characteristics of structures and on their behavior. To do this, the two dynamic methods proposed by RPA99 (the spectral modal method and the accelerogram analysis method) are applied to several similar prototypes, the parameters measuring the response of these structures are analyzed, and the results are compared.

Keywords: structure, irregular, code, seismic, method, force, period

Procedia PDF Downloads 299
25099 Human Resource Management Challenges in Age of Artificial Intelligence: Methodology of Case Analysis

Authors: Olga Leontjeva

Abstract:

In the age of Artificial Intelligence (AI), some organization management approaches need to be adapted or changed. Human Resource Management (HRM) is a part of organization management that is under managers' focus nowadays, because AI integration into organizational activities brings HRM-related challenges. The topic became more significant during the crises experienced by many organizations around the world caused by the coronavirus pandemic (COVID-19). The paper presents the approach that will be used for a study focused on the analysis of various cases. The author of the future study will analyze cases of organizations from Latvia and Spain, grouped by size, type of activity, and area of business. The information for the cases will be collected through structured interviews and online surveys. The main results presented are the questionnaire developed for the study and the definition and description of the sampling. The first round of the survey will be based on convenience sampling, which is the main limitation of the study. To conclude, the approach developed will help to collect valid data if the organizations participating in the survey are ready to share their cases in depth, so that researchers can draw the right conclusions and generalize across the compared organizations' cases. The questionnaire developed for the survey is applicable both for written online data collection and for interviews. The case analysis will help to identify HRM challenges connected to AI integration into organizational activities, such as the management of employees of different generations and the peculiarities of their training.

Keywords: age of artificial intelligence, case analysis, generation Y and Z employees, human resource management

Procedia PDF Downloads 157
25098 Compliance of Systematic Reviews in Ophthalmology with the PRISMA Statement

Authors: Seon-Young Lee, Harkiran Sagoo, Reem Farwana, Katharine Whitehurst, Alex Fowler, Riaz Agha

Abstract:

Background/Aims: Systematic reviews and meta-analyses are an increasingly important way of summarizing research evidence. Research in ophthalmology may present further challenges due to the potential complexity of study designs. The aim of our study was to determine the reporting quality of systematic reviews and meta-analyses in ophthalmology against the PRISMA statement, by assessing articles published between 2010 and 2015 in the five journals with the highest impact factor. Methods: MEDLINE and EMBASE were used to search for systematic reviews published between January 2010 and December 2015 in 5 major ophthalmology journals: Progress in Retinal and Eye Research, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology, and Journal of the American Optometric Association. Screening, identification, and scoring of articles were performed independently by two teams, followed by statistical analysis including the median, range, and 95% CIs. Results: 115 articles were included. The median PRISMA score was 15 of 27 items (56%), with a range of 5-26 (19-96%) and 95% CI of 13.9-16.1 (51-60%). Compliance was highest for items related to the description of the rationale (item 3, 100%) and the inclusion of a structured summary in the abstract (item 2, 90%), and poorest for indication of a review protocol and registration (item 5, 9%), specification of the risk of bias affecting the cumulative evidence (item 15, 24%), and description of clear objectives in the introduction (item 4, 26%). Conclusion: The reporting quality of systematic reviews and meta-analyses in ophthalmology needs significant improvement. While the use of the PRISMA criteria as a guideline before journal submission is recommended, additional research identifying potential barriers may be required to improve compliance with the PRISMA guidelines.

Keywords: systematic reviews, meta-analysis, research methodology, reporting quality, PRISMA, ophthalmology

Procedia PDF Downloads 251
25097 Strategic Evaluation of Existing Drainage System in Apalit, Pampanga

Authors: Jennifer de Jesus, Ares Baron Talusan, Steven Valerio

Abstract:

This paper aims to conduct an evaluation of the drainage system in a specific village in Apalit, Pampanga, using a geographic information system to easily identify inadequate drainage lines that need rehabilitation to aid with the flooding problem in the area. The researchers utilized two methods and one software package to strategically assess each drainage line in the village: the two methods were the rational method and Manning's formula for open channel flow, which were compared against each other, and the software used was Google Earth Pro by 2020 Google LLC. The results must satisfy the condition Q_Manning > Q_Rational for a specific line and section to be considered adequate; otherwise, it is inadequate, and the dimensions are recomputed until it becomes adequate. The software was used to visualize the data obtained from the computations, clearly showing in which areas the drainage lines were adequate or not. The researchers concluded that the drainage system should be considered inadequate, as most of the lines are unable to accommodate certain rainfall intensities, and that line rehabilitation is a must.
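
The adequacy criterion Q_Manning > Q_Rational can be illustrated for a single rectangular section as follows; all catchment and channel values in the sketch are hypothetical and do not come from the as-built plans studied in the paper.

```python
import math

# Adequacy check for one rectangular drainage section, in the spirit of the
# Q_Manning > Q_Rational criterion above. All catchment and section values are hypothetical.

# Rational method: Q = C * i * A  (i in m/s, A in m^2, Q in m^3/s)
C = 0.80                       # runoff coefficient, built-up area
i = 120.0 / 1000 / 3600        # 120 mm/hr design intensity converted to m/s
A = 2.5 * 10_000               # 2.5 ha contributing area in m^2
Q_rational = C * i * A

# Manning's equation for full-flow capacity of a rectangular channel:
# Q = (1/n) * A_flow * R^(2/3) * S^(1/2)
n = 0.013                      # concrete lining
b, d = 0.60, 0.60              # width and depth, m
S = 0.002                      # longitudinal slope
A_flow = b * d
R = A_flow / (b + 2 * d)       # hydraulic radius = flow area / wetted perimeter
Q_manning = (1.0 / n) * A_flow * R ** (2.0 / 3.0) * math.sqrt(S)

print(f"Q_rational = {Q_rational:.3f} m3/s, Q_manning = {Q_manning:.3f} m3/s")
print("adequate" if Q_manning > Q_rational else "inadequate -> resize the section")
```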

Keywords: strategic evaluation, drainage system, as-built plans, inadequacy, rainfall intensity-duration-frequency data, rational method, Manning's equation for open channel flow

Procedia PDF Downloads 109
25096 Regionalization of IDF Curves with L-Moments for Storm Events

Authors: Noratiqah Mohd Ariff, Abdul Aziz Jemain, Mohd Aftar Abu Bakar

Abstract:

The construction of Intensity-Duration-Frequency (IDF) curves is one of the most common and useful tools for designing hydraulic structures and for providing a mathematical relationship between rainfall characteristics. IDF curves, especially those in Peninsular Malaysia, are often built using moving windows of rainfall. However, these windows do not represent actual rainfall events, since the duration of rainfall is prefixed. Hence, instead of using moving windows, this study aims to find regionalized distributions for IDF curves of extreme rainfall based on storm events. A homogeneity test is performed on the annual maxima of storm intensities to identify homogeneous regions of storms in Peninsular Malaysia. The L-moment method is then used to regionalize the Generalized Extreme Value (GEV) distribution of these annual maxima, and IDF curves are subsequently constructed using the regional distributions. The differences between the IDF curves obtained and the IDF curves found using at-site GEV distributions are examined through the coefficient of variation of the root mean square error, the mean percentage difference, and the coefficient of determination. The small differences imply that the construction of IDF curves can be simplified by finding a general probability distribution for each region. This will also help in constructing IDF curves for sites with no rainfall station.
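
A hedged sketch of the two ingredients named above, sample L-moments (via unbiased probability-weighted moments) and a GEV fit for comparison, is shown below; the annual maxima are simulated, and the regionalization step itself (pooling homogeneous sites) is not reproduced.

```python
import numpy as np
from scipy import stats

def sample_l_moments(x):
    """First two L-moments and the L-skewness of a sample (unbiased PWM estimators)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2   # mean, L-scale, L-skewness (tau_3)

# Hypothetical annual maximum storm intensities (mm/hr) for one site
rng = np.random.default_rng(2)
annmax = stats.genextreme.rvs(c=-0.1, loc=60, scale=15, size=40, random_state=rng)

print("sample L-moments:", np.round(sample_l_moments(annmax), 3))
# At-site GEV fit (maximum likelihood) for comparison with a regional L-moment fit
shape, loc, scale = stats.genextreme.fit(annmax)
print("at-site GEV fit: shape=%.3f loc=%.2f scale=%.2f" % (shape, loc, scale))
```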

Keywords: IDF curves, L-moments, regionalization, storm events

Procedia PDF Downloads 509
25095 Hydroclean Smartbin Solution for Plastic Pollution Crisis

Authors: Anish Bhargava

Abstract:

By 2050, there will be more plastic than fish in our oceans. 51 trillion micro-plastics pollute our waters and contaminate the food on our plates, increasing the risk of tumours and diseases such as cancer. Our product is a solution to the ever-growing problem of plastic pollution. We call it the SmartBin. The SmartBin is a cylindrical device which will float just below the surface of the water, able to move with the aid of 4 water thrusters situated on the sides. As it floats, our SmartBin will suck water into itself and pump it out through the bottom. All waste is collected into a reusable filter including microplastics measuring down to 1.5mm. A speaker emitting sound at a frequency of 9 hertz ensures marine life stays away from the SmartBin. Featured along with our product is a smartphone app which will enable the user to designate an area for the SmartBin to cover on a satellite image. The SmartBin will then return to its start position near the shore, configured through the app. As global pressure to tackle water pollution continues to increase, environmental spending increases too. As our product provides an effective solution to this issue, we can seize the opportunity and scale our company. Our product is unparalleled. It can move at a high speed, covering a wide area rather than being restricted to one position. We target not only oceans and sea-shores, but also rivers, lakes, reservoirs and canals, as they are much easier to access and control.

Keywords: water, plastic, pollution, solution, hydroclean, smartbin, cleanup

Procedia PDF Downloads 196
25094 The Net as a Living Experience of Distance Motherhood within Italian Culture

Authors: C. Papapicco

Abstract:

Motherhood is an existential human relationship that lasts a whole lifetime and is always interwoven with subjectivity and culture. As a result of the brain drain, motherhood becomes motherhood at a distance. Starting from the hypothesis that the re-signification of mothering-at-a-distance practices is culturally relevant, the research aims to understand the experience of mothering at a distance in order to extrapolate strategies for managing the empty nest. Specifically, the research aims to examine the experience of a brain drain mother who created a blog intended to support other parents at a distance. At present, the blog is the only artifact symbolic of the Italian culture of motherhood at a distance. In the research, a netnographic analysis of the blog mammedicervelliinfuga.com is offered with the aim of understanding whether the online world becomes an opportunity to manage the role of mother at a distance. A narrative interview with the blog creator was conducted, and the texts were then analyzed by means of a diatextual analysis approach. It emerged that the migration projects of talented children take on different meanings and representations for parents. Thus, it is shown that the blog becomes a new form of understanding and practicing motherhood at a distance.

Keywords: brain drain, diatextual analysis, distance motherhood blog, online and offline narrations

Procedia PDF Downloads 120
25093 Flow: A Fourth Musical Element

Authors: James R. Wilson

Abstract:

Music is typically defined as having the attributes of melody, harmony, and rhythm. In this paper, a fourth element is proposed: 'flow'. 'Flow' is a dimension of music that has always been present but has only recently been identified and measured. The Adagio 'Flow Machine' enables us to envision this component and even suggests a new approach to music theory and analysis. The Adagio was created specifically to measure the underlying 'flow' in music. It offers an entirely new way to experience and visualize music, to assist in performing music (as a conductor and/or performer), and to provide a whole new methodology for music analysis and theory. The Adagio utilizes musical 'hit points', such as a transition from one musical section to another (for example, in a composition in sonata form, the transition from the exposition to the development section), to help define the composition's flow rate. Once the flow rate is established, the Adagio can be used to determine whether the composer/performer/conductor has correctly maintained the proper rate of flow throughout the performance. An example is provided using Mozart's Piano Concerto No. 21. Working with the Adagio yielded an unexpected windfall: an empirical study conducted at Nova University's biofeedback lab found that watching the Adagio helped volunteers participating in a controlled experiment recover from stressors significantly faster than the control group. The Adagio can be thought of as a new arrow in the musicologist's quiver. It provides a new, unique way of viewing the psychological impact and esthetic effectiveness of music composition. Additionally, with current worldwide access to multimedia via the internet, flow analysis can be performed and shared with others with little time and/or expense.

Keywords: musicology, music analysis, music flow, music therapy

Procedia PDF Downloads 159
25092 Value Engineering and Its Impact on Drainage Design Optimization for Penang International Airport Expansion

Authors: R.M. Asyraf, A. Norazah, S.M. Khairuddin, B. Noraziah

Abstract:

Designing a system at present is a vital, challenging task: the design philosophy must be maintained in economical ways. This paper examines the value engineering (VE) approach applied to infrastructure works, namely stormwater drainage. The method was adopted after the consultants had completed the detailed design. A Function Analysis System Technique (FAST) diagram and the VE job plan (information, function analysis, creative judgement, development, and recommendation phases) are used to scrutinize the initial design of the stormwater drainage. An estimated cost reduction of 2% over the initial proposal was obtained using the VE approach. This cost reduction comes from design optimization of the drainage foundation and structural system, where the pile design and drainage base structure were optimized. Likewise, the design of the on-site detention (OSD) tank pump was revised and contributed to the cost reduction obtained. This case study shows that the VE approach can be an important tool in optimizing a design to reduce costs.

Keywords: value engineering, function analysis system technique, stormwater drainage, cost reduction

Procedia PDF Downloads 132
25091 Landmark Based Catch Trends Assessment of Gray Eel Catfish (Plotosus canius) at Mangrove Estuary in Bangladesh

Authors: Ahmad Rabby

Abstract:

The present study emphasizes the catch trends assessment of gray eel catfish (Plotosus canius), scrutinized on the basis of monthly length-frequency data collected from a mangrove estuary in Bangladesh during January 2017 to December 2018. A total of 1298 specimens were collected; the total length (TL) and weight (W) of P. canius ranged from 13.3 cm to 87.4 cm and 28 g to 5200 g, respectively. The length-weight relationship was W = 0.006 L^2.95 with R² = 0.972 for both sexes. The von Bertalanffy growth function parameters were L∞ = 93.25 cm and K = 0.28 yr⁻¹, with a hypothetical age at zero length of t0 = 0.059 years and a goodness of fit of Rn = 0.494. The growth performance indices for L∞ and W∞ were computed as Φ' = 3.386 and Φ = 1.84, respectively. The size at first sexual maturity was estimated as 48.8 cm TL for pooled sexes. Natural mortality was 0.51 yr⁻¹ at an average annual water surface temperature of 22 °C. Total instantaneous mortality was 1.24 yr⁻¹ at a 95% CI of 0.105-1.42 (r² = 0.986), while fishing mortality was 0.73 yr⁻¹ and the current exploitation ratio 0.59. Recruitment continued throughout the year with one major peak of 17.20-17.96% during May-June. The Beverton-Holt yield-per-recruit model was analyzed with FiSAT-II; with tc at 1.43 yr, Fmax was estimated as 0.6 yr⁻¹ and F0.1 as 0.33 yr⁻¹. The current age at first capture is approximately 0.6 year, yet Fcurrent = 0.73 yr⁻¹, which is beyond F0.1, indicating that the current stock of P. canius in Bangladesh is overexploited.
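
The derived quantities reported above follow directly from the fitted parameters; the short check below reproduces Φ' = log10 K + 2 log10 L∞, F = Z − M, and E = F/Z from the stated values.

```python
import math

# Reproducing the abstract's derived quantities from its reported parameters.
L_inf, K, t0 = 93.25, 0.28, 0.059   # von Bertalanffy: L(t) = L_inf * (1 - exp(-K*(t - t0)))
Z, M = 1.24, 0.51                   # total and natural mortality (per year)

phi_prime = math.log10(K) + 2 * math.log10(L_inf)   # growth performance index
F = Z - M                                           # fishing mortality
E = F / Z                                           # exploitation ratio

print(f"phi' = {phi_prime:.3f}")        # ~3.386, matching the abstract
print(f"F = {F:.2f} /yr, E = {E:.2f}")  # ~0.73 /yr and ~0.59

# Length at an illustrative age of 2 years from the fitted growth curve
t = 2.0
print(f"L(2 yr) = {L_inf * (1 - math.exp(-K * (t - t0))):.1f} cm")
```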

Keywords: Plotosus canius, mangrove estuary, asymptotic length, FiSAT-II

Procedia PDF Downloads 140
25090 Reconstructability Analysis for Landslide Prediction

Authors: David Percy

Abstract:

Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data that works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications such as landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, continuous data such as porosity must be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, the independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as the primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields similar accuracy with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and their probability of landslide occurrence. In this way, every informative combination of states can be examined.
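
The full RA machinery is not reproduced here, but its information-theoretic flavour, asking how much uncertainty about the DV is removed by different combinations of discretized IVs, can be sketched as below on a toy raster-like table; the layers and probabilities are invented for illustration only.

```python
import numpy as np
import pandas as pd

# Toy illustration of the idea behind RA model search: compare how much uncertainty
# about the DV (landslide) is removed by different IV combinations of discretized layers.
# This captures only the information-theoretic flavour of RA, not the full methodology.

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def info_gain(df, ivs, dv="landslide"):
    h_dv = entropy(df[dv].value_counts(normalize=True).values)
    table = pd.crosstab([df[c] for c in ivs], df[dv])
    h_cond = 0.0
    for _, row in table.iterrows():
        p_group = row.sum() / len(df)
        h_cond += p_group * entropy((row / row.sum()).values)
    return h_dv - h_cond

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "soil": rng.integers(0, 3, 500),        # discretized soil class (hypothetical layer)
    "slope_bin": rng.integers(0, 4, 500),   # binned slope angle (hypothetical layer)
})
probs = 0.1 + 0.1 * df["slope_bin"].to_numpy()
df["landslide"] = (rng.random(500) < probs).astype(int)

for ivs in [("soil",), ("slope_bin",), ("soil", "slope_bin")]:
    print(ivs, round(info_gain(df, ivs), 4))
```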

Keywords: reconstructability analysis, machine learning, landslides, raster analysis

Procedia PDF Downloads 47
25089 Evolution of DNA-Binding With-One-Finger Transcriptional Factor Family in Diploid Cotton Gossypium raimondii

Authors: Waqas Shafqat Chattha, Muhammad Iqbal, Amir Shakeel

Abstract:

Transcription factors are proteins that play a vital role in regulating the transcription of target genes in different biological processes and are being widely studied in different plant species. In the current era of genomics, plant genome sequencing has led to the genome-wide identification, analysis, and categorization of diverse transcription factor families and hence provides key insights into their structural as well as functional diversity. The DNA-binding with One Finger (DOF) proteins belong to the C2-C2-type zinc finger protein family. DOF proteins are plant-specific transcription factors implicated in diverse functions including seed maturation and germination, phytohormone signalling, light-mediated gene regulation, cotton-fiber elongation, and plant responses to biotic as well as abiotic stresses. In this context, a genome-wide in-silico analysis of the DOF TF family in the diploid cotton species Gossypium raimondii enabled us to identify 55 non-redundant genes encoding DOF proteins, renamed GrDofs (Gossypium raimondii Dof). Gene distribution studies showed that the GrDof genes are unevenly distributed across 12 of the 13 G. raimondii chromosomes. Gene structure analysis illustrated that 34 of the 55 GrDof genes are intron-less, while the remaining 21 genes have a single intron. Protein sequence-based phylogenetic analysis of the 55 putative GrDOFs divided these proteins into 5 major groups with various paralogous gene pairs. Molecular evolutionary studies, aided by conserved domain as well as gene structure analysis, suggested that segmental duplications were the principal contributors to the expansion of Dof genes in G. raimondii.

Keywords: diploid cotton, G. raimondii, phylogenetic analysis, transcription factor

Procedia PDF Downloads 131
25088 Digital Marketing Maturity Models: Overview and Comparison

Authors: Elina Bakhtieva

Abstract:

The variety of available digital tools, strategies, and activities can confuse and disorient even an experienced marketer. This applies in particular to B2B companies, which are usually less flexible in the uptake of digital technology than B2C companies. B2B companies lack a framework that corresponds to the specifics of the B2B business and helps them to evaluate their capabilities and to choose an appropriate path. A B2B digital marketing maturity model helps to fill this gap. However, modern marketing offers no widely approved digital marketing maturity model, and thus some marketing institutions provide their own tools. The purpose of this paper is to build an optimized B2B digital marketing maturity model based on a SWOT (strengths, weaknesses, opportunities, and threats) analysis of existing models. The current study provides an analytical review of the existing openly accessible digital marketing maturity models. The results of the research are twofold. First, the SWOT analysis outlines the main advantages and disadvantages of the existing models. Second, the strengths of the existing digital marketing maturity models help to identify the main characteristics and the structure of an optimized B2B digital marketing maturity model. The research findings indicate that only one out of the three analyzed models could be used as a stand-alone tool. This study is among the first to examine the use of maturity models in digital marketing. It helps businesses to choose the most effective of the existing digital marketing maturity models, and it creates a base for future research on digital marketing maturity models. This study contributes to the emerging B2B digital marketing literature by providing a SWOT analysis of the existing digital marketing maturity models and by suggesting the structure and main characteristics of an optimized B2B digital marketing maturity model.

Keywords: B2B digital marketing strategy, digital marketing, digital marketing maturity model, SWOT analysis

Procedia PDF Downloads 324
25087 Application of Neural Network in Portfolio Product Companies: Integration of Boston Consulting Group Matrix and Ansoff Matrix

Authors: M. Khajezadeh, M. Saied Fallah Niasar, S. Ali Asli, D. Davani Davari, M. Godarzi, Y. Asgari

Abstract:

This study explores the joint application of the Boston and Ansoff matrices in the operational development of a product. We conduct a deep analysis, utilizing an artificial neural network, to predict the position of the product in the market while the company is interested in increasing its share. The data were gathered from two industries, hygiene and detergents. In doing so, the effort is made by investigating the behavior of top player companies and recommending strategic orientations. In conclusion, this combined analysis is appropriate for operational development and plays an important role in determining the position of the product in the market for both the hygiene and detergent industries. More importantly, it elaborates on the company's strategies for increasing its market share through a combination of the Boston Consulting Group (BCG) matrix and the Ansoff matrix.
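
A minimal sketch of the idea, training a small neural network to place a product in a BCG-style quadrant from relative market share and market growth, is shown below; the features, labels, and network size are assumptions made for illustration and are not the study's data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical sketch: predict a product's BCG quadrant from two matrix features,
# relative market share and market growth rate. Labels: 0=dog, 1=cash cow,
# 2=question mark, 3=star. Data are synthetic placeholders, not the study's dataset.
rng = np.random.default_rng(4)
share = rng.uniform(0.1, 2.0, 300)      # relative market share
growth = rng.uniform(-5, 25, 300)       # market growth, %
quadrant = (share > 1.0).astype(int) + 2 * (growth > 10).astype(int)

X = np.column_stack([share, growth])
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
).fit(X, quadrant)

print(clf.predict([[1.4, 18.0], [0.3, 2.0]]))   # expected: star-like (3), dog-like (0)
```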

Keywords: artificial neural network, portfolio analysis, BCG matrix, Ansoff matrix

Procedia PDF Downloads 128
25086 Potential and Techno-Economic Analysis of Hydrogen Production from Portuguese Solid Recovered Fuels

Authors: A. Ribeiro, N. Pacheco, M. Soares, N. Valério, L. Nascimento, A. Silva, C. Vilarinho, J. Carvalho

Abstract:

Hydrogen will play a key role in changing the current global energy paradigm, which is associated with the high use of fossil fuels and the release of greenhouse gases. This work intended to identify and quantify the potential of solid recovered fuels (SRF) existing in Portugal and to project the cost of hydrogen produced through their steam gasification in different scenarios, associated with the size or capacity of the plant and the existence of carbon capture and storage (CCS) systems. Therefore, a techno-economic analysis was performed using an ASPEN-based model, the H2A Hydrogen Production Model Version 3.2018. Regarding SRF production, an annual production of more than 200 thousand tons of SRF in Portugal in 2019 was verified. The results of the techno-economic simulations showed that in the scenarios using high (200,000 tons/year) and medium (40,000 tons/year) amounts of SRF, the cost of hydrogen production was competitive with current hydrogen prices. The results indicate that scenarios 1 and 2, which use 200,000 tons of SRF per year, have the lowest hydrogen production costs, 1.22 USD/kg H2 and 1.63 USD/kg H2, respectively. The cost of producing hydrogen without carbon capture and storage (CCS) from a medium amount of SRF (40,000 tons/year) was 1.70 USD/kg H2. In turn, scenarios 5 (without CCS) and 6 (with CCS), which use only 683 tons of SRF from urban sources, have the highest costs, 6.54 USD/kg H2 and 908.97 USD/kg H2, respectively. Therefore, it was possible to conclude that there is a huge potential for the use of SRF for hydrogen production through steam gasification in Portugal.

Keywords: gasification, hydrogen, solid recovered fuels, techno-economic analysis, waste-to-energy

Procedia PDF Downloads 109
25085 A Method for False Alarm Recognition Based on Multi-Classification Support Vector Machine

Authors: Weiwei Cui, Dejian Lin, Leigang Zhang, Yao Wang, Zheng Sun, Lianfeng Li

Abstract:

Built-in test (BIT) is an important technology in the testability field and is widely used in state monitoring and fault diagnosis. With the improvement of modern equipment performance and complexity, the scope of BIT becomes larger, which leads to the emergence of the false alarm problem. False alarms make the health assessment unstable and reduce the effectiveness of BIT. Conventional false alarm suppression methods such as repeated testing and majority voting cannot meet the requirements of a complicated system, so intelligent algorithms such as artificial neural networks (ANN) are widely studied and used. However, false alarms have a very low frequency and a small sample size, yet a method based on an ANN requires a large training sample. To recognize false alarms, we propose a method based on a multi-classification support vector machine (SVM) in this paper. Firstly, we divide the state of a system into three states: healthy, false-alarm, and faulty. Then we use multi-classification with a '1 vs 1' policy to train and recognize the state of the system. Finally, an example of a fault injection system is taken to verify the effectiveness of the proposed method by comparison with an ANN. The results show that the method is reasonable and effective.
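
A minimal sketch of the '1 vs 1' multi-class SVM idea on synthetic three-state data (healthy, false alarm, faulty) is given below; scikit-learn's SVC trains one binary classifier per class pair by default, and all data here are invented rather than taken from the paper's fault-injection system.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Minimal sketch of the '1 vs 1' multi-class SVM idea on synthetic BIT-style data;
# features and labels are hypothetical, not the paper's fault-injection dataset.
rng = np.random.default_rng(5)
n = 300
X = np.vstack([
    rng.normal(0.0, 0.3, (n, 2)),          # healthy
    rng.normal([1.5, 0.0], 0.3, (n, 2)),   # false alarm
    rng.normal([0.0, 1.5], 0.3, (n, 2)),   # faulty
])
y = np.repeat([0, 1, 2], n)                # 0=healthy, 1=false alarm, 2=faulty

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# SVC trains one binary SVM per class pair ('1 vs 1') and votes among them.
clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```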

Keywords: false alarm, fault diagnosis, SVM, k-means, BIT

Procedia PDF Downloads 140