Search results for: isolation forest method
14180 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking
Authors: Noga Bregman
Abstract:
Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. 
The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
Keywords: earthquake, detection, phase picking, s waves, p waves, transformer, deep learning, seismic waves
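The abstract above trains with a weighted combination of binary cross-entropy losses, one per decoder head (detection, P-pick, S-pick). A minimal stdlib-only sketch of combining such a multi-task loss follows; the task weights, function names, and toy probabilities are illustrative assumptions, not the authors' actual values:

```python
import math

def bce(y_true, y_pred, eps=1e-7):
    # mean binary cross-entropy for one task's sample-level labels
    total = 0.0
    for t, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)  # clip for numerical stability
        total += -(t * math.log(p) + (1 - t) * math.log(1.0 - p))
    return total / len(y_true)

def multitask_loss(targets, preds, weights):
    # weighted combination over the detection, P-pick and S-pick heads
    return sum(weights[task] * bce(targets[task], preds[task]) for task in targets)

# toy batch: labels and predicted probabilities per head (illustrative only)
targets = {"detection": [1, 0], "p_pick": [1, 0], "s_pick": [0, 1]}
preds = {"detection": [0.9, 0.1], "p_pick": [0.8, 0.2], "s_pick": [0.3, 0.7]}
weights = {"detection": 0.5, "p_pick": 0.25, "s_pick": 0.25}  # assumed weights
loss = multitask_loss(targets, preds, weights)
```

In a real training loop the same weighted sum would be applied to the network outputs at every batch; here it simply shows how the three per-task losses are blended into one scalar objective.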
Procedia PDF Downloads 52
14179 Comparison between Two Software Packages GSTARS4 and HEC-6 about Prediction of the Sedimentation Amount in Dam Reservoirs and to Estimate Its Efficient Life Time in the South of Iran
Authors: Fatemeh Faramarzi, Hosein Mahjoob
Abstract:
Building dams on rivers for the utilization of water resources disturbs the hydrodynamic equilibrium and causes all or part of the sediments carried by the water to settle in the dam reservoir. This phenomenon also has significant impacts on the water and sediment flow regime and, in the long term, can cause morphological changes in the environment surrounding the river, reducing the useful life of the reservoir, which threatens sustainable development through inefficient management of water resources. In the past, empirical methods were used to predict the sedimentation amount in dam reservoirs and to estimate their efficient lifetime, but recently mathematical and computational models have become widely used as a suitable tool in reservoir sedimentation studies. These models usually solve the governing equations using the finite element method. This study compares the results from two software packages, GSTARS4 and HEC-6, in predicting the sedimentation amount in the Dez Dam, southern Iran. Each model provides a one-dimensional, steady-state simulation of sediment deposition and erosion by solving the equations of momentum, flow and sediment continuity, and sediment transport. GSTARS4 (Generalized Sediment Transport Model for Alluvial River Simulation) is a one-dimensional mathematical model that simulates bed changes in both longitudinal and transverse directions by using flow tubes in a quasi-two-dimensional scheme; it was used to calibrate a 47-year period and to forecast the next 47 years of sedimentation in the Dez Dam. This dam is among the highest dams in the world (203 m height), irrigates more than 125,000 hectares of downstream land, and plays a major role in flood control in the region. The input data, including geometry, hydraulic, and sedimentary data, cover the period from 1955 to 2003 on a daily basis. To predict future river discharge, the time series data were assumed to repeat after 47 years.
Finally, the obtained results were very satisfactory in the delta region, where the output from GSTARS4 was almost identical to the hydrographic profile measured in 2003. In the Dez Dam, because of the long reservoir (65 km) and its large storage volume, vertical currents are dominant, which makes the calculations by the above-mentioned method inaccurate. To solve this problem, we used the empirical reduction method to calculate the sedimentation in the downstream area, which yielded very good results. Thus, we demonstrated that by combining these two methods, a very suitable model of sedimentation in the Dez Dam for the study period can be obtained. The present study also showed that the outputs of both methods agree.
Keywords: Dez Dam, prediction, sedimentation, water resources, computational models, finite element method, GSTARS4, HEC-6
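The reservoir-lifetime question the abstract addresses can be reduced, in highly idealized form, to a year-by-year sediment mass balance. The sketch below assumes a constant trap efficiency and dead-storage fraction, which is a deliberate simplification; real studies, like the one above, vary trap efficiency with the capacity/inflow ratio and use calibrated models or the empirical reduction method. All numbers are illustrative:

```python
def reservoir_useful_life(capacity_m3, annual_sediment_inflow_m3,
                          trap_efficiency=0.9, dead_fraction=0.2):
    # Year-by-year mass balance: each year the reservoir traps a fraction of
    # the incoming sediment load; the useful life ends once live storage has
    # fallen to the dead-storage fraction of the original capacity.
    remaining = capacity_m3
    dead = dead_fraction * capacity_m3
    years = 0
    while remaining > dead:
        remaining -= trap_efficiency * annual_sediment_inflow_m3
        years += 1
    return years, remaining
```

For example, a reservoir of 100 units receiving 1 unit of sediment per year with perfect trapping loses its live storage (above a 20% dead pool) in 80 years.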
Procedia PDF Downloads 313
14178 The Relation between the Organizational Trust Level and Organizational Justice Perceptions of Staff in Konya Municipality: A Theoretical and Empirical Study
Authors: Handan Ertaş
Abstract:
The aim of the study is to determine the relationship between the organizational trust level and the organizational justice perceptions of municipality officials. A correlational method was used via a descriptive survey model, and the Organizational Justice Perception Scale, the Organizational Trust Inventory, and the Interpersonal Trust Scale were applied to 353 participants who work in Konya Metropolitan Municipality and central district municipalities. Frequency statistics, independent-samples t tests for binary groups, one-way ANOVA for multiple groups, and Pearson correlation analysis were used in the data analysis process. The study found that participants have a high level of organizational trust, that "Interpersonal Trust" ranks first, and that there is a significant difference in favor of male officials in terms of Trust in the Organization Itself and Interpersonal Trust. It was also found that officials in district municipalities have a higher perception level in all dimensions, that there is a significant difference in the Trust in the Organization sub-dimension, and that work status is an important factor in organizational trust perception. Moreover, the study has shown that organizational justice implementations are important in raising officials' trust in the organization, administrators, and colleagues, and that there is a parallel relation between Organizational Justice perceptions and Organizational Trust dimensions.
Keywords: organizational trust level, organizational justice perceptions, staff, Konya
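The Pearson correlation analysis named in the abstract above can be reproduced in a few lines of standard-library Python; a minimal sketch, with sample data invented purely for illustration:

```python
import math

def pearson_r(x, y):
    # sample Pearson correlation coefficient between two equal-length series
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# illustrative: trust scores vs. justice-perception scores for five respondents
trust = [3.1, 4.0, 2.5, 4.4, 3.8]
justice = [2.9, 4.2, 2.6, 4.1, 3.5]
r = pearson_r(trust, justice)
```

A value of r near +1 would indicate the "parallel relation" between justice perceptions and trust dimensions that the study reports.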
Procedia PDF Downloads 347
14177 The Significance of Awareness about Gender Diversity for the Future of Work: A Multi-Method Study of Organizational Structures and Policies Considering Trans and Gender Diversity
Authors: Robin C. Ladwig
Abstract:
The future of work is becoming less predictable, which requires increasing the adaptability of organizations to social and work changes. Society is transforming regarding gender identity in the sense that more people come forward to identify as trans and gender diverse (TGD). Organizations that lack inclusive organizational structures are ill-equipped to provide a safe and encouraging work environment. This qualitative multi-method research about TGD inclusivity in the workplace explores the enablers of, and barriers to, TGD individuals engaging satisfactorily in the work environment and organizational culture. Furthermore, these TGD insights are analyzed with respect to their organizational implications and awareness from a leadership and management perspective. The semi-structured online interviews with TGD individuals and the photo-elicitation open-ended questionnaire addressed to leadership and management in diversity, career development, and human resources were analyzed with a critical grounded theory approach. Findings demonstrated the significance of TGD voices, the support of leadership and management, as well as the synergy between voices and leadership. Hence, the study indicates practical implications such as the revision of exclusive language used in policies, data collection, or communication, and the reconsideration of organizational decision-making by leaders to include TGD voices.
Keywords: future of work, occupational identity, organisational decision-making, trans and gender diverse identity
Procedia PDF Downloads 127
14176 Model Order Reduction of Complex Airframes Using Component Mode Synthesis for Dynamic Aeroelasticity Load Analysis
Authors: Paul V. Thomas, Mostafa S. A. Elsayed, Denis Walch
Abstract:
Airframe structural optimization at different design stages results in new mass and stiffness distributions which modify the critical design loads envelope. Determination of aircraft critical loads is an extensive analysis procedure which involves simulating the aircraft at thousands of load cases as defined in the certification requirements. It is computationally prohibitive to use a Global Finite Element Model (GFEM) for the load analysis, hence reduced order structural models are required which closely represent the dynamic characteristics of the GFEM. This paper presents the implementation of the Component Mode Synthesis (CMS) method for the generation of high fidelity Reduced Order Models (ROMs) of complex airframes. Here, a sub-structuring technique is used to divide the complex higher order airframe dynamical system into a set of subsystems. Each subsystem is reduced to fewer degrees of freedom using matrix projection onto a carefully chosen reduced order basis subspace. The reduced structural matrices are assembled for all the subsystems through interface coupling, and the dynamic response of the total system is solved. The CMS method is employed to develop the ROM of a Bombardier Aerospace business jet, which is coupled with an aerodynamic model for dynamic aeroelasticity loads analysis under gust turbulence. Another set of dynamic aeroelastic loads is also generated employing a stick model of the same aircraft; the stick model is the reduced order modelling methodology commonly used in the aerospace industry, based on stiffness generation by unitary loading application. The extracted aeroelastic loads from both models are compared against those generated employing the GFEM. Critical loads, modal participation factors, and modal characteristics of the different ROMs are investigated and compared against those of the GFEM.
Results obtained show that the ROM generated using the Craig-Bampton CMS reduction process has superior dynamic characteristics compared to the stick model.
Keywords: component mode synthesis, Craig-Bampton reduction method, dynamic aeroelasticity analysis, model order reduction
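The matrix projection onto a reduced basis described above can be illustrated by its simplest relative, Guyan (static) condensation, which eliminates interior degrees of freedom while keeping boundary (interface) DOFs; Craig-Bampton augments this static basis with fixed-interface normal modes. A stdlib-only sketch, with a two-spring example invented for illustration:

```python
def solve(A, B):
    # solve A X = B (B given as a matrix of columns) by Gauss-Jordan
    # elimination with partial pivoting
    n, m = len(A), len(B[0])
    M = [row_a[:] + row_b[:] for row_a, row_b in zip(A, B)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                for c in range(col, n + m):
                    M[r][c] -= f * M[col][c]
    return [[M[r][n + c] / M[r][r] for c in range(m)] for r in range(n)]

def guyan_condense(K, b_idx, i_idx):
    # static condensation K_red = K_bb - K_bi K_ii^{-1} K_ib, retaining only
    # the boundary DOFs b_idx and eliminating the interior DOFs i_idx
    Kbb = [[K[r][c] for c in b_idx] for r in b_idx]
    Kbi = [[K[r][c] for c in i_idx] for r in b_idx]
    Kib = [[K[r][c] for c in b_idx] for r in i_idx]
    Kii = [[K[r][c] for c in i_idx] for r in i_idx]
    X = solve(Kii, Kib)  # X = K_ii^{-1} K_ib
    nb, ni = len(b_idx), len(i_idx)
    return [[Kbb[r][c] - sum(Kbi[r][k] * X[k][c] for k in range(ni))
             for c in range(nb)] for r in range(nb)]
```

For two springs of stiffness 2 in series, condensing out the middle DOF recovers the familiar series stiffness 2·2/(2+2) = 1, the exact static behaviour seen at the boundary.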
Procedia PDF Downloads 209
14175 Detection of Important Biological Elements in Drug-Drug Interaction Occurrence
Authors: Reza Ferdousi, Reza Safdari, Yadollah Omidi
Abstract:
Drug-drug interactions (DDIs) are a main cause of adverse drug reactions, and the functional and molecular complexity of drug behavior in the human body makes them hard to prevent and treat. With the aid of new technologies derived from mathematical and computational science, the DDI problem can be addressed with minimum cost and effort. Market basket analysis is known as a powerful method for identifying the co-occurrence of items and discovering patterns and frequencies of elements. In this research, we used market basket analysis to identify important bio-elements in DDI occurrence. For this, we collected all known DDIs from DrugBank. The obtained data were analyzed by the market basket analysis method. We investigated all drug-enzyme, drug-carrier, drug-transporter, and drug-target associations. To determine the importance of the extracted bio-elements, the extracted rules were evaluated in terms of confidence and support. Market basket analysis of over 45,000 known DDIs revealed more than 300 important rules that can be used to identify DDIs; the CYP450 enzyme family was the most frequently shared group of bio-elements. We applied the extracted rules to over 2,000,000 unknown drug pairs, which led to the discovery of more than 200,000 potential DDIs. Analysis of the underlying reasons behind the DDI phenomena can help to predict and prevent DDI occurrence. Ranking the extracted rules by their strength can be a supportive tool for predicting the outcome of an unknown DDI.
Keywords: drug-drug interaction, market basket analysis, rule discovery, important bio-elements
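The support/confidence evaluation described above is the core of market basket analysis. A minimal stdlib-only sketch for pairwise rules follows; the transaction contents, bio-element names, and thresholds are invented for illustration and are not the study's data:

```python
from itertools import combinations
from collections import Counter

def mine_rules(transactions, min_support=0.3, min_confidence=0.6):
    # each transaction: the set of bio-elements shared by one interacting pair
    n = len(transactions)
    item_counts = Counter()
    pair_counts = Counter()
    for t in transactions:
        for item in t:
            item_counts[item] += 1
        for a, b in combinations(sorted(t), 2):
            pair_counts[(a, b)] += 1
    rules = []
    for (a, b), count in pair_counts.items():
        support = count / n  # fraction of transactions containing both items
        if support < min_support:
            continue
        for ante, cons in ((a, b), (b, a)):
            confidence = count / item_counts[ante]  # P(cons | ante)
            if confidence >= min_confidence:
                rules.append((ante, cons, support, confidence))
    return rules
```

Running it over toy "baskets" of shared enzymes and transporters yields rules like CYP3A4 → P-gp, each tagged with its support and confidence, which is the ranking signal the abstract describes.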
Procedia PDF Downloads 310
14174 Synthesis of Belite Cements at Low Temperature from Silica Fume and Natural Commercial Zeolite
Authors: Tatiana L. Avalos-Rendon, Elias A. Pasten Chelala, Carlos J. Mendoza EScobedo, Ignacio A. Figueroa, Victor H. Lara, Luis M. Palacios-Romero
Abstract:
The cement industry is facing cost increments in energy supply, requirements for the reduction of CO₂ emissions, and an insufficient supply of raw materials of good quality. Given these environmental issues, the cement industry must change its consumption patterns and reduce CO₂ emissions to the atmosphere. This can be achieved by generating environmental consciousness, which encourages the use of industrial by-products and/or recycled materials for the production of cement, as well as alternative, environment-friendly methods of synthesis which reduce CO₂. Calcination is the conventional method for obtaining Portland cement clinker. This method consists of grinding and mixing the raw materials (limestone, clay, etc.) in an adequate dosage. The resulting mix has a clinkerization temperature of 1450 °C, at which the formation of the main component, alite (Ca₃SiO₅, C₃S), occurs. Considering that the energy required to produce C₃S is 1810 kJ kg⁻¹, the calcination method for obtaining clinker presents two major disadvantages: long thermal treatment and elevated synthesis temperatures, both of which cause high emissions of carbon dioxide (CO₂) to the atmosphere. Belite Portland clinker is characterized by a low content of calcium oxide (CaO), which diminishes the presence of alite and favors the formation of belite (β-Ca₂SiO₄, C₂S), so its production requires a reduced energy consumption (1350 kJ kg⁻¹) and releases less CO₂ to the atmosphere. Conventionally, β-Ca₂SiO₄ is synthesized by the calcination of calcium carbonate (CaCO₃) and silicon dioxide (SiO₂) through a solid-state reaction at temperatures greater than 1300 °C, and the resulting belite shows low hydraulic reactivity. Therefore, this study concerns a new, simple modified combustion method for the synthesis of two belite cements at low temperature (1000 °C). Silica fume, a by-product of the metallurgical industry, and commercial natural zeolite were utilized as raw materials.
These are considered low-cost materials and were utilized with no additional purification process. The belite cement properties were characterized by XRD, SEM, EDS, and BET techniques. The hydration capacity of the belite cements was calculated, while the mechanical strength was determined in ordinary Portland cement (PC) specimens with a 10% partial replacement by the belite cements obtained. Results showed the belite cements presented relatively high surface areas and, at early ages, mechanical strengths similar to those of alite cement and comparable to the strengths of belite cements obtained by other synthesis methods. The cements obtained in this work present good hydraulic reactivity properties.
Keywords: belite, silica fume, zeolite, hydraulic reactivity
Procedia PDF Downloads 347
14173 Frequency Analysis of Minimum Ecological Flow and Gage Height in Indus River Using Maximum Likelihood Estimation
Authors: Tasir Khan, Yejuan Wan, Kalim Ullah
Abstract:
Hydrological frequency analysis has been conducted to estimate the minimum flow elevation of the Indus River in Pakistan in order to protect the ecosystem. The maximum likelihood estimation (MLE) technique is used to estimate the best-fitted distribution for minimum ecological flows at nine stations of the Indus River in Pakistan. Four distributions commonly used in hydrological frequency analysis, the generalized extreme value (GEV) distribution, the generalized logistic (GLO) distribution, the generalized Pareto (GPA) distribution, and the Pearson type 3 (PE3) distribution, were fitted at all sites. The performance of these distributions was compared using goodness-of-fit tests, namely the Kolmogorov-Smirnov test, the Anderson-Darling test, and the chi-square test. The study concludes that GEV and GPA are the most suitable distributions and can be effectively applied to all the proposed sites. The quantiles are estimated for return periods from 5 to 1000 years using the MLE method, which is robust for larger sample sizes. The results of these analyses can be used in water resources research, including water quality management, designing irrigation systems, determining downstream flow requirements for hydropower, and assessing the impact of long-term drought on the country's aquatic system.
Keywords: minimum ecological flow, frequency distribution, Indus River, maximum likelihood estimation
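Once the distribution parameters have been estimated by MLE, return-period quantiles follow directly from the inverse cumulative distribution function. A stdlib-only sketch for the GEV family is below; the parameter values are invented for illustration, and note that for minimum-flow analysis practitioners often work with block minima or non-exceedance probabilities, which changes the probability plugged in:

```python
import math

def gev_quantile(mu, sigma, xi, T):
    # value exceeded on average once every T years: solves F(x) = 1 - 1/T
    # for GEV location mu, scale sigma, shape xi; xi -> 0 is the Gumbel case
    p = 1.0 - 1.0 / T
    y = -math.log(p)  # Gumbel reduced-variate argument
    if abs(xi) < 1e-12:
        return mu - sigma * math.log(y)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# illustrative fitted parameters and the return periods used in the study
quantiles = {T: gev_quantile(100.0, 20.0, 0.1, T) for T in (5, 50, 1000)}
```

The quantile grows monotonically with the return period, reproducing the 5-to-1000-year table the abstract describes in miniature.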
Procedia PDF Downloads 77
14172 Application of Neutron-Gamma Technologies for Soil Elemental Content Determination and Mapping
Authors: G. Yakubova, A. Kavetskiy, S. A. Prior, H. A. Torbert
Abstract:
In-situ soil carbon determination over large soil surface areas (several hectares) is required with regard to carbon sequestration and carbon credit issues. This capability is important for optimizing modern agricultural practices and enhancing soil science knowledge. Collecting and processing representative field soil cores for traditional laboratory chemical analysis is labor-intensive and time-consuming. The neutron-stimulated gamma analysis method can be used for in-situ measurements of primary elements in agricultural soils (e.g., Si, Al, O, C, Fe, and H). This non-destructive method can assess several elements in large soil volumes with no need for sample preparation. Neutron-gamma soil elemental analysis utilizes gamma rays emitted by different neutron-nuclei interactions. This has become possible due to the availability of commercial portable pulsed neutron generators, high-efficiency gamma detectors, reliable electronics, and measurement/data processing software, complemented by advances in state-of-the-art nuclear physics methods. In Pulsed Fast Thermal Neutron Analysis (PFTNA), soil irradiation is accomplished using a pulsed neutron flux, and gamma spectra acquisition occurs both during and between pulses. This allows the inelastic neutron scattering (INS) gamma spectrum to be separated from the thermal neutron capture (TNC) spectrum. Based on PFTNA, a mobile system for field-scale soil elemental determination (primarily of carbon) was developed and constructed. Our scanning methodology acquires data that can be directly used for creating soil elemental distribution maps (based on ArcGIS software) in a reasonable timeframe (~20-30 hectares per working day). The created maps are suitable for both agricultural purposes and carbon sequestration estimates.
The measurement system design, spectra acquisition process, strategy for acquiring field-scale carbon content data, and mapping of agricultural fields will be discussed.
Keywords: neutron gamma analysis, soil elemental content, carbon sequestration, carbon credit, soil gamma spectroscopy, portable neutron generators, ArcMap mapping
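The INS/TNC separation that PFTNA performs can be reduced, in idealized form, to a channel-by-channel gated subtraction: during the neutron pulse the detector records INS plus TNC gammas, while between pulses (once the fast neutrons have thermalized) essentially only TNC remains. A stdlib-only sketch; the gate-scaling factor and toy spectra are illustrative assumptions that ignore dead time and background:

```python
def separate_ins(in_pulse, between_pulse, alpha=1.0):
    # Subtract the between-pulse (TNC-dominated) spectrum, scaled by the
    # ratio of the two counting-gate lengths (alpha), from the in-pulse
    # (INS + TNC) spectrum, channel by channel; clamp at zero counts.
    return [max(a - alpha * b, 0.0) for a, b in zip(in_pulse, between_pulse)]

# toy two-channel spectra: channel 0 has an INS excess, channel 1 is pure TNC
ins_spectrum = separate_ins([10.0, 8.0], [4.0, 8.0], alpha=1.0)
```

The residual spectrum is what carries the carbon signature (e.g., the 4.44 MeV line from inelastic scattering on ¹²C) that the mapping system quantifies.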
Procedia PDF Downloads 90
14171 Design and Construction of a Home-Based, Patient-Led, Therapeutic, Post-Stroke Recovery System Using Iterative Learning Control
Authors: Marco Frieslaar, Bing Chu, Eric Rogers
Abstract:
Stroke is a devastating illness and the second biggest cause of death in the world (after heart disease). Where it does not kill, it leaves survivors with debilitating sensory and physical impairments that not only seriously harm their quality of life but also cause a high incidence of severe depression. It is widely accepted that early intervention is essential for recovery, but current rehabilitation techniques largely favor hospital-based therapies, which have restricted access, require expensive specialist equipment, and tend to side-step the emotional challenges. In addition, there is insufficient funding available to provide the long-term assistance that is required. As a consequence, recovery rates are poor. A relatively unexplored solution is to develop therapies that can be harnessed in the home and are built from technologies that already exist in everyday life. This would empower individuals to take control of their own improvement and provide choice in terms of when and where they feel best able to undertake their own healing. This research seeks to identify how effective post-stroke rehabilitation therapy can be applied to upper limb mobility within the physical context of a home rather than a hospital. This is being achieved through the design and construction of an automation scheme, based on iterative learning control and the Riener muscle model, that has the ability to adapt to the user, react to their level of fatigue, and provide tangible physical recovery. It utilizes a SMART phone and laptop to construct an iterative learning control (ILC) system that monitors upper arm movement in three dimensions as a series of exercises is undertaken. The equipment generates functional electrical stimulation to assist in muscle activation and thus improve directional accuracy.
In addition, it monitors speed, accuracy, areas of motion weakness, and similar parameters to create a performance index that can be compared over time and extrapolated to establish an independent and objective assessment scheme, plus an approximate estimate of the predicted final outcome. To further extend its assessment capabilities, nerve conduction velocity readings are taken by the software between the shoulder and hand muscles. These are used to measure the speed of neural signal transfer along the arm so that, over time, an online indication of regeneration levels can be obtained. This will show whether sufficient training intensity is being achieved even before perceivable movement dexterity is observed. The device also provides the option to connect to other users via the internet, so that patients can avoid feelings of isolation and can undertake movement exercises together with others in a similar position. This should benefit not only participation in rehabilitation but also build a potential emotional support network. It is intended that this approach will extend the availability of stroke recovery options, enable ease of access at a low cost, reduce susceptibility to depression, and, through these endeavors, enhance the overall recovery success rate.
Keywords: home-based therapy, iterative learning control, Riener muscle model, SMART phone, stroke rehabilitation
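The iterative learning control at the heart of the scheme exploits the fact that each exercise is a repeated trial: the stimulation input is corrected trial by trial using the previous trial's tracking error. The sketch below is a minimal P-type ILC on an invented first-order plant standing in for limb dynamics, not the Riener muscle model itself; the gain and dynamics are illustrative assumptions:

```python
def run_trial(u, a=0.5):
    # toy first-order plant standing in for limb/muscle dynamics:
    # y[t] = a * y[t-1] + u[t]
    y, prev = [], 0.0
    for ut in u:
        prev = a * prev + ut
        y.append(prev)
    return y

def ilc_tracking_error(reference, iterations=12, gain=1.0):
    # P-type ILC update applied between trials: u_{k+1}(t) = u_k(t) + gain * e_k(t)
    # With a unit first Markov parameter, gain = 1 even gives finite-time
    # convergence: the error operator (I - P) is nilpotent.
    u = [0.0] * len(reference)
    for _ in range(iterations):
        e = [r - y for r, y in zip(reference, run_trial(u))]
        u = [ut + gain * et for ut, et in zip(u, e)]
    return max(abs(r - y) for r, y in zip(reference, run_trial(u)))
```

After enough repetitions the tracking error collapses toward zero, which is exactly the trial-to-trial improvement the rehabilitation scheme relies on.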
Procedia PDF Downloads 264
14170 Strategies for Drought Adaptation and Mitigation via Wastewater Management
Authors: Simrat Kaur, Fatema Diwan, Brad Reddersen
Abstract:
The unsustainable and injudicious use of natural renewable resources beyond the self-replenishment limits of our planet has proved catastrophic. Most of the Earth's resources, including land, water, minerals, and biodiversity, have been overexploited. Owing to this, there is a steep rise in global natural calamities of contrasting nature, such as torrential rains, storms, heat waves, rising sea levels, and megadroughts. These are all interconnected through common elements, namely oceanic currents and the land's green cover. Deforestation fueled by the 'economic elites', or global players, has already cleared massive forests and ecological biomes in every region of the globe, including the Amazon. These were natural carbon sinks that had been performing CO₂ sequestration for millions of years. The forest biomes have been turned into mono-cultivation farms to produce feedstock crops such as soybean, maize, and sugarcane, which are among the biggest greenhouse gas emitters. Such unsustainable agricultural practices only provide feedstock for livestock and food processing industries with huge carbon and water footprints, the two main factors that stand in a 'cause and effect' relationship in the context of climate change. In contrast to organic and sustainable farming, mono-cultivation practices that use chemicals to produce food, fuel, and feedstock deprive the soil of its fertility, abstract surface and ground waters beyond the limits of replenishment, emit greenhouse gases, and destroy biodiversity. There are numerous cases across the planet where, due to overuse, the levels of surface water reservoirs, such as Lake Mead in the southwestern USA, and of ground water, such as in Punjab, India, have deeply shrunk.
Unlike the rain-fed food production system on which the poor communities of the world rely, blue-water (surface and ground water) dependent mono-cropping for industrial and processed food creates a water deficit that puts the burden on domestic users. Excessive abstraction of both surface and ground waters for high-water-demand feedstock (soybean, maize, sugarcane), cereal crops (wheat, rice), and cash crops (cotton) has a dual and synergistic impact on global greenhouse gas emissions and the prevalence of megadroughts. Both these factors have elevated global temperatures, which has caused cascading events such as soil water deficits, flash fires, and the unprecedented burning of woods, creating megafires in multiple regions, namely the USA, South America, Europe, and Australia. Therefore, it is imperative to reduce the green and blue water footprints of the agricultural and industrial sectors through the recycling of black and grey waters. This paper explores various opportunities for the successful implementation of wastewater management for drought preparedness in high-risk communities.
Keywords: wastewater, drought, biodiversity, water footprint, nutrient recovery, algae
Procedia PDF Downloads 100
14169 Optimizing Design Works in Construction Consultant Company: A Knowledge-Based Application
Authors: Phan Nghiem Vu, Le Tuan Vu, Ta Quang Tai
Abstract:
The optimal management of construction design during the execution of a construction project is a key factor in achieving high productivity and customer satisfaction; however, this management process is sometimes carried out without the care and systematic method that it deserves, bringing negative consequences. This study proposes a knowledge management (KM) approach that enables the intelligent use of experienced and acknowledged engineers to improve the management of construction design works for a project, and then proposes and describes a knowledge-based application to support this decision-making process. To define and design the system, semi-structured interviews were conducted within five construction consulting organizations with the purpose of studying the way the optimizing process is implemented in practice and the knowledge that supports it. A knowledge-based system for optimizing construction design works (OCDW) was developed and then validated with construction experts. The OCDW was regarded as a valuable tool for optimizing construction design works, supporting organizations in generating a corporate memory on this issue and reducing both the reliance on individual knowledge and the subjectivity of the decision-making process. The benefits provided by the performance support system include reduced costs and time, improved product design quality, satisfied customer requirements, and a strengthened organizational brand.
Keywords: optimizing construction design work, construction consultant organization, knowledge management, knowledge-based application
Procedia PDF Downloads 129
14168 Urban Art as an Identity Branding of Kampong Ketandan Surabaya
Authors: R. A. Retno Hastijanti, David Agus Sagita, Arum Lintang Cahyani, Tectona Radike, Andreas Suluh Putra
Abstract:
Surabaya is one of the oldest cities in Indonesia, and most of its old quarter consists of ancient kampongs. Ketandan is one such ancient kampong in the center of Surabaya, surrounded by a thriving trade area. These conditions make the kampong vulnerable to degradation of its environmental quality and to losing its cultural identity, as its norms and values are eroded by the rapid development of the surrounding area. Through kampong conservation programs, the Surabaya city government established Ketandan as one of its urban heritage sites. To achieve the ideal condition of urban heritage, public participation is required, and one thing that can motivate the participation of the Kampong Ketandan community is to rediscover the identity of Kampong Ketandan. This research aims to explore an appropriate method for rediscovering that identity. Through qualitative research methods, based on observations and focus group discussions, it was concluded that a mural mentoring program was the method best accepted by the kampong community for rediscovering their identity. Murals, as one form of urban art, are able to motivate the kampong community to express themselves and give their kampong an icon. The benefits of this research are to provide input to the city government and the private sector on preserving urban heritage and, moreover, on transforming urban heritage into productive space in urban areas in order to enhance city revenues.
Keywords: Kampong, Kampong Ketandan, mural, Surabaya, urban, urban heritage, urban art
Procedia PDF Downloads 332
14167 Modal Analysis of FGM Plates Using Finite Element Method
Authors: S. J. Shahidzadeh Tabatabaei, A. M. Fattahi
Abstract:
Modal analysis of an FGM plate containing a ceramic phase of Al2O3 and a metal phase of stainless steel 304 was performed using ABAQUS, with the assumptions that the material has elastic mechanical behavior and that its Young's modulus and density vary in the thickness direction. For this purpose, a subroutine was written in FORTRAN and linked with ABAQUS. First, a simulation was performed in accordance with another researcher's model, and after comparing the obtained results, the accuracy of the present study was verified. The obtained results for natural frequency and mode shapes indicate good performance of the user-written subroutine as well as of the FEM model used in the present study. After verification of the results, the effect of the clamping condition and the material type (i.e., the parameter n) was investigated. In this respect, finite element analysis was carried out in the fully clamped condition for different values of n. The results indicate that the natural frequency decreases with increasing n, since with increasing n the amount of the ceramic phase in the FGM plate decreases while the amount of the metal phase increases, reducing the plate stiffness and hence the natural frequency, as the Young's modulus of Al2O3 (380 GPa) is higher than that of stainless steel 304 (207 GPa).
Keywords: FGM plates, modal analysis, natural frequency, finite element method
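The through-thickness grading described above is commonly modeled with a power law in the ceramic volume fraction; the rule V_c = (z/h + 1/2)^n below is that standard assumption (the specific form used in the FORTRAN subroutine is not stated in the abstract), combined with the moduli quoted there:

```python
def fgm_property(z, h, p_ceramic, p_metal, n):
    # power-law grading: ceramic volume fraction V_c = (z/h + 1/2)^n,
    # with z measured from the mid-plane, z in [-h/2, +h/2]
    vc = (z / h + 0.5) ** n
    return (p_ceramic - p_metal) * vc + p_metal

E_CERAMIC, E_METAL = 380.0e9, 207.0e9  # Al2O3 and stainless steel 304 (Pa)

def mean_modulus(n, h=1.0, samples=1000):
    # thickness-averaged Young's modulus; it decreases as n grows because
    # the ceramic fraction (and hence the stiffness) shrinks, consistent
    # with the reported drop in natural frequency with increasing n
    zs = [-h / 2 + h * (i + 0.5) / samples for i in range(samples)]
    return sum(fgm_property(z, h, E_CERAMIC, E_METAL, n) for z in zs) / samples
```

At the top surface (z = +h/2) the property is fully ceramic and at the bottom (z = -h/2) fully metallic, with the closed-form average (E_c - E_m)/(n + 1) + E_m in between.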
Procedia PDF Downloads 342
14166 Prevalence and the Results of the Czech Nationwide Survey and Personality Traits of Adolescence Playing Computer Games
Authors: Jaroslava Sucha, Martin Dolejs, Helena Pipova, Panajotis Cakirpaloglu
Abstract:
The paper introduces a research project focused on evaluating the level of pathological engagement with computer or video game playing (including any games played on a screen, such as a mobile phone or tablet). The study involves a representative sample of Czech adolescents between ages 11 and 19. This poster presents the psychometric indicators of a new psychological assessment method (mean, standard deviation, reliability, validity), which can detect an acceptable level of game playing and, at the same time, detect and describe a level of gaming that may be potentially risky. The prevalence of risky computer game playing among Czech adolescents aged 11 to 19 will be reported. The research study also aims to describe the personality profile of problematic players of digital games. The research encompasses risky behaviour, aggression, level of self-esteem, impulsivity, anxiety, and depression. The contribution introduces a new test method for the assessment of pathological computer game playing. The research will provide the first screening information on computer game playing by adolescents aged 11-19 in the Czech Republic. The results clarify what relationship exists between playing computer games and selected personality characteristics, describing the personality of gamers who fall into the category of 'pathological computer game playing'.
Keywords: adolescence, computer games, personality traits, risk behaviour
Procedia PDF Downloads 239
14165 Synthesis of Amorphous Nanosilica Anode Material from Philippine Waste Rice Hull for Lithium Battery Application
Authors: Emie A. Salamangkit-Mirasol, Rinlee Butch M. Cervera
Abstract:
Rice hull or rice husk (RH) is an agricultural waste obtained from milling rice grains. Since RH has no commercial value and is difficult to use in agriculture, its volume is often reduced through open field burning, which is an environmental hazard. In this study, amorphous nanosilica from Philippine waste RH was prepared via an acid precipitation method. The synthesized samples were fully characterized for their microstructural properties. The X-ray diffraction pattern reveals that the structure of the prepared sample is amorphous in nature, while the Fourier transform infrared spectrum shows the different vibration bands of the synthesized sample. Scanning electron microscopy (SEM) and particle size analysis (PSA) confirmed the presence of agglomerated silica particles. On the other hand, transmission electron microscopy (TEM) revealed an amorphous sample with grain sizes in the range of about 5 to 20 nanometers and about 95% purity according to EDS analyses. The elemental mapping also suggests that leaching of the rice hull ash effectively removed metallic impurities such as potassium. Hence, amorphous nanosilica was successfully prepared via a low-cost acid precipitation method from Philippine waste rice hull. In addition, the initial electrode performance of the synthesized samples as an anode material in lithium batteries has been investigated.
Keywords: agricultural waste, anode material, nanosilica, rice hull
Procedia PDF Downloads 283
14164 Staphylococcus argenteus: An Emerging Subclinical Bovine Mastitis Pathogen in Thailand
Authors: Natapol Pumipuntu
Abstract:
Staphylococcus argenteus is an emerging species of the S. aureus complex. It is generally misidentified as S. aureus by standard techniques. S. argenteus is possibly emerging in both humans and animals, and its worldwide distribution appears to be increasing. The objective of this study was to differentiate and identify S. argenteus among S. aureus isolates collected from milk samples of subclinical bovine mastitis cases in Maha Sarakham province, northeastern Thailand. Twenty-one isolates of S. aureus, confirmed by conventional methods and an immuno-agglutination method, were analyzed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) and multilocus sequence typing (MLST). The MALDI-TOF MS and MLST results confirmed 6 of 42 isolates as S. argenteus and 36 as S. aureus. This study indicates that identification and classification by MALDI-TOF MS and MLST can accurately differentiate the emerging species S. argenteus from the S. aureus complex, within which it is usually misdiagnosed. In addition, identification of S. argenteus appears to be very limited, despite the fact that it may be an important causative pathogen in bovine mastitis as well as a pathogenic bacterium in food and milk. It is therefore necessary for both bovine medicine and veterinary public health to emphasize and recognize this pathogen as an emerging staphylococcal disease, and further study of S. argenteus infection is needed.
Keywords: Staphylococcus argenteus, subclinical bovine mastitis, Staphylococcus aureus complex, mass spectrometry, MLST
Procedia PDF Downloads 151
14163 Comprehensive Machine Learning-Based Glucose Sensing from Near-Infrared Spectra
Authors: Bitewulign Mekonnen
Abstract:
Context: This scientific paper focuses on the use of near-infrared (NIR) spectroscopy to determine glucose concentration in aqueous solutions accurately and rapidly. The study compares six different machine learning methods for predicting glucose concentration and also explores the development of a deep learning model for classifying NIR spectra. The objective is to optimize the detection model and improve the accuracy of glucose prediction. This research is important because it provides a comprehensive analysis of various machine learning techniques for estimating aqueous glucose concentrations. Research Aim: The aim of this study is to compare and evaluate different machine learning methods for predicting glucose concentration from NIR spectra. Additionally, the study aims to develop and assess a deep learning model for classifying NIR spectra. Methodology: The research methodology involves the use of machine learning and deep learning techniques. Six machine learning regression models, including support vector machine regression (SVMR), partial least squares regression, extra tree regression (ETR), random forest regression, extreme gradient boosting, and principal component analysis-neural network (PCA-NN), are employed to predict glucose concentration. The NIR spectra data is randomly divided into train and test sets, and the process is repeated ten times to increase generalization ability. In addition, a convolutional neural network is developed for classifying NIR spectra. Findings: The study reveals that the SVMR, ETR, and PCA-NN models exhibit excellent performance in predicting glucose concentration, with correlation coefficients (R) > 0.99 and determination coefficients (R²) > 0.985. The deep learning model achieves high macro-averaging scores for precision, recall, and F1-measure. These findings demonstrate the effectiveness of machine learning and deep learning methods in optimizing the detection model and improving glucose prediction accuracy.
Theoretical Importance: This research contributes to the field by providing a comprehensive analysis of various machine learning techniques for estimating glucose concentrations from NIR spectra. It also explores the use of deep learning for the classification of otherwise indistinguishable NIR spectra. The findings highlight the potential of machine learning and deep learning in enhancing the prediction accuracy of glucose-relevant features. Data Collection and Analysis Procedures: The NIR spectra and corresponding reference glucose concentrations are measured in increments of 20 mg/dl. The data is randomly divided into train and test sets, and the models are evaluated using regression analysis and classification metrics. The performance of each model is assessed based on correlation coefficients, determination coefficients, precision, recall, and F1-measure. Question Addressed: The study addresses the question of whether machine learning and deep learning methods can optimize the detection model and improve the accuracy of glucose prediction from NIR spectra. Conclusion: The research demonstrates that machine learning and deep learning methods can effectively predict glucose concentration from NIR spectra. The SVMR, ETR, and PCA-NN models exhibit superior performance, while the deep learning model achieves high classification scores. These findings suggest that machine learning and deep learning techniques can be used to improve the prediction accuracy of glucose-relevant features. Further research is needed to explore their clinical utility in analyzing complex matrices, such as blood glucose levels.
Keywords: machine learning, signal processing, near-infrared spectroscopy, support vector machine, neural network
Procedia PDF Downloads 94
14162 Characteristics of Bio-hybrid Hydrogel Materials with Prolonged Release of the Model Active Substance as Potential Wound Dressings
Authors: Katarzyna Bialik-Wąs, Klaudia Pluta, Dagmara Malina, Małgorzata Miastkowska
Abstract:
In recent years, biocompatible hydrogels have been used more and more in medical applications, especially as modern dressings and drug delivery systems. The main goal of this research was to characterize bio-hybrid hydrogel materials incorporating a nanocarrier-drug system, which enables gradual, prolonged release for up to 7 days. The use of such a combination provides protection against mechanical damage and adequate hydration. The proposed bio-hybrid hydrogels are characterized by transparency, biocompatibility, good mechanical strength, and a dual release system that allows gradual delivery of the active substance for up to 7 days. Bio-hybrid hydrogels based on sodium alginate (SA), poly(vinyl alcohol) (PVA), glycerine, and Aloe vera solution (AV) were obtained through a chemical crosslinking method using poly(ethylene glycol) diacrylate as a crosslinking agent. Additionally, a nanocarrier-drug system was incorporated into the SA/PVA/AV hydrogel matrix. Here, studies focused on the release profiles of active substances from the bio-hybrid hydrogels using the USP4 method (DZF II Flow-Through System, Erweka GmbH, Langen, Germany). The equipment incorporated seven in-line flow-through diffusion cells. The membrane was placed over a support with an orifice of 1.5 cm in diameter (diffusional area, 1.766 cm²). All the cells were placed in a cell warmer connected to the Erweka heater DH 2000i and the Erweka piston pump HKP 720. The piston pump transports the receptor fluid via seven channels to the flow-through cells and automatically adapts the setting of the flow rate. All volumes were measured gravimetrically by filling the chambers with Milli-Q water and assuming a density of 1 g/ml. All determinations were made in triplicate for each cell.
The release study of the model active substance was carried out using a regenerated cellulose membrane (Spectra/Por® Dialysis Membrane, MWCO 6-8,000, Carl Roth®). These tests were conducted in buffer solution (PBS, pH 7.4). A receptor fluid flow rate of about 4 ml/min was selected. The experiments were carried out for 7 days at a temperature of 37°C. The released concentration of the model drug in the receptor solution was analyzed using UV-Vis spectroscopy (Perkin Elmer). Additionally, the following properties of the modified materials were studied: physicochemical, structural (FT-IR analysis), and morphological (SEM analysis). Finally, in vitro cytotoxicity tests were conducted. The obtained results showed that the dual release system allows gradual and prolonged delivery of the active substances for up to 7 days.
Keywords: wound dressings, SA/PVA hydrogels, nanocarrier-drug system, USP4 method
Procedia PDF Downloads 148
14161 Experimental Investigation and Analysis of Wear Parameters on Al/Sic/Gr: Metal Matrix Hybrid Composite by Taguchi Method
Authors: Rachit Marwaha, Rahul Dev Gupta, Vivek Jain, Krishan Kant Sharma
Abstract:
Metal matrix hybrid composites (MMHCs) are now gaining usage in aerospace, automotive, and other industries because of their inherent properties, such as high strength-to-weight ratio, hardness and wear resistance, good creep behaviour, light weight, design flexibility, and low wear rate. An Al alloy base matrix reinforced with silicon carbide (10%) and graphite (5%) particles was fabricated by the stir casting process. The wear and frictional properties of the metal matrix hybrid composites were studied by performing dry sliding wear tests on a pin-on-disc wear test apparatus. Experiments were conducted based on the plan of experiments generated through Taguchi's technique, and an L9 orthogonal array was selected for analysis of the data. The influence of applied load, sliding speed, and track diameter on wear rate, as well as on coefficient of friction during the wearing process, was investigated using ANOVA. The smaller-the-better characteristic was chosen as the objective of the model for analysing dry sliding wear resistance. Results show that track diameter has the highest influence, followed by load and sliding speed.
Keywords: Taguchi method, orthogonal array, ANOVA, metal matrix hybrid composites
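A Taguchi analysis of this kind can be sketched as follows; the L9 array is the standard one, but the wear responses and the factor assignment below are hypothetical illustrations, not the paper's measurements:

```python
# Hedged sketch of a Taguchi L9 analysis with the "smaller-the-better"
# signal-to-noise (S/N) ratio; the wear data are illustrative only.
import math

# Standard L9(3^3) orthogonal array; columns assumed to be
# load, sliding speed, and track diameter (each at 3 levels)
L9 = [
    (1, 1, 1), (1, 2, 2), (1, 3, 3),
    (2, 1, 2), (2, 2, 3), (2, 3, 1),
    (3, 1, 3), (3, 2, 1), (3, 3, 2),
]

# Hypothetical wear-rate responses for the nine runs (arbitrary units)
wear = [2.1, 2.4, 3.0, 2.6, 3.1, 2.2, 3.4, 2.5, 2.9]

def sn_smaller_better(y):
    """Smaller-the-better S/N ratio, -10*log10(y^2), for one response."""
    return -10.0 * math.log10(y * y)

sn = [sn_smaller_better(y) for y in wear]

def main_effects(factor_index):
    """Mean S/N ratio at each level of the chosen factor."""
    effects = {}
    for level in (1, 2, 3):
        vals = [sn[i] for i, run in enumerate(L9) if run[factor_index] == level]
        effects[level] = sum(vals) / len(vals)
    return effects

load_effects = main_effects(0)
# A larger spread (delta) across levels indicates a stronger influence
delta = max(load_effects.values()) - min(load_effects.values())
```

Ranking the deltas of the three factors is what yields the kind of influence ordering (track diameter, then load, then sliding speed) reported in the abstract.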
Procedia PDF Downloads 329
14160 Partial Least Square Regression for High-Dimentional and High-Correlated Data
Authors: Mohammed Abdullah Alshahrani
Abstract:
The research focuses on investigating the use of partial least squares (PLS) methodology for addressing challenges associated with high-dimensional correlated data. Recent technological advancements have led to experiments producing data characterized by a large number of variables compared to observations, with substantial inter-variable correlations. Such data patterns are common in chemometrics, where near-infrared (NIR) spectrometer calibrations record chemical absorbance levels across hundreds of wavelengths, and in genomics, where thousands of genomic regions' copy number alterations (CNA) are recorded from cancer patients. PLS serves as a widely used method for analyzing high-dimensional data, functioning as a regression tool in chemometrics and a classification method in genomics. It handles data complexity by creating latent variables (components) from original variables. However, applying PLS can present challenges. The study investigates key areas to address these challenges, including unifying interpretations across three main PLS algorithms and exploring unusual negative shrinkage factors encountered during model fitting. The research presents an alternative approach to addressing the interpretation challenge of predictor weights associated with PLS. Sparse estimation of predictor weights is employed using a penalty function combining a lasso penalty for sparsity and a Cauchy distribution-based penalty to account for variable dependencies. The results demonstrate sparse and grouped weight estimates, aiding interpretation and prediction tasks in genomic data analysis. High-dimensional data scenarios, where predictors outnumber observations, are common in regression analysis applications. Ordinary least squares regression (OLS), the standard method, performs inadequately with high-dimensional and highly correlated data. 
Copy number alterations (CNA) in key genes have been linked to disease phenotypes, highlighting the importance of accurate classification of gene expression data in bioinformatics and biology using regularized methods like PLS for regression and classification.
Keywords: partial least square regression, genetics data, negative filter factors, high dimensional data, high correlated data
Procedia PDF Downloads 49
14159 Natural Frequency Analysis of Spinning Functionally Graded Cylindrical Shells Subjected to Thermal Loads
Authors: Esmaeil Bahmyari
Abstract:
The natural frequency analysis of functionally graded (FG) rotating cylindrical shells subjected to thermal loads is studied based on three-dimensional elasticity theory. The temperature-dependent material properties are graded in the thickness direction according to a simple power law distribution. The governing equations and the appropriate boundary conditions, which include the effects of initial thermal stresses, are derived employing Hamilton's principle. The initial thermo-mechanical stresses are obtained from the solution of the thermo-elastic equilibrium equations. As an efficient and accurate numerical tool, the differential quadrature method (DQM) is adopted to solve the thermo-elastic equilibrium equations and the free vibration equations, from which the natural frequencies are obtained. The high accuracy of the method is demonstrated by comparison studies with existing solutions in the literature. Finally, parametric studies are performed to demonstrate the effects of boundary conditions, temperature rise, material graded index, thickness-to-length ratio, and aspect ratio on the natural frequency of the rotating cylindrical shells.
Keywords: free vibration, DQM, elasticity theory, FG shell, rotating cylindrical shell
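The DQM machinery itself can be illustrated on a toy problem; the sketch below builds first-derivative weighting coefficients on a Chebyshev-Gauss-Lobatto grid (the explicit Lagrange-interpolation formulas of Quan-Chang/Shu) and checks them on a cubic, without attempting the paper's shell equations:

```python
# Minimal differential quadrature (DQ) sketch: first-derivative weighting
# coefficients on a Chebyshev-Gauss-Lobatto grid, verified on f(x) = x^3.
import numpy as np

N = 11
i = np.arange(N)
x = 0.5 * (1.0 - np.cos(np.pi * i / (N - 1)))   # CGL points on [0, 1]

# M(x_k) = product over j != k of (x_k - x_j), from Lagrange interpolation
M = np.array([np.prod([x[k] - x[j] for j in range(N) if j != k]) for k in range(N)])

A = np.zeros((N, N))                             # first-derivative DQ matrix
for r in range(N):
    for c in range(N):
        if r != c:
            A[r, c] = M[r] / ((x[r] - x[c]) * M[c])
    A[r, r] = -A[r].sum()                        # row sums of a derivative matrix are 0

f = x ** 3
df_dq = A @ f                                    # DQ approximation of f'(x) = 3x^2
df_exact = 3.0 * x ** 2
max_err = np.max(np.abs(df_dq - df_exact))
```

Because the DQ matrix is exact for polynomials up to degree N-1, the cubic is differentiated to machine precision; in the paper's setting the same matrices discretize the equilibrium and free-vibration equations, turning them into an algebraic eigenvalue problem.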
Procedia PDF Downloads 84
14158 Estimation of Carbon Losses in Rice: Wheat Cropping System of Punjab, Pakistan
Authors: Saeed Qaisrani
Abstract:
The study was conducted to observe carbon and nutrient losses from the burning of rice residues in a rice-wheat cropping system. The rice crop was harvested to conduct the experiment in a randomized complete block design (RCBD) with 4 replications and a net plot size of 10 m x 20 m. Rice stubbles were managed by two methods: incorporation and burning of rice residues. Soil samples were taken to a depth of 30 cm before sowing and after harvesting of wheat. Wheat was sown after rice harvest using three practices, i.e., conventional tillage, minimum tillage, and zero tillage, to identify the best tillage practice. Laboratory and field experiments were conducted on wheat to assess the best tillage practice and residue management method, together with an estimation of carbon losses. Data on the following parameters were recorded to assess wheat quality with a view to ensuring food security in the region: establishment count, plant height, spike length, number of grains per spike, biological yield, fat content, carbohydrate content, protein content, and harvest index. Soil physico-chemical analyses, i.e., pH, electrical conductivity, organic matter, nitrogen, phosphorus, potassium, and carbon, were done in a soil fertility laboratory. Substantial effects were found on growth, yield, and related parameters of the wheat crop. The collected data were examined statistically, with an economic analysis to estimate the cost-benefit ratio of the different tillage techniques and residue management practices. The results showed that zero tillage has positive impacts on the growth, yield, and quality of wheat and is, moreover, a cost-effective methodology. Similarly, incorporation is a suitable and beneficial method for the soil, since it returns more nutrients and reduces the need for fertilizers. Burning of rice stubbles has negative impacts, including air pollution, nutrient loss, death of soil microbes, and carbon loss. Zero tillage technology is therefore recommended to reduce carbon losses while supporting food security in Pakistan.
Keywords: agricultural agronomy, food security, carbon sequestration, rice-wheat cropping system
Procedia PDF Downloads 277
14157 Statistical Data Analysis of Migration Impact on the Spread of HIV Epidemic Model Using Markov Monte Carlo Method
Authors: Ofosuhene O. Apenteng, Noor Azina Ismail
Abstract:
Over the last several years, concern has developed over how to minimize the spread of the HIV/AIDS epidemic in many countries. The AIDS epidemic has tremendously stimulated the development of mathematical models of infectious diseases, and the transmission dynamics of HIV infection that eventually develops into AIDS have played a pivotal role in the building of such models. Since the initial HIV and AIDS models introduced in the 1980s, various improvements have been made in how HIV/AIDS frameworks are modelled. In this paper, we present the impact of migration on the spread of HIV/AIDS. The epidemic model is formulated as a system of nonlinear differential equations to supplement the statistical approach. The model is calibrated using HIV incidence data from Malaysia between 1986 and 2011. Bayesian inference based on Markov chain Monte Carlo is used to validate the model by fitting it to the data and to estimate the unknown parameters of the model. The results suggest that migrants staying for a long time contribute to the spread of HIV. The model also indicates that susceptible individuals become infected and move to the HIV compartment at a rate greater than the removal rate from the HIV compartment to the AIDS compartment. The disease-free steady state is unstable, since the basic reproduction number is 1.627309, which is greater than 1. This is a big concern and not a good indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium.
Keywords: epidemic model, HIV, MCMC, parameter estimation
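A toy version of such a migration-augmented compartmental model can be sketched as follows; the three-compartment structure and all parameter values are illustrative assumptions, not the fitted Malaysian model:

```python
# Toy susceptible -> HIV -> AIDS compartmental model with a constant
# migrant inflow, integrated by forward Euler. The structure and the
# parameter values are illustrative, not the paper's calibrated model.
beta = 0.0004   # transmission rate (illustrative)
gamma = 0.05    # HIV -> AIDS progression rate
mu = 0.02       # background removal rate
m = 5.0         # net migrant inflow into the susceptible pool per unit time

# Basic reproduction number at the disease-free state S* = m/mu:
# the disease-free equilibrium is unstable whenever R0 > 1
R0 = beta * (m / mu) / (gamma + mu)

S, H, A = 1000.0, 10.0, 0.0
dt = 0.01
for _ in range(20000):                  # integrate over 200 time units
    new_inf = beta * S * H              # mass-action incidence
    dS = m - new_inf - mu * S
    dH = new_inf - (gamma + mu) * H
    dA = gamma * H - mu * A
    S += dS * dt
    H += dH * dt
    A += dA * dt
```

With these illustrative numbers R0 is about 1.43, so the infection persists, mirroring the qualitative situation in the abstract where R0 = 1.627309 makes the disease-free state unstable.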
Procedia PDF Downloads 601
14156 Formulation and Evaluation of Lisinopril Microspheres for Nasal Delivery
Authors: S. S. Patil, R. M. Mhetre, S. V. Patil
Abstract:
Lisinopril is an angiotensin converting enzyme inhibitor used in the treatment of hypertension and heart failure, in prophylactic treatment after myocardial infarction, and in diabetic nephropathy. However, it is very poorly absorbed from the gastro-intestinal tract. Intranasal administration is an ideal alternative to the parenteral route for systemic drug delivery, and formulating a multiparticulate system with mucoadhesive polymers provides a significant increase in nasal residence time. The aim of the present approach was to overcome the drawbacks of the conventional dosage forms of lisinopril by formulating intranasal microspheres with Carbopol 974P NF and HPMC K4M, along with the film-forming polymer ethyl cellulose. The microspheres were prepared by the emulsion solvent evaporation method. The prepared microspheres were characterized for encapsulation efficiency, drug loading, particle size, surface morphology, degree of swelling, ex vivo mucoadhesion, drug release, and ex vivo diffusion. All formulations showed entrapment efficiencies from 80% to more than 95%, mucoadhesion of more than 80%, and drug release of up to 90%. Ex vivo studies revealed improved bioavailability of the drug compared to oral administration. Both in vitro and in vivo studies conclude that the combined Carbopol- and HPMC-based microspheres show better results than microspheres based on Carbopol alone for the delivery of lisinopril.
Keywords: microspheres, lisinopril, nasal delivery, solvent evaporation method
Procedia PDF Downloads 528
14155 Response of First Bachelor of Medicine, Bachelor of Surgery (MBBS) Students to Integrated Learning Program
Authors: Raveendranath Veeramani, Parkash Chand, H. Y. Suma, A. Umamageswari
Abstract:
Background and Aims: The aim of this study was to evaluate students' perception of an Integrated Learning Program (ILP). Settings and Design: A questionnaire was used to survey and evaluate the perceptions of first-year MBBS students at the Department of Anatomy at our medical college in India. Materials and Methods: The first-year MBBS students were involved in an ILP on the liver and extrahepatic biliary apparatus, integrating the Departments of Anatomy, Biochemistry, and Hepatobiliary Surgery. The evaluation of the ILP was done with two sets of short questionnaires of ten items each, using a five-point Likert grading scale. The data involved both the students' responses and their grading. Results: A majority of students felt that the ILP was better than the traditional lecture method of teaching. Compared to traditional teaching methods, the integrated teaching method was better at fulfilling learning objectives (128 students, 83%), enabled better understanding (94% of students), was more interesting (140 students, 90%), ensured that they could score better in exams (115 students, 77%), and involved greater interaction (100 students, 66%). Most of the students (142 students, 95%) opined that more such sessions should be organized in the future. Conclusions: Responses from students show that integrated learning sessions should be incorporated even in the first phase of MBBS for selected topics, so as to create interest in the medical sciences at the entry level and to make students understand the importance of basic science.
Keywords: integrated learning, students response, vertical integration, horizontal integration
Procedia PDF Downloads 201
14154 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution
Authors: Najrullah Khan, Athar Ali Khan
Abstract:
The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS codes have been provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modelling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independent Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms, the trust region method is found to be the best. In this paper, the TPGE model is also used to analyze the lifetime data in the Bayesian paradigm. Results are evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
Keywords: Bayesian Inference, JAGS, Laplace Approximation, LaplacesDemon, posterior, R Software, simulation
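The Bayesian treatment of censored survival data can be sketched with a random-walk Metropolis sampler; for brevity, a simple exponential lifetime model stands in for the TPGE distribution, and the data and prior below are illustrative, not the paper's:

```python
# Minimal Metropolis sketch for right-censored survival data, using an
# exponential lifetime model as a stand-in for the TPGE distribution
# (whose density is omitted here); data and prior are illustrative.
import math
import random

random.seed(42)

times = [0.5, 1.2, 2.3, 0.9, 3.1, 1.7, 0.4, 2.8]
event = [1, 1, 0, 1, 0, 1, 1, 1]        # 0 = right-censored observation

def log_post(lam):
    """Log posterior: exponential likelihood with censoring + Gamma(1,1) prior."""
    if lam <= 0:
        return -math.inf
    ll = 0.0
    for t, d in zip(times, event):
        # density contribution if the event was observed, survival if censored
        ll += d * math.log(lam) - lam * t
    return ll - lam                      # log Gamma(1,1) prior, up to a constant

lam, samples = 1.0, []
for step in range(5000):
    prop = lam + random.gauss(0.0, 0.3)  # symmetric random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(lam):
        lam = prop                        # accept; otherwise keep current value
    if step >= 1000:                      # discard burn-in
        samples.append(lam)

post_mean = sum(samples) / len(samples)
```

For this conjugate stand-in the posterior is Gamma(1 + events, 1 + total time), so the sampler's posterior mean can be checked against the closed form, which is the same kind of sanity check the paper performs between simulation and analytic approximation.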
Procedia PDF Downloads 535
14153 Terrorism and Sustainable Tourism Development
Authors: P. Okoro Ugo Chigozie, P. A. Igbojekwe, E. N. Ukabuilu
Abstract:
Tourism and terrorism experiences are best viewed as dynamic, complex systems with extremely diverse consequences for any nation's economy. Tourism is one of the biggest industries in the world and one of the most rapidly growing economic sectors, with a positive impact on the nation's economy. Terrorism is the method, or the theory behind the method, whereby an organized group or party seeks to achieve its avowed aims chiefly through the systematic use of violence; the consequences of terrorism for tourist destinations are inescapable and can be profound. In particular, it threatens the attractiveness of a tourist destination and strips that destination of its competitiveness. A destination's vulnerability to politically motivated violence not only deters tourists but threatens sustainable tourism development. This paper examines the activities of the Jamaata Ahlis Sunna Liddaawati (an Islamic sect popularly known as Boko Haram) and their impact on sustainable tourism development in Nigeria. Possible triggers of this insurgency and potential measures against its influence on sustainable tourism are discussed in this paper, including strong image management of the tourism industry, a feasible tourist safety policy, viable anti-terrorism measures, a proactive response to the challenge of terrorism, reinforcement of legitimate frameworks, and irrevocable penalties against the menace of corruption, as ways of limiting the effects of the insurgency on the attractiveness of Nigeria as a safe tourist destination.
Keywords: Nigeria, terrorism, sustainable tourism development, corruption and competitiveness
Procedia PDF Downloads 621
14152 Introduction of Artificial Intelligence for Estimating Fractal Dimension and Its Applications in the Medical Field
Authors: Zerroug Abdelhamid, Danielle Chassoux
Abstract:
Various models are given to simulate homogeneous or heterogeneous cancerous tumors and, in each case, to extract the boundary. The fractal dimension is then estimated by the least squares method and compared with some previous methods.
Keywords: simulation, cancerous tumor, Markov fields, fractal dimension, extraction, recovering
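Box counting with a least-squares fit is one common way to estimate a boundary's fractal dimension; the sketch below applies it to a chaos-game Sierpinski triangle (true dimension log 3 / log 2 ≈ 1.585) rather than to tumor data, which are not reproduced here:

```python
# Sketch of box-counting fractal dimension estimated by least squares,
# demonstrated on a chaos-game Sierpinski triangle; illustrative only.
import math
import random

random.seed(0)
verts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
x, y = 0.1, 0.1
points = []
for _ in range(20000):
    vx, vy = random.choice(verts)
    x, y = (x + vx) / 2, (y + vy) / 2       # midpoint toward a random vertex
    points.append((x, y))

sizes = [2 ** k for k in range(2, 8)]       # grids of 4 .. 128 boxes per side
log_eps, log_n = [], []
for s in sizes:
    eps = 1.0 / s                           # box side length
    boxes = {(int(px * s), int(py * s)) for px, py in points}
    log_eps.append(math.log(eps))
    log_n.append(math.log(len(boxes)))

# N(eps) ~ eps^(-D), so log N = -D * log eps: the dimension is minus the
# least-squares slope of log N against log eps
n = len(sizes)
mx = sum(log_eps) / n
my = sum(log_n) / n
slope = sum((a - mx) * (b - my) for a, b in zip(log_eps, log_n)) / \
        sum((a - mx) ** 2 for a in log_eps)
dimension = -slope
```

The same least-squares fit applies to an extracted tumor boundary once its occupied boxes are counted at several grid resolutions.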
Procedia PDF Downloads 365
14151 Numerical Investigation of Beam-Columns Subjected to Non-Proportional Loadings under Ambient Temperature Conditions
Authors: George Adomako Kumi
Abstract:
The response of structural members subjected to various forms of non-proportional loading plays a major role in the overall stability and integrity of a structure. This research presents the outcome of a finite element investigation, conducted with the finite element software ABAQUS, to validate experimental results on the elastic and inelastic behavior and strength of beam-columns subjected to axial loading, biaxial bending, and torsion under ambient temperature conditions. The rigorous ABAQUS finite element model accounts for material nonlinearity, geometric nonlinearity, deformations, and, more specifically, the contact behavior between the beam-columns and the support surfaces. Comparisons of the three-dimensional model with the results of the actual tests conducted, and with results from a solution algorithm developed using the finite difference method, are established in order to verify the model. The results of this research will provide structural engineers with much-needed knowledge about the behavior of steel beam-columns and their response to various non-proportional loading conditions under ambient temperature conditions.
Keywords: beam-columns, axial loading, biaxial bending, torsion, ABAQUS, finite difference method
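The finite difference side of such a comparison can be illustrated on a classical case; the sketch below solves the pinned-pinned column buckling eigenvalue problem, with illustrative section properties unrelated to the paper's specimens:

```python
# Finite-difference sketch in the spirit of an FD solution algorithm for
# beam-columns: critical buckling load of a pinned-pinned column, where
# EI*y'' + P*y = 0 reduces to a matrix eigenvalue problem. The section
# properties are illustrative only.
import numpy as np

E, I, L = 200e9, 8.0e-6, 3.0      # steel column: Pa, m^4, m (illustrative)
n = 200                           # interior grid points
h = L / (n + 1)

# Central second-difference matrix for y'' with y(0) = y(L) = 0
D2 = (np.diag(-2.0 * np.ones(n)) +
      np.diag(np.ones(n - 1), 1) +
      np.diag(np.ones(n - 1), -1)) / h**2

# EI*y'' = -P*y, so the eigenvalues of -EI*D2 are the buckling loads
loads = np.linalg.eigvalsh(-E * I * D2)
P_cr = loads.min()                        # smallest eigenvalue = critical load

P_euler = np.pi**2 * E * I / L**2         # closed-form Euler load for comparison
rel_err = abs(P_cr - P_euler) / P_euler   # discretization error of the FD grid
```

The agreement with the closed-form Euler load is the same kind of verification step the paper performs between its FD algorithm, the ABAQUS model, and the physical tests.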
Procedia PDF Downloads 180