Search results for: regional anesthesia-analgesia techniques
6653 Advancing the Hi-Tech Ecosystem in the Periphery: The Case of the Sea of Galilee Region
Authors: Yael Dubinsky, Orit Hazzan
Abstract:
There is a constant need for hi-tech innovation to be decentralized to peripheral regions. This work describes how we applied design science research (DSR) principles to define what we refer to as the Sea of Galilee (SoG) method. The goal of the SoG method is to harness existing and new technological initiatives in peripheral regions to create a socio-technological network that can initiate and maintain hi-tech activities. The SoG method consists of a set of principles, a stakeholder network, and actual hi-tech business initiatives, including their infrastructure and practices. The three cycles of DSR, the Relevance, Design, and Rigor cycles, lay out a research framework to sharpen the requirements, collect data from case studies, and iteratively refine the SoG method based on the existing knowledge base. We propose that the SoG method can be deployed by regional authorities that wish to be considered smart regions (an extension of the notion of smart cities).
Keywords: design science research, socio-technological initiatives, Sea of Galilee method, periphery stakeholder network, hi-tech initiatives
Procedia PDF Downloads 131
6652 Influence of Surface Preparation Effects on the Electrochemical Behavior of 2098-T351 Al–Cu–Li Alloy
Authors: Rejane Maria P. da Silva, Mariana X. Milagre, João Victor de S. Araujo, Leandro A. de Oliveira, Renato A. Antunes, Isolda Costa
Abstract:
The Al-Cu-Li alloys are advanced materials for aerospace application because of their interesting mechanical properties and low density when compared with conventional Al-alloys. However, Al-Cu-Li alloys are susceptible to localized corrosion. The near-surface deformed layer (NSDL) induced by the rolling process during the production of the alloy, and its removal by polishing, can influence the corrosion susceptibility of these alloys. In this work, the influence of surface preparation effects on the electrochemical activity of AA2098-T351 (Al–Cu–Li alloy) was investigated using a correlation between surface chemistry, microstructure, and electrochemical activity. Two conditions were investigated, polished and as-received surfaces of the alloy. The morphology of the two types of surfaces was investigated using confocal laser scanning microscopy (CLSM) and optical microscopy. The surface chemistry was analyzed by X-ray photoelectron spectroscopy (XPS) and energy dispersive X-ray spectroscopy (EDS). Global electrochemical techniques (potentiodynamic polarization and EIS) and a local electrochemical technique (localized electrochemical impedance spectroscopy, LEIS) were used to examine the electrochemical activity of the surfaces. The results obtained in this study showed that in the as-received surface, the near-surface deformed layer (NSDL), which is composed of Mg-rich bands, influenced the electrochemical behavior of the alloy. The results showed higher electrochemical activity for the polished surface condition than for the as-received one.
Keywords: Al-Cu-Li alloys, surface preparation effects, electrochemical techniques, localized corrosion
Procedia PDF Downloads 160
6651 Participation in IAEA Proficiency Test to Analyse Cobalt, Strontium and Caesium in Seawater Using Direct Counting and Radiochemical Techniques
Authors: S. Visetpotjanakit, C. Khrautongkieo
Abstract:
Radiation monitoring in the environment and foodstuffs is one of the main responsibilities of the Office of Atoms for Peace (OAP) as the nuclear regulatory body of Thailand. The main goal of the OAP is to assure the safety of the Thai people and environment from any radiological incidents. Various radioanalytical methods have been developed to monitor radiation and radionuclides in environmental and foodstuff samples. To validate our analytical performance, several proficiency test exercises from the International Atomic Energy Agency (IAEA) have been performed. Here, the results of a proficiency test exercise referred to as the Proficiency Test for Tritium, Cobalt, Strontium and Caesium Isotopes in Seawater 2017 (IAEA-RML-2017-01) are presented. All radionuclides except ³H were analysed using various radioanalytical methods, i.e. direct gamma-ray counting for determining ⁶⁰Co, ¹³⁴Cs and ¹³⁷Cs, and developed radiochemical techniques for analysing ¹³⁴Cs and ¹³⁷Cs using an AMP pre-concentration technique and ⁹⁰Sr using a di-(2-ethylhexyl) phosphoric acid (HDEHP) liquid extraction technique. The analysis results were submitted to the IAEA. All results passed the IAEA criteria, i.e. accuracy, precision and trueness, and obtained ‘Accepted’ status. These results confirm the quality of data from the OAP environmental radiation laboratory for monitoring radiation in the environment.
Keywords: international atomic energy agency, proficiency test, radiation monitoring, seawater
Procedia PDF Downloads 172
6650 Effects of Auxetic Antibacterial Zwitterion Carboxylate and Sulfate Copolymer Hydrogels for Diabetic Wound Healing Application
Authors: Udayakumar Vee, Franck Quero
Abstract:
Zwitterionic polymers have generally been viewed as a new class of antimicrobial and non-fouling materials. They offer broad versatility for chemical modification and hence great freedom for accurate molecular design, as they bear an equimolar number of homogeneously distributed anionic and cationic groups along their polymer chains. This study explores the effectiveness of the auxetic zwitterion carboxylate/sulfonate hydrogel in the diabetic-induced mouse model. A series of silver metal-doped auxetic zwitterion carboxylate/sulfonate/vinylaniline copolymer hydrogels is designed via a 3D printer. The zwitterion monomers have been characterized by FT-IR and NMR techniques. The effect of changing the monomers and of different loading ratios of Ag over zwitterion on the final hydrogel materials' antimicrobial properties and biocompatibility will be investigated in detail. The synthesized auxetic hydrogel has been characterized using a wide range of techniques to help establish the relationship between the molecular-level and macroscopic properties of these materials, including mechanical, antibacterial, biocompatibility, and wound-healing properties. This work's comparative studies and results provide new insights and guide us in choosing a better auxetic structured material for a broad spectrum of wound healing applications in the animal model. We expect this approach to provide a versatile and robust platform for biomaterial design that could lead to promising treatments for wound healing applications.
Keywords: auxetic, zwitterion, carboxylate, sulfonate, polymer, wound healing
Procedia PDF Downloads 142
6649 A Grey-Box Text Attack Framework Using Explainable AI
Authors: Esther Chiramal, Kelvin Soh Boon Kai
Abstract:
Explainable AI is a strong strategy implemented to understand complex black-box model predictions in a human-interpretable language. It provides the evidence required to deploy trustworthy and reliable AI systems. On the other hand, however, it also opens the door to locating possible vulnerabilities in an AI model. Traditional adversarial text attacks use word substitution, data augmentation techniques, and gradient-based attacks on powerful pre-trained Bidirectional Encoder Representations from Transformers (BERT) variants to generate adversarial sentences. These attacks are generally white-box in nature and not practical, as they can be easily detected by humans, e.g., changing the word from “Poor” to “Rich”. We propose a simple yet effective grey-box cum black-box approach that does not require knowledge of the model, using a set of surrogate Transformer/BERT models to perform the attack with Explainable AI techniques. As Transformers are the current state-of-the-art models for almost all Natural Language Processing (NLP) tasks, an attack generated from BERT1 is transferable to BERT2. This transferability is made possible by the attention mechanism in the Transformer, which allows the model to capture long-range dependencies in a sequence. Using the power of BERT generalisation via attention, we attempt to exploit how Transformers learn by attacking a few surrogate Transformer variants that are each based on a different architecture. We demonstrate that this approach is highly effective at generating semantically sound sentences by changing as little as one word, a change that is not detectable by humans while still fooling other BERT models.
Keywords: BERT, explainable AI, grey-box text attack, transformer
Procedia PDF Downloads 138
6648 The Clustering of Multiple Sclerosis Subgroups through L2 Norm Multifractal Denoising Technique
Authors: Yeliz Karaca, Rana Karabudak
Abstract:
Multifractal denoising techniques are used to identify significant attributes by removing noise from the dataset. Magnetic resonance imaging (MRI) is the most sensitive method for identifying chronic disorders of the nervous system such as multiple sclerosis (MS). MRI and Expanded Disability Status Scale (EDSS) data belonging to 120 individuals who have one of the subgroups of MS (Relapsing Remitting MS (RRMS), Secondary Progressive MS (SPMS), Primary Progressive MS (PPMS)), as well as 19 healthy individuals in the control group, have been used in this study. The study comprises the following stages: (i) The L2 Norm Multifractal Denoising technique, one of the multifractal techniques, has been applied to the MS data (MRI and EDSS), yielding a new dataset. (ii) This new MS dataset has been applied to the K-Means and Fuzzy C-Means (FCM) clustering algorithms, which are among the unsupervised methods, and the clustering performances have been compared. (iii) In the identification of significant attributes in the MS dataset through the multifractal denoising (L2 Norm) technique, using the K-Means and FCM algorithms on the MS subgroups and the control group of healthy individuals, excellent performance has been obtained. According to the clustering results based on the MS subgroups, successful clustering has been achieved with both the K-Means and FCM algorithms by applying the L2 Norm multifractal denoising technique to the MS dataset. Clustering performance was more successful with the dataset (L2_Norm MS dataset) in which significant attributes were obtained by applying the L2 Norm denoising technique.
Keywords: clinical decision support, clustering algorithms, multiple sclerosis, multifractal techniques
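The clustering comparison described in stage (ii) can be sketched generically. Below is a minimal, self-contained illustration of hard K-Means versus soft Fuzzy C-Means assignment on synthetic two-cluster data standing in for the denoised feature matrix; the data, cluster count, and parameter choices are illustrative assumptions, not the study's actual MRI/EDSS setup.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Plain K-Means: hard assignments, centroid updates.
    Deterministic init: k points spread evenly through the data."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)  # (n, k) distances
        labels = d.argmin(axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

def fcm(X, k, m=2.0, iters=50, seed=0):
    """Fuzzy C-Means: each point gets a soft membership u in every cluster."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), k))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        w = u ** m                                   # fuzzified memberships
        centers = (w.T @ X) / w.sum(axis=0)[:, None] # weighted centroids
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        u = 1.0 / d ** (2.0 / (m - 1.0))             # inverse-distance update
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

# Two synthetic clusters standing in for the denoised MS feature matrix.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(3.0, 0.3, (30, 2))])
labels, _ = kmeans(X, 2)
u, fcm_centers = fcm(X, 2)
```

K-Means commits each sample to one subgroup, while FCM's membership matrix `u` exposes borderline cases, which is one reason the two algorithms can rank differently on the same denoised data.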
Procedia PDF Downloads 171
6647 Revisiting Politics of Religion in Muslim Republics of Former Soviet Union and Rise of Extremism, Global Jihadi Terrorism
Authors: Etibar Guliyev
Abstract:
The breakdown of the Soviet Union in 1991 led to a considerable rise in the religious self-consciousness of the Muslim population of Central Asia. Additionally, huge amounts of money spent by various states further facilitated the spread of religious ideas. According to some sources, Saudi Arabia spent 87 billion dollars to propagate Wahhabism abroad during two decades, whereas the Communist Party of the Soviet Union spent just over 7 billion dollars to spread its ideology worldwide between 1921 and 1991. As a result, a region once remote from international politics has today become the third major source of recruitment of fighters for global terrorist organizations. To illustrate the scope of the involvement of Central Asian residents in international terrorist networks, it is enough to mention Colonel Gulmorod Khalimov, the former head of the Tajik special police forces, who served as ISIS war minister between 2016 and 2017. The importance of the topic stems from the fact that the above-mentioned republics, with a territory of 4 million square km and a population of around 80 million people, border Russia, Iran, Afghanistan and China. Moreover, the fact that political and military activities motivated by religious feelings in those countries have implications not only for domestic but also for regional and global political relations, and that all of them are rooted in the politics of religion, adds value to the research. This research aims to provide an in-depth analysis of the marked features of state policies to regulate religious activities, approaching this question from individual, domestic, regional and global levels of analysis. The research will enable us to better understand what implications the state of religious freedom in post-Soviet Muslim republics has for international relations and the rise of global jihadi terrorism.
The paper tries to find a linkage between the mentioned terror attacks and the underground rise of religious extremism in Central Asia. This research is based on multiple research methods, mainly qualitative ones. The process tracing method is also employed to review, in a chronological way, the religious policies implemented from 1918 to 1991 and after the collapse of the Soviet Union. The quantitative method is chiefly used to process various statistics disseminated in academic and official sources. The research mostly draws on constructivist, securitization and social movement theories. Findings of the research suggest that the endemic problems peculiar to the authoritarian regimes of Central Asia, such as crackdowns on the expression of religious belief and any kind of opposition, economic decline, instrumental use of religion, corruption and tribalism, further accelerated the recruitment problem. The paper also concludes that the Central Asian states in some cases misused the counter-terrorism campaign as a pretext to further restrict freedom of faith in their respective countries.
Keywords: identity, political Islam, religious extremism, security, terrorism
Procedia PDF Downloads 268
6646 Antibacterial Zwitterion Carboxylate and Sulfonate Copolymer Auxetic Hydrogels for Diabetic Wound Healing Application
Authors: Udayakumar Veerabagu, Franck Quero
Abstract:
Zwitterion carboxylate and sulfonate polymers have generally been viewed as a new class of antimicrobial and non-fouling materials. They offer broad versatility for chemical modification and hence great freedom for accurate molecular design, as they bear an equimolar number of homogeneously distributed anionic and cationic groups along their polymer chains. This study explores the effectiveness of the auxetic zwitterion carboxylate/sulfonate hydrogel in the diabetic-induced mouse model. A series of silver metal-doped auxetic zwitterion carboxylate/sulfonate/vinylaniline copolymer hydrogels is designed via a 3D printer. The zwitterion monomers have been characterized by FT-IR and NMR techniques. The effect of changing the monomers and of different loading ratios of Ag over zwitterion on the final hydrogel materials' antimicrobial properties and biocompatibility will be investigated in detail. The synthesized auxetic hydrogel has been characterized using a wide range of techniques to help establish the relationship between the molecular-level and macroscopic properties of these materials, including mechanical, antibacterial, biocompatibility, and wound-healing properties. This work's comparative studies and results provide new insights and guide us in choosing a better auxetic structured material for a broad spectrum of wound healing applications in the animal model. We expect this approach to provide a versatile and robust platform for biomaterial design that could lead to promising treatments for wound healing applications.
Keywords: auxetic, zwitterion, carboxylate, sulfonate, polymer, wound healing
Procedia PDF Downloads 157
6645 Process Monitoring Based on Parameterless Self-Organizing Map
Authors: Young Jae Choung, Seoung Bum Kim
Abstract:
Statistical Process Control (SPC) is a popular technique for process monitoring. A widely used tool in SPC is the control chart, which is used to detect the abnormal status of a process and maintain its controlled status. Traditional control charts, such as Hotelling's T² control chart, are effective techniques for detecting abnormal observations and monitoring processes. However, many complicated manufacturing systems exhibit nonlinearity because of the different demands of the market. In this case, the unregulated use of a traditional linear modeling approach may not be effective. In reality, many industrial processes exhibit nonlinear and time-varying properties because of fluctuations in process raw materials, slow shifts of the set points, aging of the main process components, seasoning effects, and catalyst deactivation. The use of traditional SPC techniques with time-varying data will degrade the performance of the monitoring scheme. To address these issues, in the present study we propose a parameterless self-organizing map (PLSOM)-based control chart. The PLSOM-based control chart can not only manage a situation where the distribution or parameters of the target observations change, but also address the nonlinearity of modern manufacturing systems. The control limits of the proposed PLSOM chart are established by estimating the empirical level of significance on the percentile using a bootstrap method. Experimental results with simulated data and actual process data from a thin-film transistor liquid crystal display process demonstrated the effectiveness and usefulness of the proposed chart.
Keywords: control chart, parameterless self-organizing map, self-organizing map, time-varying property
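The bootstrap step the abstract describes — setting an empirical control limit as a percentile of resampled monitoring statistics rather than assuming a distribution — can be sketched generically. The function name, significance level, and the stand-in data below are illustrative assumptions; in the actual chart the statistics would be quantization errors from a trained PLSOM.

```python
import numpy as np

def bootstrap_control_limit(stats, alpha=0.01, n_boot=2000, seed=0):
    """Estimate the upper control limit as the empirical (1 - alpha)
    percentile, averaged over bootstrap resamples of in-control statistics."""
    rng = np.random.default_rng(seed)
    n = len(stats)
    percentiles = [
        np.percentile(rng.choice(stats, size=n, replace=True), 100 * (1 - alpha))
        for _ in range(n_boot)
    ]
    return float(np.mean(percentiles))

# Stand-in for in-control monitoring statistics (e.g. SOM quantization errors):
# absolute values of standard normal noise.
rng = np.random.default_rng(1)
in_control = np.abs(rng.normal(size=500))
ucl = bootstrap_control_limit(in_control, alpha=0.05)
# A new observation whose statistic exceeds ucl would be flagged as abnormal.
```

Because the limit comes from the empirical distribution itself, it adapts to non-Gaussian statistics, which is the point of using a bootstrap instead of a parametric three-sigma rule.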
Procedia PDF Downloads 277
6644 Pellegrini-Stieda Syndrome: A Physical Medicine and Rehabilitation Approach
Authors: Pedro Ferraz-Gameiro
Abstract:
Introduction: The Pellegrini-Stieda lesion is the result of post-traumatic calcification and/or ossification on the medial collateral ligament (MCL) of the knee. When this calcification is accompanied by gonalgia and limitation of knee flexion, it is called Pellegrini-Stieda syndrome. The pathogenesis is probably the calcification of a post-traumatic hematoma at least three weeks after the initial trauma, or secondary to repetitive microtrauma. On anteroposterior radiographs, a Pellegrini-Stieda lesion is a linear vertical ossification or calcification of the proximal portion of the MCL, usually near the medial femoral condyle. Patients with Pellegrini-Stieda syndrome present knee pain associated with loss of range of motion. The treatment is usually conservative, with analgesic and anti-inflammatory drugs, either systemic or intra-articular. Physical medicine and rehabilitation techniques associated with shock wave therapy can be a way of reducing pain and inflammation. Patients who maintain instability with significant limitation of knee mobility may require surgical excision. Methods: Research was done using PubMed Central with the terms Pellegrini-Stieda syndrome. Discussion/conclusion: Medical treatment is the rule, with initial rest, anti-inflammatory drugs, and physiotherapy. If left untreated, this ossification can potentially form a significant bone mass, which can compromise the range of motion of the knee. Physical medicine and rehabilitation techniques associated with shock wave therapy are a way of reducing pain and inflammation.
Keywords: knee, Pellegrini-Stieda syndrome, rehabilitation, shock wave therapy
Procedia PDF Downloads 142
6643 Degradation Model for UK Railway Drainage System
Authors: Yiqi Wu, Simon Tait, Andrew Nichols
Abstract:
Management of UK railway drainage assets is challenging due to the large number of historical assets with long asset life cycles. A major concern for asset managers is to maintain the required performance economically and efficiently while complying with the relevant regulation and legislation. As the majority of drainage assets are buried underground and are often difficult or costly to examine, it is important for asset managers to understand and model the degradation process in order to foresee upcoming reductions in asset performance and conduct proactive maintenance accordingly. In this research, a Markov chain approach is used to model the deterioration process of rail drainage assets. The study is based on historical condition scores and characteristics of drainage assets across the whole railway network in England, Scotland, and Wales. The model is used to examine the effect of various characteristics on the probabilities of degradation, for example, regional differences in the probabilities of degradation, and how material and shape can influence the deterioration process for chambers, channels, and pipes.
Keywords: deterioration, degradation, Markov models, probability, railway drainage
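A Markov-chain deterioration model of the kind described propagates a condition-state distribution through a transition matrix. The sketch below uses an invented 4-state condition scale and invented annual transition probabilities purely for illustration; the study's actual states and probabilities would be estimated from the historical condition scores.

```python
import numpy as np

# Illustrative 4-state condition scale (state 1 = good ... state 4 = failed)
# with an upper-triangular annual transition matrix: without intervention,
# an asset can only stay in its state or degrade, never improve.
P = np.array([
    [0.90, 0.08, 0.02, 0.00],
    [0.00, 0.85, 0.12, 0.03],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

def state_distribution(p0, P, years):
    """Propagate the condition-state distribution forward: p_t = p_0 P^t."""
    return p0 @ np.linalg.matrix_power(P, years)

p0 = np.array([1.0, 0.0, 0.0, 0.0])   # a new asset starts in state 1
p10 = state_distribution(p0, P, 10)
# p10[3] is the probability the asset has reached the failed state in 10 years.
```

In practice separate matrices would be fitted per asset class (chamber, channel, pipe) or per covariate such as material, which is exactly how the abstract's "effect of various characteristics" comparison can be framed.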
Procedia PDF Downloads 226
6642 The Application and Relevance of Costing Techniques in Service-Oriented Business Organizations: A Review of the Activity-Based Costing (ABC) Technique
Authors: Udeh Nneka Evelyn
Abstract:
The shortcomings of the traditional costing system in terms of validity, accuracy, consistency, and relevance increased the need for modern management accounting systems. Activity-Based Costing (ABC) can be used as a modern tool for planning, control and decision making by management. Past studies on ABC systems have focused on manufacturing firms, making studies on service firms somewhat scanty. This paper reviewed the application and relevance of the activity-based costing technique in service-oriented business organizations by employing a qualitative research method that relied heavily on a literature review of past and current relevant articles focusing on ABC. Findings suggest that ABC is not only appropriate for use in a manufacturing environment; it is also most appropriate for service organizations such as financial institutions, the healthcare industry and government organizations. In fact, some banking and financial institutions have been applying the concept for years under other names. One of them is unit costing, which is used to calculate the cost of banking services by determining the cost and consumption of each unit of output of the functions required to deliver the service. ABC, in very basic terms, may provide a very good payback for businesses. Some of the benefits that relate directly to the financial services industry are: identification of the most profitable customers, more accurate product and service pricing, increased product profitability, and well-organized process costs.
Keywords: business, costing, organizations, planning, techniques
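The unit-costing idea the review mentions — an activity rate from pool cost over driver volume, then rate times consumption per service — can be sketched with made-up figures. All activity names, costs, and volumes below are invented for illustration; they are not from the paper.

```python
# Activity-based costing sketch: each activity pool's cost divided by its
# cost-driver volume gives a rate; a service's cost is the sum of
# rate x consumption over the activities it uses. All figures are invented.
activities = {
    # activity: (total pool cost, total driver volume)
    "account_opening": (50_000.0, 1_000),          # per application processed
    "transaction_processing": (120_000.0, 600_000),# per transaction
    "customer_support": (80_000.0, 40_000),        # per support call
}

rates = {name: cost / volume for name, (cost, volume) in activities.items()}

def service_cost(consumption):
    """Cost of one service instance from its activity consumption."""
    return sum(rates[name] * units for name, units in consumption.items())

# A hypothetical current-account product consuming each activity:
cost = service_cost({
    "account_opening": 1,
    "transaction_processing": 250,
    "customer_support": 2,
})
# cost = 1*50 + 250*0.2 + 2*2 = 104.0 currency units
```

The same structure makes customer profitability visible: pricing the product below its activity-derived cost for a given usage profile flags an unprofitable customer segment.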
Procedia PDF Downloads 241
6641 Towards a Better Understanding of Planning for Urban Intensification: Case Study of Auckland, New Zealand
Authors: Wen Liu, Errol Haarhoff, Lee Beattie
Abstract:
In 2010, New Zealand's central government reorganised local government arrangements in Auckland by amalgamating the previous regional council and seven supporting local government units into a single unitary council, the Auckland Council. The Auckland Council is charged with providing local government services to approximately 1.5 million people (a third of New Zealand's total population). This includes addressing Auckland's strategic urban growth management and setting its urban planning policy directions for the next 40 years, as expressed in the first-ever spatial plan for the region, the Auckland Plan (2012). The Auckland Plan supports implementing a compact city model by concentrating the larger part of future urban growth and development in, and around, existing and proposed transit centres, with the intention of Auckland becoming a globally competitive city and achieving 'the most liveable city in the world'. Turning that vision into reality is operationalised through the statutory land use plan, the Auckland Unitary Plan. The Unitary Plan replaced the previous regional and local statutory plans when it became operative in 2016, becoming the 'rule book' on how to manage and develop the natural and built environment, using land use zones and zone standards. Across the broad range of literature on urban growth management, one significant issue stands out about intensification: the 'gap' between strategic planning and what has been achieved is evident in the argument for the 'compact' urban form. Although the compact city model may have a wide range of merits, the extent to which these are actualized largely relies on how intensification is actually delivered. The transformation of the rhetoric of the residential intensification model into reality is of profound influence, yet has enjoyed limited empirical analysis. In Auckland, the establishment of the Auckland Plan set up strategies to deliver intensification across diversified arenas.
Nonetheless, planning policy itself does not necessarily achieve the envisaged objectives; delivering a planning system with the capacity to enhance and sustain plan implementation is another demanding agenda. Though the Auckland Plan provides a wide-ranging strategic context, its actual delivery is beholden to the Unitary Plan. However, questions have been asked as to whether the Unitary Plan has the necessary statutory tools to deliver the Auckland Plan's policy outcomes. In Auckland, there is likely to be continuing tension between the strategies for intensification and their envisaged objectives, making it doubtful whether the main principles of the intensification strategies can be realized. This raises questions over whether the Auckland Plan's policy goals can be achieved in practice, including delivering a 'quality compact city' and residential intensification. Taking Auckland as an example of a traditionally sprawling city, this article investigates the efficacy of plan making and implementation directed towards higher-density development. It explores the process of plan development, and the plan making and implementation frameworks of the first-ever spatial plan in Auckland, so as to explicate the objectives and processes involved, and considers whether this will facilitate decision-making processes to realize the anticipated intensive urban development.
Keywords: urban intensification, sustainable development, plan making, governance and implementation
Procedia PDF Downloads 557
6640 Quantitative Analysis of Contract Variations Impact on Infrastructure Project Performance
Authors: Soheila Sadeghi
Abstract:
Infrastructure projects often encounter contract variations that can deviate significantly from the original tender estimates, leading to cost overruns, schedule delays, and financial implications. This research aims to quantitatively assess the impact of contract variations on project performance by conducting an in-depth analysis of a comprehensive dataset from the Regional Airport Car Park project. The dataset includes tender budget, contract quantities, rates, claims, and revenue data, providing a unique opportunity to investigate the effects of variations on project outcomes. The study focuses on 21 specific variations identified in the dataset, which represent changes or additions to the project scope. The research methodology involves establishing a baseline for the project's planned cost and scope by examining the tender budget and contract quantities. Each variation is then analyzed in detail, comparing the actual quantities and rates against the tender estimates to determine their impact on project cost and schedule. The claims data are utilized to track the progress of work and identify deviations from the planned schedule. The study employs statistical analysis in R to examine the dataset. Time series analysis is applied to the claims data to track progress and detect variations from the planned schedule. Regression analysis is utilized to investigate the relationship between variations and project performance indicators, such as cost overruns and schedule delays. The research findings highlight the significance of effective variation management in construction projects. The analysis reveals that variations can have a substantial impact on project cost, schedule, and financial outcomes.
The study identifies specific variations that had the most significant influence on the Regional Airport Car Park project's performance, such as PV03 (additional fill, road base gravel, spray seal, and asphalt), PV06 (extension to the commercial car park), and PV07 (additional box out and general fill). These variations contributed to increased costs, schedule delays, and changes in the project's revenue profile. The study also examines the effectiveness of project management practices in managing variations and mitigating their impact. The research suggests that proactive risk management, thorough scope definition, and effective communication among project stakeholders can help minimize the negative consequences of variations. The findings emphasize the importance of establishing clear procedures for identifying, assessing, and managing variations throughout the project lifecycle. The outcomes of this research contribute to the body of knowledge in construction project management by demonstrating the value of analyzing tender, contract, claims, and revenue data in variation impact assessment. However, the research acknowledges the limitations imposed by the dataset, particularly the absence of detailed contract and tender documents. This constraint restricts the depth of analysis possible in investigating the root causes and full extent of variations' impact on the project. Future research could build upon this study by incorporating more comprehensive data sources to further explore the dynamics of variations in construction projects.
Keywords: contract variation impact, quantitative analysis, project performance, claims analysis
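The regression step the abstract describes — relating variation magnitude to a performance indicator such as cost overrun — can be sketched generically. The abstract's analysis was done in R; the sketch below uses Python for consistency with the other examples here, and the six data points are invented, not the project's 21 actual variations.

```python
import numpy as np

# Invented illustrative data: each variation's approved value and the cost
# overrun attributed to it (both in $k).
variation_value = np.array([10.0, 25.0, 40.0, 55.0, 70.0, 90.0])
cost_overrun    = np.array([ 6.0, 14.0, 22.0, 30.0, 37.0, 49.0])

# Ordinary least squares via numpy: fit y ≈ b0 + b1 * x.
X = np.column_stack([np.ones_like(variation_value), variation_value])
b0, b1 = np.linalg.lstsq(X, cost_overrun, rcond=None)[0]

# Coefficient of determination, to judge how much overrun the
# variation value explains.
predicted = b0 + b1 * variation_value
r2 = 1.0 - np.sum((cost_overrun - predicted) ** 2) / np.sum(
    (cost_overrun - cost_overrun.mean()) ** 2)
```

Here `b1` estimates the marginal overrun per unit of variation value; a time-series view of the claims data would complement this by showing when in the programme the overruns accrued.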
Procedia PDF Downloads 42
6639 The Urban Project: Metropolization Tool and Sustainability Vector - Case of Constantine
Authors: Mouhoubi Nedjima, Sassi Boudemagh Souad, Chouabbia Khedidja
Abstract:
Cities, large or small, grow and seek to gain a place in market competition, which amounts to selling a product that is the city itself. Metropolises, large cities enjoying a legal status and assets that extend their dominion over a territory larger than their own boundaries, do not escape this situation. Thus, the search for a tool promising metropolises better development and durability in meeting economic, social and environmental challenges is timely. The urban project is a new way to build the city; it is involved in metropolises in two ways, either by managing crises and meeting the internal needs of the metropolis, or by creating regional attractiveness from their potential. This communication addresses the issue of the urban project as a tool that has, and should find, a place in the panoply of existing institutional tools. Based on the example of the modernization project of the metropolis of eastern Algeria, Constantine, we examine what the urban project can bring to a city and the extent of its impact, but also the relationship between the actors' visions that makes metropolization a success.
Keywords: urban project, metropolis, institutional tools, Constantine
Procedia PDF Downloads 403
6638 Synthesis, Structural, Spectroscopic and Nonlinear Optical Properties of New Picolinate Complex of Manganese (II) Ion
Authors: Ömer Tamer, Davut Avcı, Yusuf Atalay
Abstract:
A novel picolinate complex of the manganese(II) ion, [Mn(pic)2] [pic: picolinate or 2-pyridinecarboxylate], was prepared and fully characterized by single-crystal X-ray structure determination. The manganese(II) complex was characterized by FT-IR, FT-Raman and UV–Vis spectroscopic techniques. The C=O, C=N and C=C stretching vibrations were found to be strong and simultaneously active in the IR and Raman spectra. In order to support these experimental techniques, density functional theory (DFT) calculations were performed with Gaussian 09W. Although supramolecular interactions have some influence on the molecular geometry in the solid-state phase, the calculated data show that the predicted geometries can reproduce the structural parameters. The molecular modeling and the calculations of the IR, Raman and UV-Vis spectra were performed at DFT levels. The nonlinear optical (NLO) properties of the synthesized complex were evaluated by determining the dipole moment (µ), polarizability (α) and hyperpolarizability (β). The obtained results demonstrated that the manganese(II) complex is a good candidate for an NLO material. The stability of the molecule arising from hyperconjugative interactions and charge delocalization was analyzed using natural bond orbital (NBO) analysis. The highest occupied and lowest unoccupied molecular orbitals (HOMO and LUMO), also known as the frontier molecular orbitals, were simulated, and the obtained energy gap confirmed that charge transfer occurs within the manganese(II) complex. The molecular electrostatic potential (MEP) for the synthesized manganese(II) complex displays the electrophilic and nucleophilic regions. In the MEP, the most negative region is located over the carboxyl O atoms, while the positive region is located over the H atoms.
Keywords: DFT, picolinate, IR, Raman, nonlinear optics
Procedia PDF Downloads 500
6637 Scientific and Technical Basis for the Application of Textile Structures in Glass Using Pate De Verre Technique
Authors: Walaa Hamed Mohamed Hamza
Abstract:
Textile structures are the ways in which the warp and weft threads interlace on the loom to form a woven fabric. Different methods of interlacing the warp and the weft produce different textile structures, which differ from each other in their surface appearance, including the so-called simple textile structures. Textile structures are the basis of woven fabric, through which aesthetic values can be achieved in the textile industry by interlacing the warp threads with the weft to varying degrees, which may reach total control of one of the two thread systems over the other. Hence the idea of how art and design can make use of different textile structures under the modern pate de verre technique in creating designs suitable for glass products employed in interior architecture. The research problem: textile structures, in general, have a significant impact on the appearance of fabrics in both form and aesthetics; how can we benefit from the characteristics of different textile structures in glass designs with different artistic values? The research achieves its goal by investing simple textile structures in innovative artistic designs using the pate de verre technique, as well as by using designs resulting from textile structures in exterior architecture to add various aesthetic values. The importance of the research lies in reviving heritage using ancient techniques, in the synergy between different fields of the applied arts such as glass and textiles, and in studying the different and diverse effects produced by each textile structure and the possibility of their use in various designs in interior architecture. The research demonstrates that, by investing in simple textile structures, innovative artistic designs produced with the pate de verre technique can be used in interior architecture.
Keywords: glass, interior architecture, pate de verre, textile structures
Procedia PDF Downloads 296
6636 Prevalence of Oral Mucosal Lesions in Malaysia: A Teaching Hospital Based Study
Authors: Renjith George Pallivathukal, Preethy Mary Donald
Abstract:
Asymptomatic oral lesions are often ignored by patients and are usually identified only at advanced stages. Early detection of precancerous lesions is important for a better prognosis. It is also important for oral health care providers to be aware of the regional prevalence of oral lesions in order to provide early care. We conducted a retrospective study to assess the prevalence of oral lesions based on the information available in patient records at a teaching dental school. Dental records of patients who attended the Department of Oral Medicine and Diagnosis between September 2014 and September 2016 were retrieved and examined for oral lesions. Results: The ages of the patients ranged from 13 to 38 years, with a mean age of 21.8 years. The lesions were classified as white (40.5%), red (23%), ulcerated (10.5%), pigmented (15.2%), and soft tissue enlargements (10.8%). 52% of the patients were unaware of their oral lesions before the dental visit. Overall, the prevalence of lesions in dental patients was lower than national estimates, but the prevalence of some lesions showed variations.
Keywords: oral mucosal lesion, pre-cancer, prevalence, soft tissue lesion
Procedia PDF Downloads 351
6635 Numerical Modeling of Air Pollution with PM-Particles and Dust
Authors: N. Gigauri, A. Surmava, L. Intskirveli, V. Kukhalashvili, S. Mdivani
Abstract:
The subject of our study is the numerical modeling of atmospheric air pollution. As the object of research, the city of Tbilisi, the capital of Georgia, was chosen, with a population of one and a half million and difficult terrain. The main sources of pollution in Tbilisi are currently vehicles and construction dust. The concentrations of dust and PM (particulate matter) were determined in the air of Tbilisi and its vicinity, and their monthly maximum, minimum, and average concentrations were estimated. The processes of dust propagation in the atmosphere of the city and its surrounding territory are modeled using a 3D regional model of atmospheric processes and an admixture transfer-diffusion equation. Distributions of the polluted cloud and dust concentrations were obtained for different areas of the city, at different heights, and at different time intervals, with a background stationary westward and eastward wind. It is found that the difficult terrain and mountain-bar circulation affect the deformation of the cloud and its spread, and the time periods were determined when the dust concentration in the city is greater than the MAC (Maximum Allowable Concentration, MAC = 0.5 mg/m³).
Keywords: air pollution, dust, numerical modeling, PM-particles
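The transfer-diffusion modelling described above can be illustrated with a minimal one-dimensional sketch (the study itself uses a full 3D regional model with real terrain and wind fields); the wind speed, diffusion coefficient, and grid parameters below are hypothetical values chosen only for numerical stability:

```python
import numpy as np

# Illustrative 1D advection-diffusion step (upwind advection + explicit diffusion)
# for an admixture concentration c on a periodic grid.
def step(c, u, K, dx, dt):
    """Advance c one time step with wind speed u (> 0) and diffusion coefficient K."""
    adv = -u * (c - np.roll(c, 1)) / dx                        # upwind advection
    dif = K * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2  # diffusion
    return c + dt * (adv + dif)

# Hypothetical parameters: 10 km domain, 2 m/s wind, K = 50 m^2/s
dx, dt, u, K = 100.0, 10.0, 2.0, 50.0
c = np.zeros(100)
c[10] = 1.0                                  # unit dust puff released at one cell
for _ in range(200):
    c = step(c, u, K, dx, dt)
print(round(c.sum(), 6))  # the periodic domain conserves total mass -> 1.0
```

The same scheme, extended to three dimensions with terrain-following coordinates and deposition terms, underlies regional dust-transport models of this kind.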
Procedia PDF Downloads 141
6634 The Staphylococcus aureus Exotoxin Recognition Using Nanobiosensor Designed by an Antibody-Attached Nanosilica Method
Authors: Hamed Ahari, Behrouz Akbari Adreghani, Vadood Razavilar, Amirali Anvar, Sima Moradi, Hourieh Shalchi
Abstract:
Given the ever-increasing population and the industrialization of humankind's way of life, we are no longer able to detect the toxins produced in food products using traditional techniques: the isolation time for food products is not cost-effective, and in most cases the precision of practical techniques such as bacterial cultivation suffers from operator errors or errors in the mixtures used. Hence, with the advent of nanotechnology, the design of selective and smart sensors is one of the great industrial advances in the quality control of food products, able to identify the amount and toxicity of bacteria within a few minutes and with very high precision. Methods and Materials: In this technique, a sensor based on the attachment of a bacterial antibody to nanoparticles was used. As the absorption basis for the recognition of the bacterial toxin, medium-sized silica nanoparticles (10 nm, Notrino brand) in solid powder form were utilized. The suspension produced from the agent-linked nanosilica, conjugated to the bacterial antibody, was then placed in samples of distilled water contaminated with Staphylococcus aureus bacterial toxin at a density of 10⁻³, so that if any toxin existed in the sample, a connection between the toxin antigen and the antibody would be formed. Finally, the light absorption related to the binding of the antigen to the particle-attached antibody was measured by spectrophotometry. The 23S rRNA gene, which is conserved in all Staphylococcus spp., was also used as a control. The accuracy of the test was monitored using serial dilutions (10⁻⁶) of an overnight cell culture of Staphylococcus spp. (OD600: 0.02 = 10⁷ cells), showing that the sensitivity of PCR is 10 bacteria per ml within a few hours.
Result: The results indicate that the sensor detects down to a density of 10⁻⁴. Additionally, the sensitivity of the sensors was examined after 60 days: the sensor gave confirmatory results up to day 56, after which its response started to decrease. Conclusions: Compared with conventional methods such as culture and biotechnological methods (e.g., the polymerase chain reaction), the practical nanobiosensor offers accuracy, sensitivity, and specificity, and it reduces detection time from hours to 30 minutes.
Keywords: exotoxin, nanobiosensor, recognition, Staphylococcus aureus
Procedia PDF Downloads 387
6633 Opportunities and Optimization of the Our Eyes Initiative as the Strategy for Counter-Terrorism in ASEAN
Authors: Chastiti Mediafira Wulolo, Tri Legionosuko, Suhirwan, Yusuf
Abstract:
Terrorism and radicalization have become a common threat to every nation in the world. As part of the asymmetric warfare threat, terrorism and radicalization require a complex strategy as a problem solver. One such way is collaboration with the international community. The Our Eyes Initiative (OEI), for example, is a cooperation pact on the exchange of intelligence information related to terrorism and radicalization, initiated by the Indonesian Ministry of Defence. The pact has been signed by Indonesia, the Philippines, Malaysia, Brunei Darussalam, Thailand, and Singapore. This cooperation mostly places the military in a central role, but it still requires the involvement of various parties such as the police, intelligence agencies, and other government institutions. This paper uses a qualitative content analysis method to address the opportunities of OEI and to enhance its optimization. As a result, it explains how OEI takes these opportunities as a counter-terrorism strategy by building itself up as a regional cooperation, building the legitimacy of governments, and creating a legal framework for the information-sharing system.
Keywords: our eyes initiative, terrorism, counter-terrorism, ASEAN, cooperation, strategy
Procedia PDF Downloads 183
6632 Predictive Analysis of Chest X-rays Using NLP and Large Language Models with the Indiana University Dataset and Random Forest Classifier
Authors: Azita Ramezani, Ghazal Mashhadiagha, Bahareh Sanabakhsh
Abstract:
This study investigates the combination of Random Forest classifiers with large language models (LLMs) and natural language processing (NLP) to improve diagnostic accuracy in chest X-ray analysis using the Indiana University dataset. Utilizing advanced NLP techniques, the research preprocesses textual data from radiological reports to extract key features, which are then merged with image-derived data. This enriched dataset is analyzed with Random Forest classifiers to predict specific clinical results, focusing on the identification of health issues and the estimation of case urgency. The findings reveal that the combination of NLP, LLMs, and machine learning increases not only diagnostic precision but also reliability, especially in quickly identifying critical conditions. Achieving an accuracy of 99.35%, the model shows significant advancements over conventional diagnostic techniques. The results emphasize the large potential of machine learning in medical imaging, suggesting that these technologies could greatly enhance clinician judgment and patient outcomes by offering quicker and more precise diagnostic approximations.
Keywords: natural language processing (NLP), large language models (LLMs), random forest classifier, chest x-ray analysis, medical imaging, diagnostic accuracy, indiana university dataset, machine learning in healthcare, predictive modeling, clinical decision support systems
Procedia PDF Downloads 47
6631 Effects of Waist-to-Hip Ratio and Visceral Fat Measurements Improvement on Offshore Petrochemical Company Shift Employees' Work Efficiency
Authors: Essam Amerian
Abstract:
The aim of this study was to investigate the effects of improving waist-to-hip ratio (WHR) and visceral fat components on the health of shift workers in an offshore petrochemical company. A total of 100 male shift workers participated in the study, with an average age of 40.5 years and an average BMI of 28.2 kg/m². The study employed a randomized controlled trial design, with participants assigned to either an intervention group or a control group. The intervention group received a 12-week program that included dietary counseling, physical activity recommendations, and stress management techniques. The control group received no intervention. The outcomes measured were changes in WHR, visceral fat components, blood pressure, and lipid profile. The results showed that the intervention group had a statistically significant improvement in WHR (p<0.001) and visceral fat components (p<0.001) compared to the control group. Furthermore, there were statistically significant improvements in systolic blood pressure (p=0.015) and total cholesterol (p=0.034) in the intervention group compared to the control group. These findings suggest that implementing a 12-week program that includes dietary counseling, physical activity recommendations, and stress management techniques can effectively improve WHR, visceral fat components, and cardiovascular health among shift workers in an offshore petrochemical company.
Keywords: body composition, waist-hip-ratio, visceral fat, shift worker, work efficiency
Procedia PDF Downloads 80
6630 Different Processing Methods to Obtain a Carbon Composite Element for Cycling
Authors: Maria Fonseca, Ana Branco, Joao Graca, Rui Mendes, Pedro Mimoso
Abstract:
The present work is focused on the production of a carbon composite element for cycling by two different techniques, namely blow-moulding and high-pressure resin transfer moulding (HP-RTM). The main objective of this work is to compare both processes for producing carbon composite elements for the cycling industry. It is well known that carbon composite components for cycling are produced mainly by blow-moulding; however, this technique depends strongly on manual labour, resulting in a time-consuming production process. Comparatively, HP-RTM offers a more automated process, which should lead to higher production rates. Nevertheless, the elements produced by both techniques must be compared in order to assess whether the final products comply with the required industry standards. The main difference between these techniques lies in the material used. Blow-moulding uses carbon prepreg (carbon fibres pre-impregnated with a resin system); the material is laid up by hand, piece by piece, on a mould or on a hard male mould, and afterwards cured at high temperature. In the HP-RTM technique, on the other hand, dry carbon fibres are placed in a mould, and resin is then injected at high pressure. After research into the best material systems (prepregs and braids) and suppliers, an element (similar to a handlebar) was designed for construction. The next step was to perform FEM simulations in order to determine the best layup of the composite material. The simulations were done for the prepreg material, and the obtained layup was transposed to the braids. The selected material for the blow-moulding technique was a prepreg with T700 carbon fibre (24K) and an epoxy resin system. For HP-RTM, carbon fibre elastic UD tubes and ±45° braids were used, with both 3K and 6K filaments per tow, and the resin system was likewise an epoxy.
After the simulations for the prepreg material, the optimized layup was [45°, -45°, 45°, -45°, 0°, 0°]. For HP-RTM, the transposed layup was [±45° (6K); 0° (6K); partial ±45° (6K); partial ±45° (6K); ±45° (3K); ±45° (3K)]. The mechanical tests showed that both elements can withstand the maximum load (in this case, 1000 N); however, the one produced by blow-moulding can support higher loads (≈1300 N against 1100 N for HP-RTM). Regarding the fibre volume fraction (FVF), the HP-RTM element has a slightly higher value (> 61%, compared to 59% for the blow-moulding technique). Optical microscopy showed that both elements have a low void content. In conclusion, the elements produced using HP-RTM are comparable to those produced by blow-moulding, both in mechanical testing and in visual aspect. Nevertheless, there is still room for improvement in the HP-RTM elements, since the layup of the braids and UD tubes could be optimized.
Keywords: HP-RTM, carbon composites, cycling, FEM
Procedia PDF Downloads 134
6629 Ectopic Osteoinduction of Porous Composite Scaffolds Reinforced with Graphene Oxide and Hydroxyapatite Gradient Density
Authors: G. M. Vlasceanu, H. Iovu, E. Vasile, M. Ionita
Abstract:
Herein, the synthesis and characterization of a highly porous chitosan-gelatin scaffold reinforced with graphene oxide and hydroxyapatite (HAp), and crosslinked with genipin, were targeted. In tissue engineering, chitosan and gelatin are two of the most robust biopolymers, with wide applicability due to their intrinsic biocompatibility, biodegradability, low antigenicity, affordability, and ease of processing. HAp, owing to its exceptional activity in tuning cell-matrix interactions, is acknowledged for its capability of sustaining cellular proliferation by providing a bone-like native micro-environment for cell adjustment. Genipin is regarded as a top-class cross-linker, while graphene oxide (GO) is viewed as one of the most performant and versatile fillers. The composites, with the natural bone HAp/biopolymer ratio, were obtained by cascading sonochemical treatments, followed by uncomplicated casting methods and freeze-drying. Their structure was characterized by Fourier-transform infrared spectroscopy (FTIR) and X-ray diffraction (XRD), while the overall morphology was investigated by scanning electron microscopy (SEM) and micro-computed tomography (µ-CT). Subsequently, in vitro enzymatic degradation was performed to identify the most promising compositions for the development of in vivo assays. Suitable GO dispersion within the biopolymer mix was ascertained, as nanolayer-specific signals were lacking in both the FTIR and XRD spectra, and the specific spectral features of the polymers persisted with increasing GO load. Overall, the GO-induced structuration of the material, the variations in crystallinity, and the chemical interactions of the compounds can be correlated with the physical features and bioactivity of each composite formulation. Moreover, the HAp distribution within the scaffold follows an auspicious density gradient tuned for hybrid osseous/cartilage matter architectures, which was mirrored in the mice model tests.
Hence, the synthesis route of this natural polymer blend/hydroxyapatite-graphene oxide composite material is anticipated to emerge as an influential formulation in bone tissue engineering. Acknowledgement: This work was supported by the project 'Work-based learning systems using entrepreneurship grants for doctoral and post-doctoral students' (Sisteme de invatare bazate pe munca prin burse antreprenor pentru doctoranzi si postdoctoranzi) - SIMBA, SMIS code 124705, and by a grant of the National Authority for Scientific Research and Innovation, Operational Program Competitiveness Axis 1 - Section E, Program co-financed from the European Regional Development Fund 'Investments for your future' under project number 154/25.11.2016, P_37_221/2015. The nano-CT experiments were possible due to the European Regional Development Fund through the Competitiveness Operational Program 2014-2020, Priority axis 1, ID P_36_611, MySMIS code 107066, INOVABIOMED.
Keywords: biopolymer blend, ectopic osteoinduction, graphene oxide composite, hydroxyapatite
Procedia PDF Downloads 105
6628 High Resolution Satellite Imagery and Lidar Data for Object-Based Tree Species Classification in Quebec, Canada
Authors: Bilel Chalghaf, Mathieu Varin
Abstract:
Forest characterization in Quebec, Canada, is usually assessed by photo-interpretation at the stand level. For species identification, this often results in a lack of precision. Very high spatial resolution imagery, such as DigitalGlobe, and Light Detection and Ranging (LiDAR) have the potential to overcome the limitations of aerial imagery. To date, few studies have used such data to map a large number of species at the tree level using machine learning techniques. The main objective of this study is to map 11 tall tree species (> 17 m) at the tree level using an object-based approach in the broadleaf forest of Kenauk Nature, Quebec. For individual tree crown segmentation, three canopy-height models (CHMs) from LiDAR data were assessed: 1) the original, 2) a filtered, and 3) a corrected model. The corrected CHM gave the best accuracy and was then coupled with imagery to refine tree species crown identification. When compared with photo-interpretation, 90% of the objects represented a single species. For modeling, 313 variables were derived from 16-band WorldView-3 imagery and LiDAR data, using radiance, reflectance, pixel-based, and object-based calculation techniques. Variable selection procedures were employed to reduce their number from 313 to 16, using only 11 bands to aid reproducibility. For classification, a global approach using all 11 species was compared to a semi-hierarchical hybrid classification approach at two levels: (1) tree type (broadleaf/conifer) and (2) individual broadleaf (five) and conifer (six) species. Five different modeling techniques were used: (1) support vector machine (SVM), (2) classification and regression tree (CART), (3) random forest (RF), (4) k-nearest neighbors (k-NN), and (5) linear discriminant analysis (LDA). Each model was tuned separately for all approaches and levels. For the global approach, the best model was the SVM using eight variables (overall accuracy (OA): 80%, Kappa: 0.77).
With the semi-hierarchical hybrid approach, at the tree-type level, the best model was the k-NN using six variables (OA: 100%, Kappa: 1.00). At the level of identifying broadleaf and conifer species, the best model was the SVM, with OAs of 80% and 97% and Kappa values of 0.74 and 0.97, respectively, using seven variables for both models. This paper demonstrates that a hybrid classification approach gives better results and that using 16-band WorldView-3 with LiDAR data leads to more precise predictions for tree segmentation and classification, especially when the number of tree species is large.
Keywords: tree species, object-based, classification, multispectral, machine learning, WorldView-3, LiDAR
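The five-model comparison described above can be sketched with scikit-learn on synthetic data standing in for the 16 selected spectral/LiDAR variables and the 11 species; the dataset parameters below are assumptions for illustration, not the study's actual WorldView-3/LiDAR features:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in: 16 variables, 11 tree species (classes)
X, y = make_classification(n_samples=550, n_features=16, n_informative=10,
                           n_classes=11, n_clusters_per_class=1, random_state=42)

# The five model families compared in the study (default hyperparameters here;
# the study tuned each model separately for every approach and level)
models = {
    "SVM": SVC(),
    "CART": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "k-NN": KNeighborsClassifier(),
    "LDA": LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()  # 5-fold CV accuracy
    print(f"{name}: {acc:.2f}")
```

In practice, each model would be wrapped in a hyperparameter search, and the semi-hierarchical approach would chain a binary broadleaf/conifer classifier with two species-level classifiers.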
Procedia PDF Downloads 136
6627 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics
Authors: L. Freeborn
Abstract:
Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners' ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners' ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.
Keywords: neuroimaging studies, research design, second language acquisition, task validity
Procedia PDF Downloads 141
6626 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues.
However, predicting vulnerabilities in source code using machine learning poses challenges such as the high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
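As a toy illustration of deriving structural features from an AST (the study generates Code2Vec path-contexts from Java and C++ sources, which is considerably richer), the following sketch uses Python's own `ast` module to collect parent-child node-type pairs from a snippet with an OS-command-injection-style pattern:

```python
import ast

# Crude structural feature: (parent, child) AST node-type pairs.
# This is only a stand-in for proper Code2Vec path-contexts.
def ast_node_bigrams(source):
    """Return the (parent, child) node-type pairs of the parsed source."""
    tree = ast.parse(source)
    pairs = []
    for parent in ast.walk(tree):
        for child in ast.iter_child_nodes(parent):
            pairs.append((type(parent).__name__, type(child).__name__))
    return pairs

# Snippet exhibiting a command-injection-style pattern: user input
# concatenated into a shell command.
risky = "import os\ndef run(cmd):\n    os.system('ping ' + cmd)\n"
features = ast_node_bigrams(risky)
print(("BinOp", "Constant") in features)  # string concat feeding the call -> True
```

Pairs like these (or, more usefully, whole leaf-to-leaf paths) can be vectorized and fed to a downstream classifier, which is the role Code2Vec's learned embeddings play in the study.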
Procedia PDF Downloads 109
6625 The Environmental Impact of Geothermal Energy and Opportunities for Its Utilization in Hungary
Authors: András Medve, Katalin Szabad, István Patkó
Abstract:
According to the International Energy Agency, the previous principles of the energy sector should be reassessed, and renewable energy sources have a significant role in this. We might witness countries exchanging roles from importer to exporter as markets look for their main resources. According to the World Energy Outlook 2013, the current period of high oil prices is exceptionally long in the history of the energy market. Forecasts also point to expected large differences between the regional prices of gas and electric energy. The energy need of the world will grow by a third, two thirds of which will appear in China, India, and South-East Asia, while only 4 per cent will be related to OECD countries. Current trends also forecast growth in the price of energy sources and in the emission of greenhouse gases. In light of these forecasts, alternative energy sources will gain value, of which geothermal energy is one of the cheapest and most economical. Hungary possesses outstanding resources of geothermal energy. The aim of the study is to research the environmental effects of geothermal energy and the opportunities for its exploitation in Hungary, related to the 'Horizon 2020' project.
Keywords: sustainable energy, renewable energy, development of geothermal energy in Hungary
Procedia PDF Downloads 606
6624 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases
Authors: Mohammad A. Bani-Khaled
Abstract:
In this work, we use the discrete proper orthogonal decomposition (POD) transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical databases obtained from finite element simulations. The outcomes will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, both in large-scale structures (aeronautical structures) and in nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to involve computational mechanics to numerically simulate the dynamics. In using numerical computational techniques, it is not necessary to over-simplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time proper orthogonal decomposition transform is a powerful tool for processing databases of the dynamics, and it will be used to study the coupled dynamics of thin-walled basic structures. These structures are ideal to form a basis for a systematic study of coupled dynamics in structures of complex geometry.
Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams
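Snapshot-based POD of the kind described above reduces, in practice, to a singular value decomposition of a snapshot matrix whose columns are states sampled in time. A minimal sketch on a synthetic two-mode field (invented for illustration, not the paper's finite element data):

```python
import numpy as np

# Build a synthetic snapshot matrix: 64 spatial points x 200 time samples,
# composed of two coupled modes (a bending-like and a twisting-like shape).
t = np.linspace(0, 2 * np.pi, 200)
x = np.linspace(0, 1, 64)
snapshots = (np.outer(np.sin(np.pi * x), np.cos(3 * t))
             + 0.1 * np.outer(np.sin(2 * np.pi * x), np.sin(5 * t)))

# POD via SVD: columns of U are the spatial POD modes, rows of Vt the
# temporal coefficients, and s**2 measures the "energy" captured per mode.
U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = s**2 / np.sum(s**2)
print(np.sum(energy[:2]) > 0.999)  # two modes capture nearly all dynamics -> True
```

The relative energies of the POD modes, and the mixing of displacement fields within each spatial mode, are what reveal the strength of coupling between the motion fields in a real thin-walled-beam database.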
Procedia PDF Downloads 421