Search results for: hyperspectral image classification using tree search algorithm
5620 The Effects of Functionality Level on Gait in Subjects with Low Back Pain
Authors: Vedat Kurt, Tansel Koyunoglu, Gamze Kurt, Ozgen Aras
Abstract:
Low back pain is one of the most common health problems among the public. Common symptoms that can be associated with low back pain include pain, functional disability, and gait disturbances. The aim of the study was to investigate the differences between disability scores and gait parameters in subjects with low back pain. Sixty participants were included in our study (35 men, 25 women, mean age: 37.65±10.02 years). Demographic characteristics of participants were recorded. Pain (visual analog scale) and disability level (Oswestry Disability Index (ODI)) were evaluated. Gait parameters were measured with a Zebris-FDM-2 platform. An independent samples t-test was used to analyse the differences between subjects with under 40 points (n=31, mean age: 35.8±11.3) and above 40 points (n=29, mean age: 39.6±8.1) of ODI scores. The significance level in the statistical analysis was set at 0.05. There was no significant difference between the groups’ ages. A statistically significant difference was found in step width between subjects with under 40 points of ODI and above 40 points of ODI score (p < 0.05), but there were no significant differences in the other gait parameters (p > 0.05). The differences between gait parameters and pain scores were not statistically significant (p > 0.05). Researchers generally agree that individuals with LBP walk slower, take shorter steps, and have asymmetric step lengths when compared with their age-matched pain-free counterparts. Perceived general disability may also have a moderate correlation with walking performance. In the current study, the patients were classified as having minimal/moderate or severe disability level by using ODI scores. As a result, patients with LBP who have a higher disability level tend to increase their support surface. On the other hand, we did not find any relation between pain intensity and gait parameters. This may be caused by the classification system of pain scores.
Additional research is needed to investigate the effects of functionality level and pain intensity on gait in subjects with low back pain under different classification types.
Keywords: functionality, low back pain, gait, pain
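The between-group comparison described above can be sketched with a pooled-variance independent samples t-test. A minimal stdlib version follows; the step-width values are hypothetical illustrations, not the study's data.

```python
import math

def independent_t(a, b):
    """Pooled-variance independent-samples t statistic, as used to
    compare gait parameters between the two ODI groups."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    t = (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2   # t statistic and degrees of freedom

# Hypothetical step-width samples (cm) for the two disability groups
low_odi = [9.1, 8.7, 9.4, 8.9, 9.2]
high_odi = [10.2, 10.6, 9.9, 10.4, 10.1]
t, df = independent_t(low_odi, high_odi)
```

In practice one would compare |t| against the critical value for the computed degrees of freedom at the 0.05 level.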
Procedia PDF Downloads 288
5619 Experimenting the Influence of Input Modality on Involvement Load Hypothesis
Authors: Mohammad Hassanzadeh
Abstract:
As far as incidental vocabulary learning is concerned, the basic contention of the Involvement Load Hypothesis (ILH) is that retention of unfamiliar words is, generally, conditional upon the degree of involvement in processing them. This study examined input modality and incidental vocabulary uptake in a task-induced setting whereby three variously loaded task types (marginal glosses, fill-in-task, and sentence-writing) were alternately assigned to one group of students at Allameh Tabataba’i University (n=21) during six classroom sessions. While one round of exposure was comprised of the audiovisual medium (TV talk shows), the second round consisted of textual materials with approximately similar subject matter (reading texts). In both conditions, however, the tasks were equivalent to one another. Taken together, the study pursued the dual objectives of establishing a litmus test for the ILH and its proposed values of ‘need’, ‘search’ and ‘evaluation’ in the first place. Secondly, it sought to shed light on the question of the superiority of exposure to audiovisual input versus written input as far as the incorporation of tasks is concerned. At the end of each treatment session, a vocabulary active recall test was administered to measure incidental gains. Running a one-way analysis of variance revealed that the audiovisual intervention yielded higher gains than the written version even when differing tasks were included. Meanwhile, task three (sentence-writing) turned out to be the most efficient in tapping learners' active recall of the target vocabulary items. In addition to shedding light on the superiority of audiovisual input over written input when circumstances are relatively held constant, this study, for the most part, supported the underlying tenets of the ILH.
Keywords: evaluation, incidental vocabulary learning, input mode, involvement load hypothesis, need, search
Procedia PDF Downloads 282
5618 Quality of Service Based Routing Algorithm for Real Time Applications in MANETs Using Ant Colony and Fuzzy Logic
Authors: Farahnaz Karami
Abstract:
Routing is an important, challenging task in mobile ad hoc networks due to node mobility, lack of central control, unstable links, and limited resources. The ant colony approach has been found to be an attractive technique for routing in Mobile Ad Hoc Networks (MANETs). However, existing swarm intelligence based routing protocols find an optimal path by considering only one or two route selection metrics, without considering correlations among such parameters, making them unsuitable on their own for routing real time applications. Fuzzy logic can combine multiple route selection parameters containing uncertain information or imprecise data, but it does not naturally provide multipath routing for load balancing. The objective of this paper is to design a routing algorithm using fuzzy logic and ant colony optimization that can solve some of the routing problems in mobile ad hoc networks, such as optimizing node energy consumption to increase network lifetime, reducing the link failure rate to increase packet delivery reliability, and providing load balancing to optimize the available bandwidth. In the proposed algorithm, path information is given to the fuzzy inference system by the ants. Based on the available path information and considering the parameters required for quality of service (QoS), the fuzzy cost of each path is calculated and the optimal paths are selected. NS2.35 simulation tools are used for simulation, and the results are compared and evaluated against the newest QoS based algorithms in MANETs according to the packet delivery ratio, end-to-end delay, and routing overhead ratio criteria. The simulation results show significant improvement in the performance of these networks in terms of decreasing end-to-end delay and routing overhead ratio, as well as increasing packet delivery ratio.
Keywords: mobile ad hoc networks, routing, quality of service, ant colony, fuzzy logic
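The idea of fusing several path metrics into a single fuzzy cost can be sketched as follows. This is a simplified illustration, not the paper's inference system: the triangular membership functions, metric ranges, and aggregation weights are all assumptions.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_cost(delay_ms, residual_energy, bandwidth_kbps):
    # Degree to which each metric is "bad" (higher degree -> higher cost)
    high_delay = tri(delay_ms, 20, 100, 180)
    low_energy = tri(1.0 - residual_energy, 0.0, 0.5, 1.0)
    low_bw = tri(1.0 - bandwidth_kbps / 1000.0, 0.0, 0.5, 1.0)
    # Weighted aggregation of the rule outputs (weights are assumptions)
    return 0.4 * high_delay + 0.35 * low_energy + 0.25 * low_bw

# Two candidate paths reported back by the ants (hypothetical values)
path_a = fuzzy_cost(delay_ms=40, residual_energy=0.8, bandwidth_kbps=800)
path_b = fuzzy_cost(delay_ms=150, residual_energy=0.3, bandwidth_kbps=300)
best = "A" if path_a < path_b else "B"
```

Path A, with lower delay, higher residual energy, and more bandwidth, receives the lower fuzzy cost and would be preferred for forwarding.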
Procedia PDF Downloads 67
5617 A Monolithic Arbitrary Lagrangian-Eulerian Finite Element Strategy for Partly Submerged Solid in Incompressible Fluid with Mortar Method for Modeling the Contact Surface
Authors: Suman Dutta, Manish Agrawal, C. S. Jog
Abstract:
Accurate computation of hydrodynamic forces on floating structures and their deformation finds application in ocean and naval engineering and wave energy harvesting. This manuscript presents a monolithic finite element strategy for fluid-structure interaction involving hyper-elastic solids partly submerged in an incompressible fluid. A velocity-based Arbitrary Lagrangian-Eulerian (ALE) formulation has been used for the fluid and a displacement-based Lagrangian approach has been used for the solid. The flexibility of the ALE technique permits us to treat the free surface of the fluid as a Lagrangian entity. At the interface, the continuity of displacement, velocity and traction is enforced using the mortar method. In the mortar method, the constraints are enforced in a weak sense using the Lagrange multiplier method. In the literature, the mortar method has been shown to be robust in solving various contact mechanics problems. The time-stepping strategy used in this work reduces to the generalized trapezoidal rule in the Eulerian setting. In the Lagrangian limit, in the absence of external load, the algorithm conserves the linear and angular momentum and the total energy of the system. The use of monolithic coupling with an energy-conserving time-stepping strategy gives an unconditionally stable algorithm and allows the user to take large time steps. All the governing equations and boundary conditions have been mapped to the reference configuration. The use of the exact tangent stiffness matrix ensures that the algorithm converges quadratically within each time step. The robustness and good performance of the proposed method are demonstrated by solving benchmark problems from the literature.
Keywords: ALE, floating body, fluid-structure interaction, monolithic, mortar method
Procedia PDF Downloads 277
5616 Review of Numerical Models for Granular Beds in Solar Rotary Kilns for Thermal Applications
Authors: Edgar Willy Rimarachin Valderrama, Eduardo Rojas Parra
Abstract:
Thermal energy from solar radiation is widely present in power plants, food drying, chemical reactors, heating and cooling systems, water treatment processes, hydrogen production, and others. In the case of power plants, one of the technologies available to transform solar energy into thermal energy is the solar rotary kiln, in which a bed of granular matter is heated through concentrated radiation obtained from an arrangement of heliostats. Numerical modeling is a useful approach to study the behavior of granular beds in solar rotary kilns. This technique, once validated with small-scale experiments, can be used to simulate large-scale processes for industrial applications. This study gives a comprehensive classification of numerical models used to simulate the movement and heat transfer of beds of granular media within solar rotary furnaces. In general, there exist three categories of models: 1) continuum, 2) discrete, and 3) multiphysics modeling. Continuum modeling covers zero-dimensional, one-dimensional and fluid-like models. On the other hand, discrete element models compute the movement of each particle of the bed individually. In this kind of modeling, the heat transfer acts during contacts, which can occur by solid-solid and solid-gas-solid conduction. Finally, the multiphysics approach uses discrete elements to simulate the grains and a continuum model to simulate the fluid around the particles. This classification makes it possible to compare the advantages and disadvantages of each kind of model in terms of accuracy, computational cost and implementation.
Keywords: granular beds, numerical models, rotary kilns, solar thermal applications
Procedia PDF Downloads 50
5615 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis
Authors: Mehrnaz Mostafavi
Abstract:
The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. 
The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans
Procedia PDF Downloads 108
5614 Unraveling the Evolution of Mycoplasma Hominis Through Its Genome Sequence
Authors: Boutheina Ben Abdelmoumen Mardassi, Salim Chibani, Safa Boujemaa, Amaury Vaysse, Julien Guglielmini, Elhem Yacoub
Abstract:
Background and aim: Mycoplasma hominis (MH) is a pathogenic bacterium belonging to the Mollicutes class. It causes a wide range of gynecological infections and infertility among adults. Recently, we have explored for the first time the phylodistribution of Tunisian M. hominis clinical strains using an expanded MLST. We have demonstrated their distinction into two pure lineages, each corresponding to a specific pathotype: genital infections and infertility. The aim of this project is to gain further insight into the evolutionary dynamics and the specific genetic factors that distinguish MH pathotypes. Methods: Whole genome sequencing of Mycoplasma hominis clinical strains was performed using Illumina MiSeq. De novo assembly was performed using a publicly available in-house pipeline. We used Prokka to annotate the genomes, Panaroo to generate the gene presence matrix, and JolyTree to establish the phylogenetic tree. We used treeWAS to identify genetic loci associated with the pathotype of interest from the presence matrix and phylogenetic tree. Results: Our results revealed a clear categorization of the 62 MH clinical strains into two distinct genetic lineages, each corresponding to a specific pathotype: gynecological infections and infertility. Genome annotation showed that the GC content ranges between 26 and 27%, which is a known characteristic of the Mycoplasma genome. Housekeeping genes belonging to the core genome are highly conserved among our strains. treeWAS identified 4 virulence genes associated with the gynecological infection pathotype, encoding asparagine--tRNA ligase, restriction endonuclease subunit S, Eco47II restriction endonuclease, and the transcription regulator XRE (involved in tolerance to oxidative stress).
Five genes were identified with a statistical association with infertility: two lipoproteins, one hypothetical protein, a glycosyl transferase involved in capsule synthesis, and pyruvate kinase, which is involved in biofilm formation. All strains harbored an efflux pump belonging to the family of multidrug resistance ABC transporters, which confers resistance to a wide range of antibiotics. Indeed, many adhesion factors and lipoproteins (p120, p120', p60, p80, Vaa) have been checked and confirmed in our strains, with 96% to 99% conserved domains and hypervariable domains that represent 1 to 4% of the reference sequence extracted from GenBank. Conclusion: In summary, this study led to the identification of specific genetic loci associated with distinct pathotypes in M. hominis.
Keywords: mycoplasma hominis, infertility, gynecological infections, virulence genes, antibiotic resistance
Procedia PDF Downloads 105
5613 Stabilization of Lateritic Soil Sample from Ijoko with Cement Kiln Dust and Lime
Authors: Akinbuluma Ayodeji Theophilus, Adewale Olutaiwo
Abstract:
When building roads and paved surfaces, a strong foundation is always essential. The foundation must be built from a durable material that can withstand years of traffic while remaining trustworthy. A frequent problem in the construction of roads and pavements is the lack of high-quality, long-lasting materials for the pavement structure (base, subbase, and subgrade). Hence, this study examined the stabilization of lateritic soil samples from Ijoko with cement kiln dust and lime. The study adopted an experimental design. Laboratory tests on classification, swelling potential, compaction, California bearing ratio (CBR), and unconfined compressive strength, among others, were conducted on the laterite sample treated with cement kiln dust (CKD) and lime in increments of 2% up to 10% of the dry weight of the soil sample. The results of the tests showed that the studied soil could be classified as an A-7-6 and CL soil using the American Association of State Highway and Transportation Officials (AASHTO) system and the Unified Soil Classification System (USCS), respectively. The plasticity index (PI) of the studied soil reduced from 30.5% to 29.9% on the application of CKD. The maximum dry density on the application of CKD reduced from 1.97 mg/m3 to 1.86 mg/m3, and lime application yielded a reduction from 1.97 mg/m3 to 1.88 mg/m3. The swell potential on CKD application was reduced from 0.05% to 0.039%. The study concluded that soil stabilization is an effective and economical way of improving road pavement for engineering benefit. The degree of effectiveness of stabilization in pavement construction was found to depend on the type of soil to be stabilized. The study therefore recommended that stabilized soil mixtures be used as subbase material for flexible pavement, since they are suitable.
Keywords: lateritic soils, sand, cement, stabilization, road pavement
Procedia PDF Downloads 94
5612 Competitors’ Influence Analysis of a Retailer by Using Customer Value and Huff’s Gravity Model
Authors: Yepeng Cheng, Yasuhiko Morimoto
Abstract:
Customer relationship analysis is vital for retail stores, especially for supermarkets. Point of sale (POS) systems make it possible to record the daily purchasing behaviors of customers in an identification point of sale (ID-POS) database, which can be used to analyze the customer behaviors of a supermarket. The customer value is an indicator, based on the ID-POS database, for detecting the customer loyalty of a store. In general, there are many supermarkets in a city, and nearby competitor supermarkets significantly affect the customer value of a supermarket's customers. However, it is impossible to obtain detailed ID-POS databases of competitor supermarkets. This study first focused on the customer value and the distance between a customer's home and the supermarkets in a city, and then constructed models based on logistic regression analysis to analyze correlations between distance and purchasing behaviors using only the POS database of a supermarket chain. During the modeling process, three primary problems existed: the incomparability of customer values, multicollinearity among the customer value and distance data, and the number of valid partial regression coefficients. The improved customer value, Huff’s gravity model, and inverse attractiveness frequency are considered to solve these problems. This paper presents three types of models based on these three methods for loyal customer classification and competitors’ influence analysis. In the numerical experiments, all types of models were useful for loyal customer classification. The model incorporating all three methods was the best for evaluating the influence of the other nearby supermarkets on customers' purchasing from a supermarket chain, from the viewpoint of valid partial regression coefficients and accuracy.
Keywords: customer value, Huff's gravity model, POS, retailer
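Huff's gravity model, mentioned above, estimates the probability that a customer patronises a given store as the store's attractiveness divided by distance raised to a decay exponent, normalised over all competing stores. A minimal sketch follows; the distances, floor areas, and the decay exponent of 2 are hypothetical.

```python
def huff_probabilities(distances, floor_areas, lam=2.0):
    """Huff's gravity model: patronage probability for store j is
    proportional to attractiveness (here floor area) divided by
    distance to the customer raised to the decay exponent lam."""
    weights = [s / (d ** lam) for s, d in zip(floor_areas, distances)]
    total = sum(weights)
    return [w / total for w in weights]

# One customer, three competing supermarkets (hypothetical data):
# distances in km from the customer's home, floor areas in m^2
probs = huff_probabilities(distances=[1.0, 2.0, 4.0],
                           floor_areas=[2000, 4000, 8000])
```

Here the nearest store wins despite being the smallest, because distance enters squared while attractiveness enters linearly.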
Procedia PDF Downloads 127
5611 A Fuzzy Multiobjective Model for Bed Allocation Optimized by Artificial Bee Colony Algorithm
Authors: Jalal Abdulkareem Sultan, Abdulhakeem Luqman Hasan
Abstract:
With the development of competition between health care systems, hospitals face more and more pressure. Meanwhile, resource allocation has a vital effect on achieving competitive advantages in hospitals. Selecting the appropriate number of beds is one of the most important tasks in hospital management. However, in real situations, bed allocation is a multiple objective problem involving different items with vagueness and randomness in the data, and it is very complex. Hence, research on the bed allocation problem that considers multiple departments, nursing hours, and stochastic information about the arrival and service of patients is relatively scarce. In this paper, we develop a fuzzy multiobjective bed allocation model for overcoming uncertainty and handling multiple departments. Fuzzy objectives and weights are simultaneously applied to help managers select the suitable numbers of beds for the different departments. The proposed model is solved by using the Artificial Bee Colony (ABC) algorithm, which is a very effective algorithm. The paper describes an application of the model to a public hospital in Iraq. The results showed that the fuzzy multiobjective model provides a suitable framework for bed allocation and optimal use.
Keywords: bed allocation problem, fuzzy logic, artificial bee colony, multi-objective optimization
Procedia PDF Downloads 329
5610 Optical-Based Lane-Assist System for Rowing Boats
Authors: Stephen Tullis, M. David DiDonato, Hong Sung Park
Abstract:
Rowing boats (shells) are often steered by a small rudder operated by one of the backward-facing rowers; the attention required of that athlete slightly decreases the power the athlete can provide. Reducing the steering distraction would therefore increase the overall boat speed. Races are straight 2000 m courses, with each boat in a 13.5 m wide lane marked by small (~15 cm), widely-spaced (~10 m) buoys, and the boat trajectory is affected by both cross-currents and winds. An optical buoy recognition and tracking system has been developed that provides the boat’s location and orientation with respect to the lane edges. This information is provided to the steering athlete either as a simple overlay on a video display, or fed to a simplified autopilot system that gives steering directions to the athlete or directly controls the rudder. The system is then effectively a “lane-assist” device, but with small, widely-spaced lane markers viewed from a very shallow angle due to constraints on camera height. The image is captured with a lightweight 1080p webcam, and most of the image analysis is done in OpenCV. The colour RGB image is converted to grayscale using the difference of the red and blue channels, which provides good contrast between the red/yellow buoys and the water, sky and land background, white reflections, and noise. Buoy detection is done with thresholding within a tight mask applied to the image. Robust linear regression, using Tukey’s biweight estimator on the previously detected buoy locations, is used to develop the mask; this avoids the false detection of noise such as waves (reflections) and, in particular, buoys in other lanes. The robust regression also provides the current lane edges in the camera frame, which are used to calculate the displacement of the boat from the lane centre (lane location) and its yaw angle.
The intersection of the detected lane edges provides a lane vanishing point, and the yaw angle can be calculated simply from the displacement of this vanishing point from the camera axis and the image plane distance. The lane location is simply based on the lateral displacement of the vanishing point from any horizontal cut through the lane edges. The boat lane position and yaw are currently fed to what is essentially a stripped-down marine autopilot system. Currently, only the lane location is used, in a PID controller of a rudder actuator with integrator anti-windup to deal with saturation of the rudder angle. Low Kp and Kd values avoid an unnecessarily fast return to the lane centreline and reduce the response to noise, and limiters can be used to avoid lane departure and disqualification. Yaw is not used as a control input, as cross-winds and currents can cause a straight course with considerable yaw or crab angle. Mapping of the controller with rudder angle “overall effectiveness” has not been finalized: very large rudder angles stall and have decreased turning moments, but at less extreme angles the increased rudder drag slows the boat and upsets the boat's balance. The full system has many features similar to automotive lane-assist systems, but with the added constraints of the lane markers, camera positioning, control response and noise increasing the challenge.
Keywords: auto-pilot, lane-assist, marine, optical, rowing
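The saturated PID loop on lane displacement can be sketched as follows. The gains, rudder limit, and the first-order toy boat dynamics are illustrative assumptions, not the tuned values of the system; the anti-windup scheme shown (freeze the integrator while the output is saturated) is one common choice.

```python
class RudderPID:
    """PID on lateral lane displacement, with integrator anti-windup:
    the integral term is frozen while the rudder output is saturated."""
    def __init__(self, kp, ki, kd, limit):
        self.kp, self.ki, self.kd, self.limit = kp, ki, kd, limit
        self.integral = 0.0
        self.prev_err = None

    def update(self, error, dt):
        deriv = 0.0 if self.prev_err is None else (error - self.prev_err) / dt
        self.prev_err = error
        u = self.kp * error + self.ki * self.integral + self.kd * deriv
        if -self.limit < u < self.limit:
            self.integral += error * dt   # integrate only when unsaturated
        return max(-self.limit, min(self.limit, u))   # rudder angle limiter

# Toy boat model: rudder angle drives lateral acceleration (the gains and
# the rudder-authority coefficient 0.5 are assumptions for illustration)
pid = RudderPID(kp=0.8, ki=0.05, kd=1.6, limit=0.5)
y, v = 2.0, 0.0                  # boat starts 2 m off the lane centreline
for _ in range(400):             # 40 s of simulated time at dt = 0.1 s
    rudder = pid.update(-y, dt=0.1)      # error = centreline (0) - position
    v += 0.5 * rudder * 0.1
    y += v * 0.1
```

With the rudder saturated early on, the frozen integrator prevents the large windup overshoot a naive PID would produce.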
Procedia PDF Downloads 135
5609 Optimisation of Intermodal Transport Chain of Supermarkets on Isle of Wight, UK
Authors: Jingya Liu, Yue Wu, Jiabin Luo
Abstract:
This work investigates an intermodal transportation system for delivering goods from a Regional Distribution Centre to supermarkets on the Isle of Wight (IOW) via the port of Southampton or Portsmouth in the UK. We consider this integrated logistics chain as a 3-echelon transportation system. In such a system, there are two types of transport methods used to deliver goods across the Solent Channel: one is accompanied transport, which is used by most supermarkets on the IOW, such as Spar, Lidl and Co-operative Food; the other is unaccompanied transport, which is used by Aldi. Five transport scenarios are studied based on different transport modes and ferry routes. The aim is to determine an optimal delivery plan for supermarkets of different business scales on the IOW, in order to minimise the total running cost, fuel consumption and carbon emissions. The problem is modelled as a vehicle routing problem with time windows and solved by a genetic algorithm. The computational results suggested that accompanied transport is more cost efficient for small and medium business-scale supermarket chains on the IOW, while unaccompanied transport has the potential to improve the efficiency and effectiveness of large business-scale supermarket chains.
Keywords: genetic algorithm, intermodal transport system, Isle of Wight, optimization, supermarket
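The genetic-algorithm machinery used for the routing step can be sketched on a stripped-down version of the problem: a single vehicle visiting a handful of supermarkets from a depot, minimising travel distance. The coordinates, population sizes, and rates are illustrative assumptions, and the time-window feasibility checks of the full model are omitted for brevity.

```python
import random

random.seed(7)

# Hypothetical supermarket drop-off points (x, y in km); index 0 is the depot
POINTS = [(0, 0), (3, 9), (7, 8), (6, 2), (2, 5), (9, 4), (4, 7)]

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def route_cost(order):
    # Depot -> each supermarket in sequence -> back to depot
    tour = [0] + list(order) + [0]
    return sum(dist(POINTS[i], POINTS[j]) for i, j in zip(tour, tour[1:]))

def order_crossover(p1, p2):
    # OX: copy a slice from parent 1, fill the rest in parent 2's order
    i, j = sorted(random.sample(range(len(p1)), 2))
    hole = set(p1[i:j])
    rest = [g for g in p2 if g not in hole]
    return rest[:i] + p1[i:j] + rest[i:]

customers = list(range(1, len(POINTS)))
pop = [random.sample(customers, len(customers)) for _ in range(30)]
for _ in range(150):
    pop.sort(key=route_cost)
    elite = pop[:10]                 # truncation selection
    children = []
    while len(children) < 20:
        child = order_crossover(*random.sample(elite, 2))
        if random.random() < 0.3:    # swap mutation
            i, j = random.sample(range(len(child)), 2)
            child[i], child[j] = child[j], child[i]
        children.append(child)
    pop = elite + children
best = min(pop, key=route_cost)
```

The full model extends this chromosome with ferry-route and transport-mode choices and penalises time-window violations in the fitness function.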
Procedia PDF Downloads 374
5608 Acculturation Impact on Mental Health Among Arab Americans
Authors: Sally Kafelghazal
Abstract:
Introduction: Arab Americans, who include immigrants, refugees, or U.S. born persons of Middle Eastern or North African descent, may experience significant difficulties during acculturation to Western society. Influential stressors include relocation, loss of social support, language barriers, and economic factors, all of which can impact mental health. There is limited research investigating the effects of acculturation on the mental health of the Arab American population. Objectives: The purpose of this study is to identify ways in which acculturation impacts the mental health of Arab Americans, specifically the development of depression and anxiety. Method: A literature search was conducted using PubMed and PsycArticles (ProQuest), utilizing the following search terms: “Arab Americans,” “Arabs,” “mental health,” “depression,” “anxiety,” “acculturation.” Thirty-nine articles were identified, and of those, nine specifically investigated the relationship between acculturation and mental health in Arab Americans. Three of the nine focused exclusively on depression. Results: Several risk factors were identified that contribute to poor mental health associated with acculturation, which include immigrant or refugee status, facing discrimination, and religious ideology. Protective factors include greater levels of acculturation, being U.S. born, and greater heritage identity. Higher rates of mental health disorders, perhaps particularly depression, were identified in Arab Americans compared to normative samples; none of the articles specifically addressed anxiety. Conclusion: The current research findings support the potential association between the process of acculturation and greater levels of mental health disorders in Arab Americans. However, the diversity of the Arab American population makes it difficult to draw consistent conclusions.
Further research needs to be conducted to assess which subgroups of the Arab American population are at the highest risk of developing new or exacerbating existing mental health disorders, in order to devise more effective interventions.
Keywords: arab americans, arabs, mental health, anxiety, depression, acculturation
Procedia PDF Downloads 83
5607 Introduction of Integrated Image Deep Learning Solution and How It Brought Laboratorial Level Heart Rate and Blood Oxygen Results to Everyone
Authors: Zhuang Hou, Xiaolei Cao
Abstract:
The general public and medical professionals recognized the importance of accurately measuring and storing blood oxygen levels and heart rate during the COVID-19 pandemic. The demand for accurate contactless devices was motivated by the need for cross-infection reduction and the shortage of conventional oximeters, partially due to the global supply chain issue. This paper evaluated the heart rate (HR) and oxygen saturation (SpO2) measurements of HealthyPai, a contactless mini program, compared with other wearable devices. In the HR study of 185 samples (81 in the laboratory environment, 104 in the real-life environment), the mean absolute error (MAE) ± standard deviation was 1.4827 ± 1.7452 in the lab and 6.9231 ± 5.6426 in the real-life setting. In the SpO2 study of 24 samples, the MAE ± standard deviation of the measurement was 1.0375 ± 0.7745. Our results validated that HealthyPai, utilizing the Integrated Image Deep Learning Solution (IIDLS) framework, can accurately measure HR and SpO2, providing test quality at least comparable to other FDA-approved wearable devices in the market and surpassing the consumer-grade and research-grade wearable standards.
Keywords: remote photoplethysmography, heart rate, oxygen saturation, contactless measurement, mini program
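The MAE ± standard deviation figures above summarise paired readings against a reference device; a minimal computation looks like this. The paired SpO2 readings below are hypothetical, not the study's data.

```python
import math

def mae_and_std(reference, measured):
    """Mean absolute error and the sample standard deviation of the
    absolute errors, as reported for the HR and SpO2 comparisons."""
    errors = [abs(r - m) for r, m in zip(reference, measured)]
    n = len(errors)
    mae = sum(errors) / n
    std = math.sqrt(sum((e - mae) ** 2 for e in errors) / (n - 1))
    return mae, std

# Hypothetical paired readings: pulse oximeter (reference) vs. the app
ref = [97, 98, 96, 99, 95, 97]
app = [96, 98, 97, 97, 95, 98]
mae, std = mae_and_std(ref, app)
```

The result would be reported in the paper's format as MAE ± standard deviation.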
Procedia PDF Downloads 139
5606 Feasibility Study of Particle Image Velocimetry in the Muzzle Flow Fields during the Intermediate Ballistic Phase
Authors: Moumen Abdelhafidh, Stribu Bogdan, Laboureur Delphine, Gallant Johan, Hendrick Patrick
Abstract:
This study is part of an ongoing effort to improve the understanding of phenomena occurring during the intermediate ballistic phase, such as muzzle flows. A thorough comprehension of muzzle flow fields is essential for optimizing muzzle device and projectile design. This flow characterization has heretofore been almost entirely limited to local and intrusive measurement techniques, such as pressure measurements using pencil probes. Consequently, the body of quantitative experimental data is limited, and so is the number of numerical codes validated in this field. The objective of the work presented here is to demonstrate the applicability of the Particle Image Velocimetry (PIV) technique in the challenging environment of the propellant flow of a .300 Blackout weapon, to provide accurate velocity measurements. The key points of a successful PIV measurement are the selection of the particle tracers, their seeding technique, and their tracking characteristics. We have experimentally investigated the aforementioned points by evaluating the resistance, gas dispersion, and laser light reflection of five different solid tracers using two seeding methods, as well as their response to a step change across the Mach disk. To this end, an experimental setup was built, consisting of a PIV system, a combustion chamber pressure measurement, classical high-speed schlieren visualization, and an aerosol spectrometer. The latter is used to determine the particle size distribution in the muzzle flow. The experimental results demonstrated the ability of PIV to accurately resolve the salient features of the propellant flow, such as the underexpanded jet and vortex rings, as well as the instantaneous velocity field, with maximum centreline velocities of more than 1000 m/s. Besides, naturally present unburned particles in the gas, and solid ZrO₂ particles with a nominal size of 100 nm when coated on the propellant powder, are suitable as tracers.
However, the TiO₂ particles intended to act as a tracer surprisingly not only melted but also acted as a combustion accelerator, decreasing the number of particles in the propellant gas.
Keywords: intermediate ballistic, muzzle flow fields, particle image velocimetry, propellant gas, particle size distribution, underexpanded jet, solid particle tracers
Procedia PDF Downloads 166
5605 2D Hexagonal Cellular Automata: The Complexity of Forms
Authors: Vural Erdogan
Abstract:
We created two-dimensional hexagonal cellular automata to obtain complexity using simple rules, like those of Conway's Game of Life. Considering the Game of Life rules, Wolfram's work on life-like structures, and John von Neumann's self-replication, self-maintenance, and self-reproduction problems, we developed two-state and three-state hexagonal growing algorithms that reach large populations from random initial states. Unlike the Game of Life, we used a six-neighbourhood cellular automaton instead of an eight- or four-neighbourhood one. First simulations showed whether we were able to obtain oscillators, blinkers, and gliders. Inspired by the complexity of Wolfram's 1D cellular automata and life-like structures, we simulated 2D synchronous, discrete, deterministic cellular automata to reach life-like forms with two-state cells. We explain how the life-like formations and the oscillators contribute to initiating self-maintenance together with self-reproduction and self-replication. After comparing simulation results, we decided to develop the algorithm a step further. Appending a new state to the same algorithm that we used to reach life-like structures let us experiment with new branching and fractal forms. All these studies attempt to demonstrate that complex life forms might come from uncomplicated rules.
Keywords: hexagonal cellular automata, self-replication, self-reproduction, self-maintenance
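The update on a hexagonal lattice can be sketched with axial coordinates, where every cell has exactly six neighbours. The birth/survival thresholds below are hypothetical illustrations chosen for this sketch, since the abstract does not state the authors' actual rules:

```python
# Minimal 2-state hexagonal cellular automaton on axial (q, r) coordinates.
# Birth/survival thresholds are illustrative, not the authors' actual rules.

AXIAL_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def step(alive, birth={2}, survive={3, 4}):
    """One synchronous update of the set of live cells."""
    counts = {}
    for (q, r) in alive:
        for dq, dr in AXIAL_NEIGHBOURS:
            key = (q + dq, r + dr)
            counts[key] = counts.get(key, 0) + 1
    new_alive = set()
    for cell, n in counts.items():
        if cell in alive and n in survive:
            new_alive.add(cell)
        elif cell not in alive and n in birth:
            new_alive.add(cell)
    return new_alive

# A small seed evolving for a few generations; under these thresholds this
# particular triangle seed turns out to oscillate with period 2.
state = {(0, 0), (1, 0), (0, 1)}
for _ in range(3):
    state = step(state)
```

Representing the live cells as a sparse set keeps the grid unbounded, which suits the "growing algorithms that reach large populations" described above.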
Procedia PDF Downloads 156
5604 Efficacy of Celecoxib Adjunct Treatment on Bipolar Disorder: Systematic Review and Meta-Analysis
Authors: Daniela V. Bavaresco, Tamy Colonetti, Antonio Jose Grande, Francesc Colom, Joao Quevedo, Samira S. Valvassori, Maria Ines da Rosa
Abstract:
Objective: We performed a systematic review and meta-analysis to evaluate the potential effect of adjunct treatment with the cyclo-oxygenase (Cox)-2 inhibitor Celecoxib in Bipolar Disorder (BD), based on randomized controlled trials. Method: A search of the electronic databases MEDLINE, EMBASE, Scopus, Cochrane Central Register of Controlled Trials (CENTRAL), BioMed Central, Web of Science, IBECS, LILACS, PsycINFO (American Psychological Association), congress abstracts, and grey literature (Google Scholar and the British Library) was performed for studies published from January 1990 to February 2018. A search strategy was developed using the terms 'Bipolar disorder' or 'Bipolar mania' or 'Bipolar depression' or 'Bipolar mixed' or 'Bipolar euthymic' and 'Celecoxib' or 'Cyclooxygenase-2 inhibitors' or 'Cox-2 inhibitors' as text words and Medical Subject Headings (i.e., MeSH and EMTREE). The therapeutic effects of adjunctive treatment with Celecoxib were analyzed; it was possible to carry out a meta-analysis of three studies included in the systematic review. The meta-analysis was performed on the final scores of the Young Mania Rating Scale (YMRS) at the end of the randomized controlled trials (RCTs). Results: Three primary studies were included in the systematic review, with a total of 121 patients. The meta-analysis showed a significant effect on the YMRS scores of patients with BD who received Celecoxib adjuvant treatment in comparison to placebo. The weighted mean difference was 5.54 (95% CI = 3.26-7.82; p < 0.001; I² = 0%). Conclusion: The systematic review suggests that adjuvant treatment with Celecoxib improves the response to the main treatments in patients with BD when compared with adjuvant placebo treatment.
Keywords: bipolar disorder, Cox-2 inhibitors, Celecoxib, systematic review, meta-analysis
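The pooled weighted mean difference reported above comes from a standard inverse-variance fixed-effect model (consistent with I² = 0%). The sketch below shows that computation on made-up study means, SDs, and sample sizes, not the data of the three included trials:

```python
import math

def fixed_effect_wmd(studies):
    """Inverse-variance fixed-effect pooling of mean differences.

    Each study is (mean_treat, sd_treat, n_treat, mean_ctrl, sd_ctrl, n_ctrl).
    Returns (pooled_md, ci_low, ci_high) with a 95% confidence interval.
    """
    weights, mds = [], []
    for mt, st, nt, mc, sc, nc in studies:
        md = mt - mc
        var = st ** 2 / nt + sc ** 2 / nc   # variance of the mean difference
        weights.append(1.0 / var)
        mds.append(md)
    pooled = sum(w * d for w, d in zip(weights, mds)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical YMRS endpoint data (placebo minus celecoxib), three studies.
studies = [
    (12.0, 5.0, 20, 6.0, 4.0, 20),
    (10.0, 6.0, 25, 5.5, 5.0, 24),
    (11.0, 5.5, 16, 5.0, 4.5, 16),
]
pooled, lo, hi = fixed_effect_wmd(studies)
```

Each study is weighted by the inverse of its sampling variance, so larger, more precise trials dominate the pooled estimate.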
Procedia PDF Downloads 493
5603 Multivariate Data Analysis for Automatic Atrial Fibrillation Detection
Authors: Zouhair Haddi, Stephane Delliaux, Jean-Francois Pons, Ismail Kechaf, Jean-Claude De Haro, Mustapha Ouladsine
Abstract:
Atrial fibrillation (AF) is considered the most common cardiac arrhythmia and a major public health burden associated with significant morbidity and mortality. Nowadays, telemedical approaches targeting cardiac outpatients place AF among the most challenging medical issues. Automatic, early, and fast AF detection is still a major concern for healthcare professionals. Several algorithms based on univariate analysis have been developed to detect atrial fibrillation. However, the published results do not show satisfactory classification accuracy. This work aimed to resolve this shortcoming by proposing multivariate data analysis methods for automatic AF detection. Four publicly accessible sets of clinical data (AF Termination Challenge Database, MIT-BIH AF, Normal Sinus Rhythm RR Interval Database, and MIT-BIH Normal Sinus Rhythm Database) were used for assessment. All time series were segmented into 1-min RR interval windows, and four specific features were then calculated. Two pattern recognition methods, i.e., Principal Component Analysis (PCA) and a Learning Vector Quantization (LVQ) neural network, were used to develop classification models. PCA, as a feature reduction method, was employed to find the important features for discriminating between AF and Normal Sinus Rhythm. Despite its very simple structure, the results show that the LVQ model performs better on the analyzed databases than do existing algorithms, with high sensitivity and specificity (99.19% and 99.39%, respectively). The proposed AF detector holds several interesting properties and can be implemented with just a few arithmetical operations, which makes it a suitable choice for telecare applications.
Keywords: atrial fibrillation, multivariate data analysis, automatic detection, telemedicine
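The abstract does not name the four features, but typical RR-interval descriptors used in AF screening include mean RR, SDNN, RMSSD, and pNN50. A sketch of windowing a series into 1-min segments and computing such features (our choice of features, not necessarily the authors') might look like:

```python
import math

def rr_features(rr_ms):
    """Four common heart-rate-variability features from one RR window (ms)."""
    n = len(rr_ms)
    mean_rr = sum(rr_ms) / n
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / n)
    diffs = [rr_ms[i + 1] - rr_ms[i] for i in range(n - 1)]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    pnn50 = sum(1 for d in diffs if abs(d) > 50) / len(diffs)
    return mean_rr, sdnn, rmssd, pnn50

def one_minute_windows(rr_ms):
    """Split an RR series into consecutive windows of ~60 s total duration."""
    windows, current, elapsed = [], [], 0.0
    for rr in rr_ms:
        current.append(rr)
        elapsed += rr
        if elapsed >= 60_000:          # 60 s in milliseconds
            windows.append(current)
            current, elapsed = [], 0.0
    return windows

# Toy series: 120 beats alternating 790/810 ms (regular rhythm, low pNN50).
series = [800 + (10 if i % 2 else -10) for i in range(120)]
feats = [rr_features(w) for w in one_minute_windows(series)]
```

Feature vectors like these, one per window, would then be fed to PCA and the LVQ classifier.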
Procedia PDF Downloads 271
5602 Cost Sensitive Feature Selection in Decision-Theoretic Rough Set Models for Customer Churn Prediction: The Case of Telecommunication Sector Customers
Authors: Emel Kızılkaya Aydogan, Mihrimah Ozmen, Yılmaz Delice
Abstract:
The telecommunications sector is undergoing continual change and development in the global market. In this sector, churn analysis techniques are commonly used for analysing why some customers terminate their service subscriptions prematurely. Customer churn is of the utmost significance in this sector since it causes substantial business loss. Many companies conduct various studies in order to prevent losses while increasing customer loyalty. Although a large quantity of accumulated data is available in this sector, its usefulness is limited by data quality and relevance. In this paper, a cost-sensitive feature selection framework is developed that aims to obtain the feature reducts needed to predict customer churn. The framework is a cost-based optional pre-processing stage that removes redundant features for churn management. In addition, this cost-based feature selection algorithm is applied in a telecommunication company in Turkey, and the results obtained with the algorithm are presented.
Keywords: churn prediction, data mining, decision-theoretic rough set, feature selection
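The decision-theoretic rough-set reduct computation itself is involved, but the cost-sensitive selection idea can be sketched far more simply: trade each feature's predictive value against its acquisition cost. The greedy heuristic and all scores below are illustrative stand-ins, not the paper's actual algorithm or data:

```python
def cost_sensitive_select(features, budget):
    """Greedy cost-sensitive feature selection.

    `features` maps name -> (relevance, cost).  Features are picked in
    decreasing relevance-per-cost order until the budget is exhausted.
    This is a simplification of the decision-theoretic rough-set reduct
    computation, for illustration only.
    """
    ranked = sorted(features.items(),
                    key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    chosen, spent = [], 0.0
    for name, (rel, cost) in ranked:
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

# Hypothetical churn predictors with (relevance, acquisition cost).
candidates = {
    "monthly_minutes": (0.80, 1.0),
    "contract_age":    (0.60, 0.5),
    "complaint_count": (0.70, 2.0),
    "tariff_plan":     (0.30, 0.2),
}
reduct = cost_sensitive_select(candidates, budget=2.0)
```

Note how the expensive `complaint_count` feature is excluded even though its raw relevance is high, which is exactly the cost-sensitivity the framework aims for.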
Procedia PDF Downloads 449
5601 Application of Random Forest Model in The Prediction of River Water Quality
Authors: Turuganti Venkateswarlu, Jagadeesh Anmala
Abstract:
Excessive runoff from various non-point-source land uses and other point sources is rapidly contaminating the water quality of streams in the Upper Green River watershed, Kentucky, USA. It is essential to maintain the stream water quality, as the river basin is one of the major freshwater sources in this region. It is also important to understand the water quality parameters (WQPs) quantitatively and qualitatively, along with their important features, as stream water is sensitive to climatic events and land-use practices. In this paper, a model was developed for predicting one of the significant WQPs, Fecal Coliform (FC), from precipitation, temperature, urban land use factor (ULUF), agricultural land use factor (ALUF), and forest land use factor (FLUF) using the Random Forest (RF) algorithm. The RF model, an ensemble learning algorithm, can even extract advanced feature importance characteristics from the given model inputs for different combinations. The model's outcomes showed a good correlation between FC and the climate events and land use factors (R² = 0.94); precipitation and temperature were the primary influencing factors for FC.
Keywords: water quality, land use factors, random forest, fecal coliform
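The two ingredients of a random forest, bootstrap aggregation and random feature subsets, can be sketched with depth-1 regression trees; a usage-count feature importance then falls out for free. This is a toy illustration on synthetic data, not the watershed dataset or a production RF:

```python
import random

def best_stump(X, y, feat_idxs):
    """Best (feature, threshold) split minimising squared error."""
    best = None
    for f in feat_idxs:
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i, row in enumerate(X) if row[f] <= t]
            right = [y[i] for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((v - ml) ** 2 for v in left)
                   + sum((v - mr) ** 2 for v in right))
            if best is None or sse < best[0]:
                best = (sse, f, t, ml, mr)
    return best

def forest_fit(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    trees, importance = [], [0] * d
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample
        feats = rng.sample(range(d), max(1, d - 1))  # random feature subset
        stump = best_stump([X[i] for i in idx], [y[i] for i in idx], feats)
        if stump:
            _, f, t, ml, mr = stump
            trees.append((f, t, ml, mr))
            importance[f] += 1                       # crude usage-count importance
    return trees, importance

def forest_predict(trees, row):
    preds = [(ml if row[f] <= t else mr) for f, t, ml, mr in trees]
    return sum(preds) / len(preds)

# Toy data: target driven mainly by feature 0 (think "precipitation").
X = [[i, i % 3, (i * 7) % 5] for i in range(20)]
y = [3.0 * row[0] + 1.0 for row in X]
trees, imp = forest_fit(X, y)
pred = forest_predict(trees, [10, 1, 0])
```

Because the target depends mostly on feature 0, the importance tally concentrates on it, mirroring how the paper identifies precipitation and temperature as the dominant predictors of FC.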
Procedia PDF Downloads 201
5600 Training of Future Computer Science Teachers Based on Machine Learning Methods
Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova
Abstract:
The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. The research work was reviewed by students of the educational programs '6B01511-Computer Science', '7M01511-Computer Science', '7M01525-STEM Education', and '8D01511-Computer Science' of L.N. Gumilyov Eurasian National University. As a result, the advantages and disadvantages of the Haar Cascade (Haar Cascade OpenCV), HoG SVM (Histogram of Oriented Gradients, Support Vector Machine), and MMOD CNN Dlib (Max-Margin Object Detection, convolutional neural network) detectors used for face detection were determined. Dlib is a general-purpose cross-platform software library written in the C++ programming language; it includes the detectors used for face detection. The Haar Cascade OpenCV algorithm is efficient for fast face detection. The work considered here forms the basis for the development of machine learning methods by future computer science teachers.
Keywords: algorithm, artificial intelligence, education, machine learning
Procedia PDF Downloads 77
5599 From Scalpel to Leadership: The Landscape for Female Neurosurgeons in the UK
Authors: Anda-veronica Gherman, Dimitrios Varthalitis
Abstract:
Neurosurgery, like many surgical specialties, undoubtedly exhibits a significant gender gap, particularly in leadership positions. While increasing women's representation in neurosurgery is important, it is crucial to increase their presence in leadership positions. Across the globe and in Europe there is a concerning trend, with only 4% of all neurosurgical departments being chaired by women. This study aims to explore gender disparities in leadership in the United Kingdom, to identify possible contributing factors, and to discuss future strategies to bridge this gap. Methods: A literature review was conducted utilising PubMed as the main database, with search keywords including 'female neurosurgeon', 'women neurosurgeon', 'gender disparity', 'leadership' and 'UK'. Additionally, a manual search of all neurosurgical departments in the UK was performed to identify the current female department leads and training programme directors. Results: The literature search identified a paucity of literature specifically addressing leadership among female neurosurgeons within the UK, with very few published papers on this topic. Despite more than half of medical students in the UK being female, only a small proportion pursue a surgical career, with neurosurgery being one of the least represented specialties. Only 27% of trainee neurosurgeons are female, and the numbers are even lower at consultant level, where women represent just 8%. Findings from published studies indicated that only 6.6% of leadership positions in neurosurgery in the UK are occupied by women. Furthermore, our manual search across UK neurosurgical departments revealed that around 5% of department lead positions are currently held by women. While this figure is slightly higher than the European average of 4%, it remains lower than the figure of 10% seen in other North-West European countries.
The situation is slightly more positive for training programme directors, 15% of whom are female. Discussion: The findings of this study highlight a significant gender disparity in leadership positions within neurosurgery in the UK. This may have important implications, perpetuating the lack of diversity in the decision-making process, limiting the career advancement opportunities of women, and depriving the neurosurgical field of the voices, opinions and talents of women. With women representing half of the population, there is an undeniable need for more female leaders at the policy-making level. Many barriers can contribute to these numbers, including bias, stereotypes, lack of mentorship and work-life balance. Solutions to overcome these barriers include training programs addressing bias and impostor syndrome, leadership workshops tailored to women's needs, better workplace policies, an increase in formal mentorship, and increasing the visibility of women in neurosurgery leadership positions through media, speaking opportunities, conferences, awards, etc. Lastly, more research efforts should focus on the leadership and mentorship of women in neurosurgery, with an increased number of published papers discussing these issues.
Keywords: female neurosurgeons, female leadership, female mentorship, gender disparities
Procedia PDF Downloads 36
5598 Emotion Detection in Twitter Messages Using Combination of Long Short-Term Memory and Convolutional Deep Neural Networks
Authors: Bahareh Golchin, Nooshin Riahi
Abstract:
One of the most significant issues to have attracted much attention in recent years is recognizing sentiments and emotions in social media texts. The analysis of sentiments and emotions is intended to recognize conceptual information such as the opinions, feelings, attitudes and emotions of people towards products, services, organizations, people, topics, events and features in written text. This indicates the size of the problem space. In the real world, businesses and organizations are always looking for tools to gather the ideas, emotions, and opinions of people about products, services, or events related to them. This article uses the Twitter social network, one of the most popular social networks with about 420 million active users, to extract data. Using this social network, users can share their information and opinions about personal issues, policies, products, events, etc. Its data availability makes it suitable for the classification of emotional states. In this study, supervised learning and deep neural network algorithms are used to classify the emotional states of Twitter users. The use of deep learning methods to increase the learning capacity of the model is an advantage due to the large amount of available data. Tweets collected on various topics are classified into four classes using a combination of two bidirectional Long Short-Term Memory networks and a convolutional network. The results obtained from this study, with an average accuracy of 93%, show good results from the proposed framework and improved accuracy compared to previous work.
Keywords: emotion classification, sentiment analysis, social networks, deep neural networks
Procedia PDF Downloads 142
5597 Sinusoidal Roughness Elements in a Square Cavity
Authors: Muhammad Yousaf, Shoaib Usman
Abstract:
Numerical studies were conducted using the Lattice Boltzmann Method (LBM) to study natural convection in a square cavity in the presence of roughness. An algorithm based on the single-relaxation-time Bhatnagar-Gross-Krook (BGK) model of the Lattice Boltzmann Method (LBM) was developed. Roughness was introduced on both the hot and cold walls in the form of sinusoidal roughness elements. The study was conducted for a Newtonian fluid of Prandtl number (Pr) 1.0. The range of Rayleigh number (Ra) explored was from 10³ to 10⁶, in the laminar region. The thermal and hydrodynamic behavior of the fluid was analyzed using a differentially heated square cavity with roughness elements present on both the hot and cold walls. Neumann boundary conditions were imposed on the horizontal walls, with the vertical walls isothermal. The roughness elements were held at the same boundary condition as the corresponding walls. The computational algorithm was validated against previous benchmark studies performed with different numerical methods, and good agreement was found. Results indicate that the maximum reduction in the average heat transfer was 16.66 percent, at a Ra number of 10⁵.
Keywords: Lattice Boltzmann method, natural convection, Nusselt number, Rayleigh number, roughness
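The collide-and-stream structure of a single-relaxation-time (BGK) LBM solver can be sketched in one dimension; the paper's solver is 2D with thermal coupling, so the D1Q3 lattice, toy parameters, and periodic boundaries below are simplifying assumptions made purely to show the BGK relaxation step f ← f − (f − f_eq)/τ:

```python
# Minimal BGK lattice Boltzmann sketch on a 1D D1Q3 lattice.
# Toy configuration, not the authors' 2D natural-convection solver.

W = [2 / 3, 1 / 6, 1 / 6]   # lattice weights for velocities c = 0, +1, -1
C = [0, 1, -1]

def equilibrium(rho, u):
    """First-order equilibrium distributions for advection-diffusion."""
    return [w * rho * (1 + 3 * c * u) for w, c in zip(W, C)]

def step(f, tau, u=0.0):
    n = len(f)
    # Collision: relax each node toward local equilibrium with time tau.
    post = []
    for node in f:
        rho = sum(node)
        feq = equilibrium(rho, u)
        post.append([fi - (fi - fe) / tau for fi, fe in zip(node, feq)])
    # Streaming: shift populations along their velocities (periodic walls).
    f_new = [[0.0] * 3 for _ in range(n)]
    for x in range(n):
        f_new[x][0] = post[x][0]
        f_new[(x + 1) % n][1] = post[x][1]
        f_new[(x - 1) % n][2] = post[x][2]
    return f_new

# Initialise a density bump and let it diffuse.
n = 16
f = [equilibrium(1.0 + (1.0 if x == n // 2 else 0.0), 0.0) for x in range(n)]
for _ in range(50):
    f = step(f, tau=1.0)
rho = [sum(node) for node in f]
```

Two properties worth checking in any LBM code appear here: collision and streaming both conserve total density, and the initial bump spreads out over time.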
Procedia PDF Downloads 530
5596 Predicting Football Player Performance: Integrating Data Visualization and Machine Learning
Authors: Saahith M. S., Sivakami R.
Abstract:
In the realm of football analytics, particularly focusing on predicting football player performance, the ability to forecast player success accurately is of paramount importance for teams, managers, and fans. This study introduces an elaborate examination of predicting football player performance through the integration of data visualization methods and machine learning algorithms. The research entails the compilation of an extensive dataset comprising player attributes, conducting data preprocessing, feature selection, model selection, and model training to construct predictive models. The analysis within this study will involve delving into feature significance using methodologies like Select Best and Recursive Feature Elimination (RFE) to pinpoint pertinent attributes for predicting player performance. Various machine learning algorithms, including Random Forest, Decision Tree, Linear Regression, Support Vector Regression (SVR), and Artificial Neural Networks (ANN), will be explored to develop predictive models. The evaluation of each model's performance utilizing metrics such as Mean Squared Error (MSE) and R-squared will be executed to gauge their efficacy in predicting player performance. Furthermore, this investigation will encompass a top player analysis to recognize the top-performing players based on the anticipated overall performance scores. Nationality analysis will entail scrutinizing the player distribution based on nationality and investigating potential correlations between nationality and player performance. Positional analysis will concentrate on examining the player distribution across various positions and assessing the average performance of players in each position. Age analysis will evaluate the influence of age on player performance and identify any discernible trends or patterns associated with player age groups. 
The primary objective is to predict a football player's overall performance accurately based on their individual attributes, leveraging data-driven insights to enrich the comprehension of player success on the field. By amalgamating data visualization and machine learning methodologies, the aim is to furnish valuable tools for teams, managers, and fans to effectively analyze and forecast player performance. This research contributes to the progression of sports analytics by showcasing the potential of machine learning in predicting football player performance and offering actionable insights for diverse stakeholders in the football industry.
Keywords: football analytics, player performance prediction, data visualization, machine learning algorithms, random forest, decision tree, linear regression, support vector regression, artificial neural networks, model evaluation, top player analysis, nationality analysis, positional analysis
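The two evaluation metrics named above, MSE and R-squared, are simple to state explicitly. The sketch below scores a hypothetical predictor of overall rating against toy numbers, not the study's dataset:

```python
def mse(y_true, y_pred):
    """Mean Squared Error."""
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((a - b) ** 2 for a, b in zip(y_true, y_pred))
    ss_tot = sum((a - mean) ** 2 for a in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical overall ratings vs. a model's predictions for five players.
actual    = [82, 75, 90, 68, 77]
predicted = [80, 76, 88, 70, 78]
```

An R-squared near 1 means the model explains almost all the variance in overall rating, which is how the candidate models (Random Forest, SVR, ANN, etc.) would be compared.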
Procedia PDF Downloads 41
5595 Make Populism Great Again: Identity Crisis in Western World with a Narrative Analysis of Donald Trump's Presidential Campaign Announcement Speech
Authors: Soumi Banerjee
Abstract:
In this research paper we will go deep into understanding Benedict Anderson's definition of the nation as an imagined community, and we will analyze why and how national identities were created through long and complex processes, and how strong emotional bonds can exist between people within an imagined community, given the fact that these people have never known each other personally but will still feel some form of imagined unity. Such identity construction on the part of an individual or within societies is always in some sense in a state of flux, as imagined communities are ever changing, which provides the ontological foundation of this paper. This sort of identity crisis among individuals living in the Western world, who are in search of psychological comfort and security, illustrates a possible need for spatially dislocated, ontologically insecure and vulnerable individuals to have a secure identity. To create such an identity there has to be something to build upon, which could be achieved through what may be termed 'homesteading'. This could, in short, and in our interpretation of Kinnvall and Nesbitt's concept, be described as a search for security that involves a search for 'home', where home acts as a secure place that one can build an identity around. The second half of the paper will then look into how populism and identity have played an increasingly important role in political elections in the so-called Western democracies of the world, using the U.S. as an example. Notions of 'us and them', the people and the elites, will be examined and analyzed through a social constructivist theoretical lens. Here we will analyze how such narratives about identity and the nation state affect people, their personality development and their identity in different ways, by studying U.S. President Donald Trump's speeches and analyzing if and how he used different identity-creating narratives to gain political and popular support.
Narrative analysis was chosen as the method of this research paper in order to use narratives as a device for understanding how perceived notions of 'us and them' can initiate a huge identity crisis within a community or a nation-state. This is a relevant subject, as results and developments such as rising populist right-wing movements are being felt in a number of European states, the so-called Brexit vote in the U.K. and the election of Donald Trump as president being two prime examples. This paper will then attempt to argue that these mechanisms are strengthened and gain significance when people in an economically, socially or ontologically vulnerable position, imagined or otherwise, perceive themselves to be under pressure, and a sense of insecurity is rising. These insecurities and this sense of being under threat have been on the rise in many of the Western states that are otherwise usually perceived to be among the safest, most democratically stable and most prosperous states in the world, which makes it of interest to study what has changed, and helps provide some part of the explanation as to how creating a 'them' in the discourse of national identity can cause a massive security crisis.
Keywords: identity crisis, migration, ontological (in)security, nation-states
Procedia PDF Downloads 260
5594 Optimization of Traffic Agent Allocation for Minimizing Bus Rapid Transit Cost on Simplified Jakarta Network
Authors: Gloria Patricia Manurung
Abstract:
The Jakarta Bus Rapid Transit (BRT) system, established in 2009 to reduce private vehicle usage and ease the rush-hour gridlock throughout the Greater Jakarta area, has failed to achieve its purpose. With private vehicle ownership gradually increasing and road space reduced by the BRT lane construction, private vehicle users intuitively invade the exclusive BRT lane, creating local traffic along the BRT network. The cost of an invaded BRT lane becomes the same as that of the ordinary road network, making the BRT, which is supposed to be the city's main public transportation, unreliable. Efforts have been expended to guard critical lanes and prevent the invasion by allocating traffic agents at several intersections, leading to improved congestion levels along the lane. Given a set number of traffic agents, this study uses an analytical approach to find the best deployment strategy of traffic agents on a simplified Jakarta road network that minimizes the BRT link cost, which is expected to lead to an improvement in BRT time reliability. A user-equilibrium traffic assignment model is used to reproduce the origin-destination demand flow on the network, and the optimum solution can conventionally be obtained with a brute-force algorithm. This method's main constraint is that the traffic assignment simulation time escalates exponentially with the number of agents and the network size. Our proposed metaheuristic and heuristic algorithms exhibit only a linear increase in simulation time and result in a minimized BRT cost approaching that of the brute-force optimization. Further analysis of the overall network link cost should be performed to see the impact of traffic agent deployment on the network system.
Keywords: traffic assignment, user equilibrium, greedy algorithm, optimization
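A greedy heuristic of the kind keyworded above can be sketched as repeatedly placing the next agent at the intersection with the largest marginal reduction in BRT link cost. The diminishing-returns cost model below is a made-up stand-in for re-running the user-equilibrium assignment after each placement:

```python
def greedy_allocate(reduction, n_agents):
    """Place agents one at a time at the intersection whose (diminishing)
    marginal BRT-cost reduction is currently largest.

    `reduction[i]` is the first-agent cost reduction at intersection i; each
    additional agent at the same intersection is assumed to halve the
    marginal gain (an illustrative stand-in for re-running the
    user-equilibrium traffic assignment after every placement).
    """
    placed = [0] * len(reduction)
    total_saving = 0.0
    for _ in range(n_agents):
        gains = [reduction[i] / (2 ** placed[i]) for i in range(len(reduction))]
        best = max(range(len(gains)), key=gains.__getitem__)
        total_saving += gains[best]
        placed[best] += 1
    return placed, total_saving

# Hypothetical first-agent savings (minutes of BRT delay) per intersection.
savings = [40.0, 25.0, 10.0]
placed, saved = greedy_allocate(savings, n_agents=4)
```

The greedy pass costs one assignment evaluation per placement, which is the linear-in-agents behaviour contrasted with the exponential brute-force search.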
Procedia PDF Downloads 233
5593 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques
Authors: John Onyima, Ikechukwu Ezepue
Abstract:
Virtualization and cloud computing are among the fastest-growing computing innovations in recent times. Organisations all over the world are moving their computing services towards the cloud because it rapidly transforms the organisation's infrastructure, improves efficient resource utilization, and reduces cost. However, this technology brings new security threats and challenges regarding safety, reliability and data confidentiality. Evidently, no single security technique can guarantee security or protection against malicious attacks on a cloud computing network; hence an integrated model of an intrusion detection and prevention system has been proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using the Snort algorithm. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection
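The integration of the two techniques can be sketched as a pipeline that first matches known attack signatures and then flags statistical outliers. Both the signature list and the z-score detector below are illustrative stand-ins, not the Snort rule engine or the LDFGB internals:

```python
import statistics

SIGNATURES = ["' OR 1=1", "../etc/passwd", "<script>"]   # toy rule set

def signature_alert(payload):
    """Signature-based check: does the payload contain a known pattern?"""
    return any(sig in payload for sig in SIGNATURES)

def anomaly_alert(rate, history, threshold=3.0):
    """Anomaly-based check: is the request rate a statistical outlier
    relative to the host's recent history?  A simple z-score stands in
    for the LDFGB graph-based detector here."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    return stdev > 0 and abs(rate - mean) / stdev > threshold

def inspect(payload, rate, history):
    """Combined IDPS decision: signatures first, then anomaly scoring."""
    if signature_alert(payload):
        return "block: known signature"
    if anomaly_alert(rate, history):
        return "alert: anomalous traffic rate"
    return "allow"

# Recent per-minute request counts for one host.
history = [100, 104, 98, 102, 96, 100]
```

Running signatures first catches known attacks cheaply and deterministically; the anomaly stage then covers novel attacks that no signature describes, which is the complementarity the integrated model relies on.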
Procedia PDF Downloads 313
5592 An Improved Adaptive Dot-Shape Beamforming Algorithm Research on Frequency Diverse Array
Authors: Yanping Liao, Zenan Wu, Ruigang Zhao
Abstract:
Frequency diverse array (FDA) beamforming is a technology developed in recent years, and its antenna pattern has a unique angle-distance-dependent characteristic. However, the beam is always required to have strong concentration, high resolution and a low sidelobe level to form point-to-point interference at the intended location. In order to eliminate the angle-distance coupling of the traditional FDA and to make the beam energy more concentrated, this paper adopts a multi-carrier FDA structure based on a proposed power-exponential frequency offset to improve the array structure and frequency offset of the traditional FDA. The simulation results show that the beam pattern of the array can form a dot-shaped beam with more concentrated energy, and its resolution and sidelobe-level performance are improved. However, the signal covariance matrix in the traditional adaptive beamforming algorithm is estimated from finite-time snapshot data. When the number of snapshots is limited, the algorithm suffers from an underestimation problem, in which covariance-matrix estimation errors cause beam distortion, so that the output pattern cannot form a dot-shaped beam. It also suffers from main-lobe deviation and high sidelobe levels in the limited-snapshot case. Aiming at these problems, an adaptive beamforming technique based on exponential correction for the multi-carrier FDA is proposed to improve beamforming robustness. The steps are as follows: first, the beamforming of the multi-carrier FDA is formed under the linearly constrained minimum variance (LCMV) criterion. Then the eigenvalue decomposition of the covariance matrix is performed to obtain the diagonal matrix composed of the interference subspace, the noise subspace and the corresponding eigenvalues.
Finally, a correction index is introduced to exponentially correct the small eigenvalues of the noise subspace, mitigating the divergence of the small noise-subspace eigenvalues and improving the performance of beamforming. The theoretical analysis and simulation results show that the proposed algorithm can make the multi-carrier FDA form a dot-shaped beam with limited snapshots, reduce the sidelobe level, improve the robustness of beamforming, and achieve better performance.
Keywords: adaptive beamforming, correction index, limited snapshot, multi-carrier frequency diverse array, robust
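The final correction step can be sketched numerically on a sorted eigenvalue spectrum. The rule below, replacing each noise eigenvalue λ by the geometric blend λ^α · mean^(1−α), is our illustrative reading of the exponential correction index, not the paper's exact formula:

```python
def correct_noise_eigenvalues(eigvals, n_interference, alpha=0.5):
    """Exponentially correct the small (noise-subspace) eigenvalues.

    `eigvals` is sorted in descending order; the first `n_interference`
    values span the interference subspace and are left untouched.  Each
    noise eigenvalue lam is replaced by lam**alpha * mean**(1 - alpha),
    pulling the diverging small eigenvalues toward their mean -- an
    illustrative reading of the paper's correction index.
    """
    signal = eigvals[:n_interference]
    noise = eigvals[n_interference:]
    mean = sum(noise) / len(noise)
    corrected = [lam ** alpha * mean ** (1 - alpha) for lam in noise]
    return signal + corrected

# Hypothetical sample eigenvalues from a limited-snapshot covariance estimate.
eigvals = [50.0, 20.0, 1.0, 0.1, 0.01]
out = correct_noise_eigenvalues(eigvals, n_interference=2)
```

Compressing the spread of the noise eigenvalues (here from a 100:1 ratio to 10:1) is what stabilises the LCMV weight computation when snapshots are scarce.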
Procedia PDF Downloads 133
5591 Aerodynamic Design and Optimization of Vertical Take-Off and Landing Type Unmanned Aerial Vehicles
Authors: Enes Gunaltili, Burak Dam
Abstract:
Aviation history started with the Wright brothers' aircraft and has improved day by day. With the help of these advancements, large aircraft are being replaced by small, unmanned air vehicles, and in this study we design this type of air vehicle. First of all, aircraft today are mainly divided into two types: rotary-wing and fixed-wing. Fixed-wing aircraft are generally used for transport, cargo, military applications, etc. Rotary-wing aircraft are used in the same areas, but each type has advantages over the other. A rotary-wing aircraft can take off vertically from the ground and can operate in restricted areas. On the other hand, rotary-wing aircraft generally fly shorter ranges than fixed-wing aircraft. One kind of aircraft combines the specifications of these two types: the VTOL (vertical take-off and landing) aircraft. VTOLs are able to take off and land vertically and fly horizontally. VTOL aircraft generally achieve longer range than rotary-wing aircraft but shorter range than fixed-wing aircraft, giving a beneficial range between the two. There are many other advantages of VTOL aircraft over rotary- and fixed-wing aircraft. Because of these, VTOLs have come to be used mainly for military, cargo, search, rescue and mapping purposes. Within this framework, this study answers the question of how a VTOL can be designed as a small unmanned aircraft system for search-and-rescue applications, benefiting from the advantages of fixed-wing and rotary-wing aircraft while eliminating their disadvantages. To answer that question and design the VTOL aircraft, multidisciplinary design optimization (MDO), theoretical terminology, formulations, simulations, and modelling systems based on CFD (Computational Fluid Dynamics) are used together as the design methodology to determine the design parameters and steps.
In conclusion, based on tests and simulations following the design steps, suggestions on how the VTOL aircraft should be designed are listed, together with advantages, disadvantages, and observations on the design parameters; the VTOL is then designed and presented with its design parameters, advantages, and usage areas.
Keywords: airplane, rotary, fixed, VTOL, CFD
Procedia PDF Downloads 286