Search results for: large airplane
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6835

6565 Healthcare Data Mining Innovations

Authors: Eugenia Jilinguirian

Abstract:

In the healthcare industry, data mining is essential because it transforms the field by extracting useful information from large datasets. Data mining is the process of applying advanced analytical methods to large collections of patient records and medical histories in order to identify patterns, correlations, and trends. By carefully examining these data, healthcare professionals can improve diagnostic accuracy, uncover hidden linkages, and predict disease outcomes. Additionally, data mining supports personalized medicine by tailoring treatment to the unique attributes of each patient. This proactive strategy helps allocate resources more efficiently, enhances patient care, and streamlines operations. However, to apply data mining effectively and ensure the responsible use of private healthcare information, issues such as data privacy and security must be carefully considered. As technology evolves, data mining remains vital to the search for more effective, efficient, and individualized healthcare solutions.
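
As a loose illustration of the kind of pattern extraction described above (not the author's system), the sketch below trains a decision tree on a synthetic set of patient records; the features, outcome rule and thresholds are all hypothetical.

```python
# Minimal illustration: mining synthetic patient records for outcome patterns.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 1000
# Hypothetical features: age, systolic blood pressure, fasting glucose
X = np.column_stack([
    rng.integers(20, 90, n),
    rng.normal(120, 15, n),
    rng.normal(100, 25, n),
])
# Synthetic outcome: risk grows with age and glucose (illustration only)
y = ((0.03 * X[:, 0] + 0.02 * X[:, 2] + rng.normal(0, 1, n)) > 4.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
print(export_text(model, feature_names=["age", "bp", "glucose"]))
```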

Keywords: data mining, healthcare, big data, individualised healthcare, healthcare solutions, database

Procedia PDF Downloads 39
6564 A Comprehensive Methodology for Voice Segmentation of Large Sets of Speech Files Recorded in Naturalistic Environments

Authors: Ana Londral, Burcu Demiray, Marcus Cheetham

Abstract:

Speech recording is a methodology used in many different studies related to cognitive and behaviour research. Modern advances in digital equipment have brought the possibility of continuously recording hours of speech in naturalistic environments and building rich sets of sound files. Speech analysis can then extract from these files multiple features for different scopes of research in language and communication. However, tools for analysing a large set of sound files and automatically extracting relevant features from them are often inaccessible to researchers who are not familiar with programming languages. Manual analysis is a common alternative, with a high cost in time and efficiency. In the analysis of long sound files, the first step is voice segmentation, i.e. detecting and labelling the segments that contain speech. We present a comprehensive methodology aiming to support researchers in voice segmentation as the first step of data analysis on a large set of sound files. Praat, an open-source software package, is suggested as a tool to run a voice detection algorithm, label segments and files, and extract other quantitative features over a folder structure containing a large number of sound files. We present the validation of our methodology with a set of 5000 sound files that were collected in the daily life of a group of voluntary participants aged over 65. A smartphone was used to collect sound using the Electronically Activated Recorder (EAR): an app programmed to record 30-second sound samples randomly distributed throughout the day. Results demonstrated that automatic segmentation and labelling of files containing speech segments was 74% faster than a manual analysis performed by two independent coders. Furthermore, the methodology presented allows manual adjustment of voiced segments with visualisation of the sound signal and the automatic extraction of quantitative information on speech. In conclusion, we propose a comprehensive methodology for voice segmentation, to be used by researchers who have to work with large sets of sound files and are not familiar with programming tools.
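
For readers who want a programmatic starting point, the sketch below shows a minimal energy-based voice segmentation of a WAV file in Python; it is not the Praat pipeline described above, and the frame length, threshold and 16-bit mono format are assumptions.

```python
import wave
import numpy as np

def voice_segments(path, frame_ms=25, threshold_db=-35.0):
    """Return (start_s, end_s) pairs where frame energy exceeds the threshold.
    Assumes a 16-bit mono PCM WAV file."""
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        data = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    data = data.astype(np.float64) / 32768.0
    hop = int(rate * frame_ms / 1000)
    n_frames = len(data) // hop
    frames = data[: n_frames * hop].reshape(n_frames, hop)
    # Frame energy in dB relative to full scale
    energy_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    active = energy_db > threshold_db
    # Merge consecutive active frames into (start, end) segments in seconds
    segments, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i
        elif not is_active and start is not None:
            segments.append((start * hop / rate, i * hop / rate))
            start = None
    if start is not None:
        segments.append((start * hop / rate, n_frames * hop / rate))
    return segments

# Example (hypothetical file name): list speech segments of one 30-second sample
# for start, end in voice_segments("sample_0001.wav"):
#     print("speech %.2f-%.2f s" % (start, end))
```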

Keywords: automatic speech analysis, behavior analysis, naturalistic environments, voice segmentation

Procedia PDF Downloads 258
6563 Development of Forging Technology of Cam Ring Gear for Truck Using Small Bar

Authors: D. H. Park, Y. H. Tak, H. H. Kwon, G. J. Kwon, H. G. Kim

Abstract:

This study focused on developing forging technology for a large-diameter cam ring gear made from a small bar. Analyses of the temperature variation and deformation behaviour of the material are important to obtain optimal forging products. A hot compression test was carried out to evaluate formability at high temperature. In order to define the optimum forging conditions, including material temperature, strain and forging load, the finite element method was used to simulate the forging process of the cam ring gear parts. Test results were in good agreement with the simulations. The existing cam ring gear, machined from rod material, generates chips during cutting and has durability issues; the forged large-diameter cam ring gear parts for trucks were developed to solve the durability problem and reduce material waste.

Keywords: forging technology, cam ring, gear, truck, small bar

Procedia PDF Downloads 263
6562 Hybrid Intelligent Optimization Methods for Optimal Design of Horizontal-Axis Wind Turbine Blades

Authors: E. Tandis, E. Assareh

Abstract:

The optimal shape of MW wind turbine blades has in a number of cases been designed through evolutionary algorithms combined with mathematical modelling (Blade Element Momentum theory). Among optimization methods, evolutionary algorithms enjoy many advantages, particularly in stability. However, they usually need a large number of function evaluations. Since there are a large number of local extremes, the optimization method has to find the global extreme accurately. The present paper introduces a new population-based hybrid algorithm called the Genetic-Based Bees Algorithm (GBBA). This algorithm is meant to design the optimal shape of MW wind turbine blades. The current method employs crossover and neighbourhood searching operators, taken from the Genetic Algorithm (GA) and the Bees Algorithm (BA) respectively, to provide a method with good accuracy and convergence speed. Different blade designs, twenty-one to be exact, were considered based on the chord length, twist angle and tip speed ratio using GA results. They were compared with BA and GBBA optimum design results targeting the power coefficient and solidity. The results suggest that the final shape obtained by the proposed hybrid algorithm performs better than either BA or GA. Furthermore, the accuracy and convergence speed increase when the GBBA is employed.
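
A schematic reading of the GBBA idea is sketched below: a population loop that mixes Bees-Algorithm neighbourhood search around elite sites with GA-style crossover. It is not the authors' implementation, and a stand-in multimodal function replaces the BEM blade model.

```python
import numpy as np

rng = np.random.default_rng(1)

def objective(x):
    # Stand-in multimodal function (would be replaced by the BEM blade model)
    return np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)) + 10 * len(x)

def gbba(dim=3, pop=30, elite=5, recruits=10, patch=0.5, iters=200):
    pop_x = rng.uniform(-5, 5, (pop, dim))
    for _ in range(iters):
        fitness = np.array([objective(x) for x in pop_x])
        pop_x = pop_x[np.argsort(fitness)]      # best first (minimisation)
        new_pop = [pop_x[0].copy()]             # elitism: keep the best design
        # Bees step: neighbourhood search around the elite sites
        for site in pop_x[:elite]:
            bees = site + rng.uniform(-patch, patch, (recruits, dim))
            new_pop.append(min(bees, key=objective))
        # GA step: blend crossover between randomly chosen elite parents
        while len(new_pop) < pop:
            p1, p2 = pop_x[rng.integers(0, elite, 2)]
            alpha = rng.random(dim)
            new_pop.append(alpha * p1 + (1 - alpha) * p2)
        pop_x = np.array(new_pop)
    fitness = np.array([objective(x) for x in pop_x])
    best = pop_x[np.argmin(fitness)]
    return best, objective(best)

best_x, best_f = gbba()
print("best design variables:", best_x, "objective value:", best_f)
```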

Keywords: blade design, optimization, genetic algorithm, bees algorithm, genetic-based bees algorithm, large wind turbine

Procedia PDF Downloads 282
6561 Vibration Imaging Method for Vibrating Objects with Translation

Authors: Kohei Shimasaki, Tomoaki Okamura, Idaku Ishii

Abstract:

We propose a vibration imaging method for high frame rate (HFR)-video-based localization of vibrating objects with large translations. When the ratio of the translation speed of a target to its vibration frequency is large, obtaining its frequency response in image intensities becomes difficult because one or no waves are observable at the same pixel. Our method can precisely localize moving objects with vibration by virtually translating multiple image sequences for pixel-level short-time Fourier transform to observe multiple waves at the same pixel. The effectiveness of the proposed method is demonstrated by analyzing several HFR videos of flying insects in real scenarios.
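
The sketch below illustrates the core idea on synthetic data: the known drift of a target is compensated by a virtual translation before a pixel-level short-time Fourier transform is taken. The frame rate, drift speed and vibration frequency are made-up values, not the authors' HFR recordings.

```python
import numpy as np
from scipy.signal import stft

fps, n_frames, size = 1000, 1000, 64   # assumed frame rate, clip length, image size
vib_hz, speed_px = 120.0, 0.05         # assumed vibration frequency and drift per frame
t = np.arange(n_frames) / fps

# Synthetic video: a spot whose brightness vibrates while it drifts in x
video = np.zeros((n_frames, size, size))
x0, y0 = 10.0, 32
for k in range(n_frames):
    x = int(round(x0 + speed_px * k))
    video[k, y0, x] = 1.0 + 0.5 * np.sin(2 * np.pi * vib_hz * t[k])

# A fixed pixel sees the target only briefly; virtually translating the image
# sequence along the drift lets the same compensated pixel observe many periods.
signal = np.array([video[k, y0, int(round(x0 + speed_px * k))] for k in range(n_frames)])
signal -= signal.mean()                # remove the DC component

f, seg_t, Z = stft(signal, fs=fps, nperseg=256)
dominant = f[np.argmax(np.abs(Z).mean(axis=1))]
print("dominant vibration frequency ≈ %.1f Hz" % dominant)   # close to 120 Hz
```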

Keywords: HFR video analysis, pixel-level vibration source localization, short-time Fourier transform, virtual translation

Procedia PDF Downloads 82
6560 Novel Coprocessor for DNA Sequence Alignment in Resequencing Applications

Authors: Atef Ibrahim, Hamed Elsimary, Abdullah Aljumah, Fayez Gebali

Abstract:

This paper presents a novel semi-systolic array architecture for an optimized parallel sequence alignment algorithm. This architecture has the advantage that it can be modified to be reused for multiple-pass processing in order to increase the number of processing elements that can be packed into a single FPGA and to increase the number of sequences that can be aligned in parallel in a single FPGA. This resolves the potential problem of many FPGA resources being left unused in designs that have large values of short-read length when using the previously published conventional hardware design. FPGA implementation results show that, for large values of short-read length (M > 128), the proposed design has a slightly higher speedup and FPGA utilization than the conventional one.

Keywords: bioinformatics, genome sequence alignment, re-sequencing applications, systolic array

Procedia PDF Downloads 495
6559 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and the chemical and physical phenomena of mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely the regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, numerical average molecular weight and gravimetrical average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values show a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
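
A schematic version of such a comparison is sketched below: a naive adaptive-sampling step that adds points where the output varies most, followed by SVR, k-nearest neighbour and random forest fits from scikit-learn. The sigmoid stand-in function and all hyperparameters are assumptions, not the authors' polymerization model.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

def process(x):
    # Surrogate response with a steep region, mimicking the gel effect
    return 1.0 / (1.0 + np.exp(-20 * (x - 0.6)))

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 40))
y = process(x)
# Adaptive sampling: add extra samples where consecutive outputs vary most
steep = np.argsort(np.abs(np.diff(y)))[-10:]
x_extra = (x[steep] + x[steep + 1]) / 2
x_train = np.sort(np.concatenate([x, x_extra]))
y_train = process(x_train)

x_test = np.linspace(0, 1, 200)
y_test = process(x_test)
models = {
    "SVR": SVR(C=10.0, gamma=10.0),
    "k-NN": KNeighborsRegressor(n_neighbors=3),
    "Random forest": RandomForestRegressor(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(x_train.reshape(-1, 1), y_train)
    pred = model.predict(x_test.reshape(-1, 1))
    print(f"{name}: MAE = {mean_absolute_error(y_test, pred):.4f}")
```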

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 270
6558 Energy-Level Structure of a Confined Electron-Positron Pair in Nanostructure

Authors: Tokuei Sako, Paul-Antoine Hervieux

Abstract:

The energy-level structure of an electron-positron pair confined in a quasi-one-dimensional nanoscale potential well has been investigated, focusing on its trend in the limit of small confinement strength ω, namely, the Wigner-molecular regime. Anisotropic Gaussian-type basis functions, supplemented by high angular momentum functions as large as l = 19, have been used to obtain reliable full configuration interaction (FCI) wave functions. The resultant energy spectrum shows a band structure characterized by ω in the large-ω regime, whereas in the small-ω regime it shows an energy-level pattern dominated by excitation into the in-phase motion of the two particles. The observed trend has been rationalized on the basis of the nodal patterns of the FCI wave functions.

Keywords: confined systems, positron, wave function, Wigner molecule, quantum dots

Procedia PDF Downloads 358
6557 An Efficient Book Keeping Strategy for the Formation of the Design Matrix in Geodetic Network Adjustment

Authors: O. G. Omogunloye, J. B. Olaleye, O. E. Abiodun, J. O. Odumosu, O. G. Ajayi

Abstract:

The focus of the study is to proffer easy formulation and computation of the design matrix of the least-squares observation equations by using an efficient bookkeeping strategy. Usually, for a large network of many triangles and stations, a rigorous task is involved in computing and placing the values of the differentials of each observation with respect to its station coordinates (latitude and longitude) in their respective rows and columns. The efficient bookkeeping strategy seeks to eliminate or reduce this rigorous task, especially in large networks: with a simple, skillful arrangement and a short program written in the Matlab environment, the formulation and computation of the design matrix of the least-squares observation equations can be easily achieved.
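
The bookkeeping idea can be illustrated with a toy 2-D distance network (not the authors' Matlab code): each observation's partial derivatives are dropped into the design-matrix columns indexed by the stations it connects.

```python
import numpy as np

# Approximate 2-D coordinates of 4 stations (hypothetical values)
coords = np.array([[0.0, 0.0], [100.0, 10.0], [90.0, 120.0], [-5.0, 95.0]])
# Observed distances as (from_station, to_station) index pairs
obs = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

A = np.zeros((len(obs), 2 * len(coords)))   # 2 unknowns (x, y) per station
for row, (i, j) in enumerate(obs):
    dx, dy = coords[j] - coords[i]
    dist = np.hypot(dx, dy)
    # Bookkeeping: column 2*i holds d(dist)/dx_i, column 2*i+1 holds d(dist)/dy_i
    A[row, 2 * i: 2 * i + 2] = [-dx / dist, -dy / dist]
    A[row, 2 * j: 2 * j + 2] = [dx / dist, dy / dist]

print(np.round(A, 3))
```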

Keywords: design, differential, geodetic, matrix, network, station

Procedia PDF Downloads 315
6556 Exploring Spin Reorientation Transition and Berry Curvature Driven Anomalous Hall Effect in Quasi-2D vdW Ferromagnet Fe4GeTe2

Authors: Satyabrata Bera, Mintu Mondal

Abstract:

Two-dimensional (2D) ferromagnetic materials have garnered significant attention due to their potential to host intriguing scientific phenomena such as the anomalous Hall effect, the anomalous Nernst effect, and high transport spin polarization. This study focuses on the investigation of the air-stable van der Waals (vdW) ferromagnets FeₙGeTe₂ (FₙGT with n = 3, 4, and 5). Particular emphasis is placed on the Fe4GeTe2 (F4GT) compound, which exhibits a complex and fascinating magnetic behavior characterized by two distinct transitions: (i) paramagnetic (PM) to ferromagnetic (FM) around T_C ∼ 270 K, and (ii) a spin reorientation transition (SRT) at T_SRT ∼ 100 K. Scaling analysis of the magnetocaloric effect confirms the second-order character of the ferromagnetic transition, while the same analysis at T_SRT suggests that the SRT is a first-order phase transition. Moreover, F4GT exhibits a large anomalous Hall conductivity (AHC) of ∼ 490 S/cm at 2 K. The near-quadratic behavior of the anomalous Hall resistivity with the longitudinal resistivity suggests that the dominant AHC contribution arises from an intrinsic Berry curvature (BC) mechanism. Electronic structure calculations reveal a significant BC resulting from SOC-induced gapped nodal lines around the Fermi level, thereby giving rise to the large AHC. Additionally, we report exceptionally large anomalous Hall angle (≃ 10.6%) and Hall factor (≃ 0.22 V⁻¹) values, the largest observed within this vdW family. The findings presented here provide valuable insights into the fascinating magnetic and transport properties of 2D ferromagnetic materials, in particular the FₙGT family.

Keywords: 2D vdW ferromagnet, spin reorientation transition, anomalous Hall effect, Berry curvature

Procedia PDF Downloads 45
6555 In vitro Clonal Multiplication and Acclimatization of Large Cardamom (Amomum subulatum Roxb.)

Authors: Krishna Poudel, Tahar Katuwal, Sujan Karki

Abstract:

A rapid propagation and acclimatization method for large cardamom was optimized in this study. Sprouted rhizome buds were collected, and the excised rhizome bud explants were cultured on semi-solid culture media. The explants were cultured on Murashige and Skoog's (MS) medium supplemented with different concentrations and combinations of BAP (6-benzyl-amino-purine) and IBA (indole-3-butyric acid) for shoot and root induction. Explants cultured on MS basal medium supplemented with 1.0 mg/l BAP + 0.5 gm/l IBA showed the highest rate of shoot multiplication. In vitro shoots were rooted onto half-strength MS basal media supplemented with 0.5 mg/l IBA. Rooted shoots were transplanted in the screen house for the hardening process. These hardened plants were subsequently shifted into the netted nursery for further multiplication.

Keywords: concentration, explants, hardening, rhizome

Procedia PDF Downloads 214
6554 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates into one chip, following Moore's law. Designers encounter numerous report files during design iterations using timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing and noise data. When we apply this data-mining tool in real applications, running speed is important. The software employs table look-up techniques in its programming to achieve reasonable running speed, based on performance testing results. We added several advanced features for the application in an industrial chip design.
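
A toy version of the report-mining step is sketched below: regular expressions pull the worst slack per path group out of hypothetical report lines into a look-up table, which is then emitted as an HTML table. The report format is invented for illustration and is not the authors' tool.

```python
import re
from collections import defaultdict

report_lines = [                      # hypothetical timing-report excerpt
    "Path group: clk_core  Endpoint: u_alu/reg_12/D  Slack: -0.12",
    "Path group: clk_core  Endpoint: u_alu/reg_07/D  Slack: 0.03",
    "Path group: clk_io    Endpoint: u_pad/reg_02/D  Slack: 0.45",
]

pattern = re.compile(r"Path group:\s*(\S+)\s+Endpoint:\s*(\S+)\s+Slack:\s*(-?\d+\.\d+)")
worst = defaultdict(lambda: (float("inf"), ""))   # look-up table: group -> (worst slack, endpoint)
for line in report_lines:
    m = pattern.search(line)
    if not m:
        continue
    group, endpoint, slack = m.group(1), m.group(2), float(m.group(3))
    if slack < worst[group][0]:
        worst[group] = (slack, endpoint)

rows = "".join(
    f"<tr><td>{g}</td><td>{ep}</td><td>{s:+.2f}</td></tr>"
    for g, (s, ep) in sorted(worst.items())
)
print("<table><tr><th>Group</th><th>Worst endpoint</th><th>Slack (ns)</th></tr>"
      + rows + "</table>")
```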

Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise

Procedia PDF Downloads 226
6553 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider

Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón

Abstract:

The selection of diffractive events in the ALICE experiment during the first data-taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. It would be possible to achieve better measurements by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate this detector safely and correctly. Furthermore, the DCS must also provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating-conditions data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for taking data under the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a certain number of parameters, such as the nominal voltages for each photomultiplier tube (PMT), their threshold levels to accept or reject the incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0, and they have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks. They are realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system which is integrated into the global control system of the ALICE experiment.

Keywords: AD0, ALICE, DCS, LHC

Procedia PDF Downloads 275
6552 Dynamic Analysis of Double Deck Tunnel

Authors: C. W. Kwak, I. J. Park, D. I. Jang

Abstract:

The importance of cost-effective application and construction is increasing due to the surge of traffic volume in metropolitan cities. Accordingly, the necessity of tunnels with a large section becomes more critical. A double deck tunnel can be one of the most appropriate solutions to this necessity. The dynamic stability of a double deck tunnel against seismic load is essential, since it has a large section and a connection between the perimeter lining and the interim slab. In this study, a three-dimensional dynamic numerical analysis was conducted based on the finite difference method to investigate the seismic behavior of a double deck tunnel. A seismic joint for dynamic stability and the mitigation of the seismic impact on the lining was considered in the modeling and analysis. Consequently, the mitigation of acceleration, lining displacement and stress was verified successfully.

Keywords: double deck tunnel, interim slab, 3-dimensional dynamic numerical analysis, seismic joint

Procedia PDF Downloads 358
6551 Large Eddy Simulations for Flow Blurring Twin-Fluid Atomization Concept Using Volume of Fluid Method

Authors: Raju Murugan, Pankaj S. Kolhe

Abstract:

The present study mainly focuses on the numerical simulation of the Flow Blurring (FB) twin-fluid injection concept proposed by Ganan-Calvo, which involves back-flow atomization based on global bifurcation of the liquid and gas streams, thus creating two-phase flow near the injector exit. An interesting feature of the FB injector spray is the insignificant effect of variation in the atomizing air-to-liquid ratio (ALR) on the spray cone angle. Besides, FB injectors produce a nearly uniform spatial distribution of mean droplet diameter and are least susceptible to variation in the thermo-physical properties of fuels, making them a perfect candidate for fuel-flexible combustor development. The FB injector working principle has so far been demonstrated through experimental flow visualization techniques only. The present study explores the potential of ANSYS Fluent-based Large Eddy Simulation (LES) with the volume of fluid (VOF) method to investigate the two-phase flow just upstream of the injector dump plane and the spray quality immediately downstream of it. Note that water and air represent the liquid and gas phases in all simulations, and the ALR is varied by changing the air mass flow rate alone. Preliminary results capture the two-phase flow just upstream of the injector dump plane, and qualitative agreement is observed with the available experimental literature.

Keywords: flow blurring twin fluid atomization, large eddy simulation, volume of fluid, air to liquid ratio

Procedia PDF Downloads 187
6550 Semiconductor Device of Tapered Waveguide for Broadband Optical Communications

Authors: Keita Iwai, Isao Tomita

Abstract:

To expand the optical spectrum for use in broadband optical communications, we study the properties of a semiconductor waveguide device with a tapered structure, including its third-order optical nonlinearity. The spectrally broadened output produced by the tapered structure has the potential to create a compact, built-in device for optical communications. Here we deal with a compound semiconductor waveguide whose material is the same as that of the laser diodes used in communication systems, i.e., InₓGa₁₋ₓAsᵧP₁₋ᵧ, which has large optical nonlinearity. We confirm that our structure widens the output spectrum sufficiently by controlling its taper form factor while utilizing the large nonlinear refraction of InₓGa₁₋ₓAsᵧP₁₋ᵧ. We also examine the taper effect on nonlinear optical loss.

Keywords: InₓGa₁₋ₓAsᵧP₁₋ᵧ, waveguide, nonlinear refraction, spectral spreading, taper device

Procedia PDF Downloads 123
6549 Design, Construction and Validation of a Simple, Low-Cost Phi Meter

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

The use of a phi meter allows the equivalence ratio to be determined during a fire test. Previous phi meter designs used expensive catalysts and had restricted portability due to the large furnace and the requirement for pure oxygen. The new design of the phi meter does not require the use of a catalyst. The furnace design was based on the existing micro-scale combustion calorimetry (MCC) furnace, with operating conditions based on the secondary oxidizer furnace used in the steady state tube furnace (SSTF). Preliminary tests were conducted to study the effects of varying furnace temperatures on combustion efficiency. The SSTF was chosen to validate the phi meter measurements as it can both pre-set and independently quantify the equivalence ratio during a test. The data were in agreement with those obtained on the SSTF. The design was also validated by comparing the CO2 yields obtained from the SSTF oxidizer with those obtained by the phi meter. The phi meter designed and constructed in this work was proven to work effectively at bench scale. The phi meter was then used to measure the equivalence ratio in a series of large-scale ISO 9705 tests for numerous fire conditions. The materials used were a range of non-homogeneous materials such as polyurethane. The measurements corresponded accurately to the data collected, showing that the novel design can be used from bench-scale to large-scale tests to measure equivalence ratio. This cheaper, more portable, safer and easier-to-use phi meter design will enable more widespread use and the ability to quantify the fire conditions of tests, allowing for a better understanding of flammability and smoke toxicity.
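
For reference, the quantity the phi meter measures can be written as phi = (fuel/air)_actual / (fuel/air)_stoichiometric; the small example below uses illustrative flow values, not data from the tests described above.

```python
def equivalence_ratio(fuel_flow, air_flow, stoich_air_fuel_ratio):
    """phi > 1: fuel-rich (under-ventilated); phi < 1: fuel-lean."""
    actual_air_fuel = air_flow / fuel_flow
    return stoich_air_fuel_ratio / actual_air_fuel

# Example (illustrative numbers): a fuel needing ~14.7 kg of air per kg of fuel,
# burning with 10 kg/s of air supplied per 1 kg/s of fuel -> fuel-rich conditions.
print(equivalence_ratio(fuel_flow=1.0, air_flow=10.0, stoich_air_fuel_ratio=14.7))
# 1.47 (phi > 1, under-ventilated fire)
```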

Keywords: phi meter, smoke toxicity, fire condition, ISO9705, novel equipment

Procedia PDF Downloads 77
6548 Stochastic Control of Decentralized Singularly Perturbed Systems

Authors: Walid S. Alfuhaid, Saud A. Alghamdi, John M. Watkins, M. Edwin Sawan

Abstract:

Designing a controller for stochastic decentralized interconnected large-scale systems usually involves a high degree of complexity and computational effort. Noise, observability and controllability of all system states, connectivity, and channel bandwidth are other constraints on design procedures for distributed large-scale systems. The quasi-steady-state model investigated in this paper is a reduced-order model of the original system obtained using singular perturbation techniques. This paper presents an optimal control synthesis to design an observer-based feedback controller by standard stochastic control theory techniques, using the Linear Quadratic Gaussian (LQG) approach and Kalman filter design, with less complexity and lower computation requirements. A numerical example is given at the end to demonstrate the efficiency of the proposed method.
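
A compact sketch of the LQG machinery on a toy reduced-order model is given below: an LQR gain and a Kalman filter gain obtained from the two continuous-time Riccati equations and combined by the separation principle. The matrices are illustrative, not the paper's system.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # toy slow (reduced-order) dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])                 # only the first state is measured
Q = np.diag([10.0, 1.0])                   # state cost
R = np.array([[1.0]])                      # control cost
W = np.diag([0.1, 0.1])                    # process noise covariance
V = np.array([[0.01]])                     # measurement noise covariance

# LQR regulator gain: u = -K x_hat
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# Kalman filter gain (dual Riccati): x_hat_dot = A x_hat + B u + L (y - C x_hat)
S = solve_continuous_are(A.T, C.T, W, V)
L = S @ C.T @ np.linalg.inv(V)

print("LQR gain K =", K)
print("Kalman gain L =", L.ravel())
```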

Keywords: decentralized, optimal control, output, singular perturbation

Procedia PDF Downloads 337
6547 Multitemporal Satellite Images for Agriculture Change Detection in Al Jouf Region, Saudi Arabia

Authors: Ali A. Aldosari

Abstract:

Change detection of Earth surface features is extremely important for a better understanding of our environment in order to promote better decision making. Al-Jawf is remarkable for its abundant agricultural water, and there is fertile agricultural land due largely to underground water. As a result, this region has large areas under cultivation of dates, olives and fruit trees, as well as other agricultural products such as alfalfa and wheat. However, this agricultural area declined due to the reduction of government support in the last decade. This reduction was not officially recorded or measured in this region at large scale or governorate level. Remote sensing data are primary sources extensively used for change detection in agricultural applications. This study applies GIS technology and uses the Normalized Difference Vegetation Index (NDVI) to measure and analyze the spatial and temporal changes in the agricultural areas of the Al-Jouf region.
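
A minimal NDVI change-detection sketch is given below with synthetic red/NIR reflectances standing in for the multitemporal imagery; the decline threshold of -0.2 is an assumed value.

```python
import numpy as np

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red); small epsilon avoids division by zero
    return (nir - red) / (nir + red + 1e-9)

rng = np.random.default_rng(0)
shape = (100, 100)
# Hypothetical red/NIR reflectances for two acquisition dates
red_t1, nir_t1 = rng.uniform(0.05, 0.2, shape), rng.uniform(0.3, 0.6, shape)
red_t2, nir_t2 = rng.uniform(0.05, 0.2, shape), rng.uniform(0.1, 0.5, shape)

delta = ndvi(nir_t2, red_t2) - ndvi(nir_t1, red_t1)
declined = delta < -0.2                     # threshold marking vegetation loss
print("pixels with declining vegetation: %.1f%%" % (100 * declined.mean()))
```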

Keywords: spatial analysis, geographical information system, change detection

Procedia PDF Downloads 374
6546 Using the Cluster Computing to Improve the Computational Speed of the Modular Exponentiation in RSA Cryptography System

Authors: Te-Jen Chang, Ping-Sheng Huang, Shan-Ten Cheng, Chih-Lin Lin, I-Hui Pan, Tsung-Hsien Lin

Abstract:

The RSA system is a great contribution to encryption and decryption. It is based on modular exponentiation; we describe this as “calculation with large numbers”. Operating on such large numbers is a very heavy burden for the CPU. To increase the computational speed, in addition to improving the algorithms themselves, such as the binary method, the sliding window method, the addition chain method, and so on, a cluster computer can be used. The cluster system is composed of laboratory computers on which MPICH2 is installed. The parallel procedures of the modular exponentiation can be processed by combining the sliding window method with the addition chain method. This significantly reduces the computational time of modular exponentiations whose operands are longer than 512 bits and even 1024 bits.
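
The sequential kernel of the windowed approach can be sketched as below: a fixed-window (table look-up) modular exponentiation checked against Python's built-in pow. The distribution of the work across MPICH2 cluster nodes, and the addition-chain refinement, are not shown.

```python
def window_pow(base, exp, mod, w=4):
    # Precompute base^0 .. base^(2^w - 1) mod mod (the look-up table)
    table = [1] * (1 << w)
    for i in range(1, 1 << w):
        table[i] = (table[i - 1] * base) % mod
    result = 1
    # Process the exponent w bits at a time, most significant window first
    n_windows = (exp.bit_length() + w - 1) // w
    for i in reversed(range(n_windows)):
        for _ in range(w):
            result = (result * result) % mod
        window = (exp >> (i * w)) & ((1 << w) - 1)
        result = (result * table[window]) % mod
    return result

# Check against the built-in on a >512-bit exponent (illustrative values)
p = 2**521 - 1                      # a Mersenne prime as a stand-in modulus
x, e = 0x1234567890ABCDEF, 2**512 + 3
assert window_pow(x, e, p) == pow(x, e, p)
print("windowed modular exponentiation verified")
```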

Keywords: cluster system, modular exponentiation, sliding window, addition chain

Procedia PDF Downloads 497
6545 Comparison of Fuel Cell Installation Methods at Large Commercial and Industrial Sites

Authors: Masood Sattari

Abstract:

Using fuel cell technology to generate electricity for large commercial and industrial sites is a growing segment of the fuel cell industry. The installation of these systems involves design, permitting, procurement of long-lead electrical equipment, and construction involving multiple utilities. The installation of each fuel cell system requires the same amount of coordination as the construction of a new structure requiring a foundation, gas, water, and electricity. Each of these components provides variables that can delay and possibly eliminate a new project. As the manufacturing process and efficiency of fuel cell systems improve, so must the installation methods, to prevent a bottleneck in the installation phase of the deployment. Installation methodologies vary among companies, and this paper will examine the methodologies, describe the benefits and drawbacks of each, and provide guidelines for the industry to improve overall installation efficiency.

Keywords: construction, installation, methodology, procurement

Procedia PDF Downloads 164
6544 Three Dimensional Large Eddy Simulation of Blood Flow and Deformation in an Elastic Constricted Artery

Authors: Xi Gu, Guan Heng Yeoh, Victoria Timchenko

Abstract:

In the current work, a three-dimensional geometry of a 75% stenosed blood vessel is analysed. Large eddy simulation (LES) with a dynamic subgrid-scale Smagorinsky model is applied to model the turbulent pulsatile flow. The geometry, the transmural pressure and the properties of the blood and the elastic boundary were based on clinical measurement data. For the flexible wall model, a thin solid region is constructed around the 75% stenosed blood vessel. The deformation of this solid region was modelled as a deforming boundary to reduce the computational cost of the solid model. Fluid-structure interaction is realised via a two-way coupling between the blood flow modelled by LES and the deforming vessel. The information on the flow pressure and the wall motion was exchanged continually during the cycle by an arbitrary Lagrangian-Eulerian method. The boundary condition of the current time step depended on the previous solutions. The fluctuation of the velocity in the post-stenotic region was analysed in the study. The axial velocity at normalised position Z = 0.5 shows a negative value near the vessel wall. The displacement of the elastic boundary was also examined in this study. In particular, the wall displacements at systole and diastole were compared. The negative displacement at the stenosis indicates a collapse at the maximum velocity and during the deceleration phase.

Keywords: large eddy simulation, fluid-structure interaction, constricted artery, computational fluid dynamics

Procedia PDF Downloads 266
6543 Load Balancing and Resource Utilization in Cloud Computing

Authors: Gagandeep Kaur

Abstract:

Cloud computing uses various computing resources, such as CPU, memory and processors, to deliver services over the network, and it is one of the emerging fields of large-scale distributed computing. In cloud computing, the execution of a large number of tasks with the available resources so as to achieve high performance, minimal total completion time, minimum response time and effective utilization of resources is a major research area. In this research, an algorithm is proposed to achieve high performance in load balancing and resource utilization. The proposed algorithm is used to reduce the makespan, increase resource utilization and improve the performance cost for independent tasks. Further, scheduling metrics based on the algorithm in cloud computing are proposed.
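
As a baseline illustration of the makespan/utilization trade-off (not the proposed algorithm), the sketch below assigns independent tasks to the currently least-loaded VM, longest task first; the task lengths and VM count are made up.

```python
import heapq

def schedule(task_lengths, n_vms):
    loads = [(0.0, vm) for vm in range(n_vms)]      # min-heap of (load, vm id)
    heapq.heapify(loads)
    assignment = {vm: [] for vm in range(n_vms)}
    for length in sorted(task_lengths, reverse=True):
        load, vm = heapq.heappop(loads)             # pick the least-loaded VM
        assignment[vm].append(length)
        heapq.heappush(loads, (load + length, vm))
    makespan = max(load for load, _ in loads)
    utilization = sum(task_lengths) / (n_vms * makespan)
    return assignment, makespan, utilization

tasks = [12, 7, 15, 3, 9, 5, 11, 2, 8, 6]           # hypothetical task lengths
assignment, makespan, util = schedule(tasks, n_vms=3)
print("makespan:", makespan, "average utilization: %.2f" % util)
```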

Keywords: resource utilization, response time, load balancing, performance cost

Procedia PDF Downloads 156
6542 Challenges and Opportunities: One Stop Processing for the Automation of Indonesian Large-Scale Topographic Base Map Using Airborne LiDAR Data

Authors: Elyta Widyaningrum

Abstract:

LiDAR data acquisition has been recognized as one of the fastest solutions for providing the base data for topographic base mapping in Indonesia. The challenge of accelerating the provision of large-scale topographic base maps as a basis for development planning gives the opportunity to implement an automated scheme in the map production process. One-stop processing will also contribute to accelerating map provision, especially in conforming with the Indonesian fundamental spatial data catalog derived from ISO 19110 and geospatial database integration. Thus, automated LiDAR classification, DTM generation and feature extraction will be conducted in one GIS-software environment to form all layers of the topographic base maps. The quality of the automated topographic base map will be assessed and analyzed based on its completeness, correctness, contiguity, consistency and possible customization.
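
One step of such an automated chain can be sketched as below: gridding a synthetic point cloud into a coarse DTM by keeping the lowest return per cell. Production pipelines use far more robust ground filtering; the terrain model and parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pts, cell, extent = 50_000, 5.0, 500.0        # points, cell size (m), tile size (m)
x = rng.uniform(0, extent, n_pts)
y = rng.uniform(0, extent, n_pts)
ground = 0.02 * x + 0.01 * y                    # gently sloping synthetic terrain
z = ground + rng.exponential(2.0, n_pts)        # returns from vegetation/buildings sit above ground

cols = int(extent / cell)
dtm = np.full((cols, cols), np.nan)
ix = np.minimum((x / cell).astype(int), cols - 1)
iy = np.minimum((y / cell).astype(int), cols - 1)
for i, j, elev in zip(iy, ix, z):
    if np.isnan(dtm[i, j]) or elev < dtm[i, j]:
        dtm[i, j] = elev                        # lowest return approximates bare earth

print("DTM grid:", dtm.shape, "mean elevation: %.2f m" % np.nanmean(dtm))
```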

Keywords: automation, GIS environment, LiDAR processing, map quality

Procedia PDF Downloads 339
6541 Issues and Challenges of Tribals in India: A Case of Andhra Pradesh

Authors: P. Lalitha

Abstract:

Economic and social empowerment and educational upliftment of socially disadvantaged groups and marginalized sections of society are necessary for achieving faster and more inclusive development. Programmes are being implemented through states, the government's apex corporations, and NGOs for the upliftment of disadvantaged and marginalized sections of society. As per the primary data collected, a majority of tribal land holdings (60%) are below 2 hectares and only 5% are above 10 hectares. However, the ownership of large holdings does not give a distinct advantage unless the land is of good quality. There are areas in which even large holdings beyond 5 hectares are not sufficient to meet the food needs of the tribal families all year round. Some initiatives, e.g. grain golas, jhum cultivation, the wadi project, Joint Forest Management (JFM), enhancing livelihood and health through traditional knowledge management, and associating Individual Rural Volunteers (IRVs) in the SHG Bank Linkage Programme, have been taken in various tribal areas of the country.

Keywords: tribals, unemployment, health, food

Procedia PDF Downloads 255
6540 Immunostimulant from Biodiversity to Enhance Shrimp Survival against Vibriosis

Authors: Frank Alexis, Jenny Antonia Rodriguez Leon, Cristobal Leonardo Dominguez Borbor, Mery Rosario Ramirez Munoz

Abstract:

The shrimp industry has grown in recent years to the point of becoming one of the most dynamic industries. However, the appearance of diseases that significantly affect shrimp production has been an obstacle for the industry. We hypothesized that natural fibers from biodiversity can stimulate the immune system to prevent shrimp diseases such as vibriosis. In this project, we extracted fibers from vegetal sources in Ecuador and characterized them using common techniques such as XRD and SEM, and then we tested the effect of the fibers as immunostimulants for shrimp in vitro and in vivo using small aquaria and large pools. Our results demonstrate that vegetal fibers can significantly increase the survival of shrimp. Moreover, the production of shrimp in a large pool was significantly increased. Lastly, in the colour and taste tests the treated shrimp successfully surpassed the control group not treated with the fiber food supplement.

Keywords: fibers, immunostimulant, shrimp, vibriosis

Procedia PDF Downloads 128
6539 Exploring Teachers’ Beliefs about Diagnostic Language Assessment Practices in a Large-Scale Assessment Program

Authors: Oluwaseun Ijiwade, Chris Davison, Kelvin Gregory

Abstract:

In Australia, like other parts of the world, the debate on how to enhance teachers' use of assessment data to inform the teaching and learning of English as an Additional Language (EAL, Australia) or English as a Foreign Language (EFL, United States) has occupied the centre of academic scholarship. Traditionally, this approach was conceptualised as ‘Formative Assessment’ and, in recent times, ‘Assessment for Learning (AfL)’. The central problem is that teacher-made tests are limited in providing data that can inform teaching and learning due to the variability of classroom assessments, which are hindered by teachers' characteristics and assessment literacy. To address this concern, scholars in language education and testing have proposed uniform large-scale computer-based assessment programs to meet the needs of teachers and promote AfL in language education. In Australia, for instance, the Victoria state government commissioned a large-scale project called 'Tools to Enhance Assessment Literacy (TEAL) for Teachers of English as an additional language'. As part of the TEAL project, a tool called ‘Reading and Vocabulary assessment for English as an Additional Language (RVEAL)’, as a diagnostic language assessment (DLA), was developed by language experts at the University of New South Wales for teachers in Victorian schools to guide EAL pedagogy in the classroom. Therefore, this study aims to provide qualitative evidence for understanding beliefs about diagnostic language assessment (DLA) among EAL teachers in primary and secondary schools in Victoria, Australia. To realise this goal, this study raises the following questions: (a) How do teachers use large-scale assessment data for diagnostic purposes? (b) What skills do language teachers think are necessary for using assessment data for instruction in the classroom? and (c) What factors, if any, contribute to teachers' beliefs about diagnostic assessment in a large-scale assessment? A semi-structured interview method was used to collect data from at least 15 professional teachers who were selected through purposeful sampling. The findings from the resulting data analysis (thematic analysis) provide an understanding of teachers' beliefs about DLA in a classroom context and identify how these beliefs are crystallised in language teachers. The discussion shows how the findings can be used to inform professional development processes for language teachers, as well as informing the important factor of teacher cognition in the pedagogic processes of language assessment. This, hopefully, will help test developers and testing organisations to align the outcome of this study with their test development processes to design assessments that can enhance AfL in language education.

Keywords: beliefs, diagnostic language assessment, English as an additional language, teacher cognition

Procedia PDF Downloads 170
6538 A Network of Nouns and Their Features: A Neurocomputational Study

Authors: Skiker Kaoutar, Mounir Maouene

Abstract:

Neuroimaging studies indicate that a large fronto-parieto-temporal network supports nouns and their features, with some areas storing semantic knowledge (visual, auditory, olfactory, gustatory, …), other areas storing lexical representations, and other areas being implicated in general semantic processing. However, it is not well understood how this fronto-parieto-temporal network can be modulated by different semantic tasks and different semantic relations between nouns. In this study, we combine a behavioral semantic network, functional MRI studies involving object-related nouns, and brain network studies to explain how different semantic tasks and different semantic relations between nouns can modulate the activity within the brain network of nouns and their features. We first describe how nouns and their features form a large-scale brain network. To this end, we examine the connectivities between areas recruited during the processing of nouns to know which configurations of interacting areas are possible. We can thus identify whether, for example, brain areas that store semantic knowledge communicate via functional/structural links with areas that store lexical representations. Second, we examine how this network is modulated by different semantic tasks involving nouns and, finally, we examine how category-specific activation may result from the semantic relations among nouns. The results indicate that the brain network of nouns and their features is highly modulated and flexible under different semantic tasks and semantic relations. In the end, this study can be used as a guide to help neuroscientists interpret the pattern of fMRI activations detected in the semantic processing of nouns. Specifically, this study can help to interpret the category-specific activations observed extensively in a large number of neuroimaging and clinical studies.

Keywords: nouns, features, network, category specificity

Procedia PDF Downloads 490
6537 A Model Suggestion on Competitiveness and Sustainability of SMEs in Developing Countries

Authors: Ahmet Diken, Tahsin Karabulut

Abstract:

The factor that developing countries need is capital. Such countries make an effort to increase their income in order to meet their expenses for employment, infrastructure and superstructure investments, education, health and defense. The sole income of these countries is the taxes collected from businesses. Businesses must make a profit and a return in order to be able to pay tax. In a world where competition exists, different strategies may be followed by businesses in developing countries, and they must specify their target markets. In order to minimize cost and maximize profit, SMEs have to concentrate on target markets and select a cost-oriented strategy. In this study, a theoretical model is suggested in which SME firms act as clusters with each other and also serve as optimal suppliers for large-scale firms. SME policy must be supported by the public sector. This relationship can help large-scale firms to build brands across the world, and this organization increases the value added for developing countries.

Keywords: competitiveness, developing countries, SMEs, sustainability

Procedia PDF Downloads 281
6536 Optimization of Extraction Conditions and Characteristics of Scale Collagen from Sardine: Sardina pilchardus

Authors: F. Bellali, M. Kharroubi, M. Loutfi, N. Bourhim

Abstract:

In Morocco, the fish processing industry is an important source of income and generates a large amount of byproducts, including skins, bones, heads, guts and scales. These underutilized resources, particularly scales, contain large amounts of protein and calcium. Scales from Sardina pilchardus resulting from the processing operations have the potential to be used as a raw material for collagen production. Taking into account this strong expectation of the regional fish industry, the upgrading of sardine scales is well justified. In addition, political and societal demands for sustainability and environmentally friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Therefore, fish scales used as a potential source from which to isolate collagen have a wide range of applications in the food, cosmetic and biomedical industries. The main aim of this study is to isolate and characterize the acid-solubilized collagen from the scales of the sardine, Sardina pilchardus. Experimental design methodology was adopted in collagen processing for extraction optimization. The first stage of this work is to investigate the optimization conditions of the sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with HCl solution or EDTA. The last part establishes the optimum conditions for the isolation of collagen from fish scales by solvent extraction. The basic principle of RSM is to determine model equations that describe the interrelations between the independent variables and the dependent variables.
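
The RSM principle stated in the last sentence can be illustrated with a small sketch: a second-order model fitted by least squares to synthetic data for two coded factors, followed by its stationary point. The factors, response values and coefficients are hypothetical, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)              # coded factor levels (e.g. temperature)
x2 = rng.uniform(-1, 1, 30)              # coded factor levels (e.g. acid concentration)
# Hypothetical response (e.g. collagen yield) with noise
y = 60 + 5 * x1 + 3 * x2 - 4 * x1 * x2 - 6 * x1**2 - 2 * x2**2 + rng.normal(0, 0.5, 30)

# Second-order RSM model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], np.round(coeffs, 2))))

# Stationary point of the fitted surface (candidate optimum of the response)
b = coeffs
grad_mat = np.array([[2 * b[4], b[3]], [b[3], 2 * b[5]]])
stationary = np.linalg.solve(grad_mat, -np.array([b[1], b[2]]))
print("stationary point (coded units):", np.round(stationary, 2))
```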

Keywords: Sardina pilchardus, scales, valorization, collagen extraction, response surface methodology

Procedia PDF Downloads 386