Search results for: fast algorithm
916 Research on Detection of Web Page Visual Salience Region Based on Eye Tracker and Spectral Residual Model
Authors: Xiaoying Guo, Xiangyun Wang, Chunhua Jia
Abstract:
Web pages have become one of the most important ways of knowing the world, and humans gather a great deal of information from them every day. Understanding where humans look when they browse web pages is therefore important. In natural scenes, bottom-up features and top-down tasks significantly affect human eye movements. In this paper, we investigate whether conventional visual saliency algorithms can properly predict the visually attractive regions of web pages. First, we recorded eye-movement data with an eye tracker while participants viewed web pages. By analyzing these data, we studied the influence of visual saliency and thinking style on eye-movement patterns; the analysis showed that thinking style affects eye-movement patterns far more than visual saliency does. Second, we compared the web-page salience regions extracted by the Itti model and the Spectral Residual (SR) model. Compared against the heat maps derived from the eye movements, the SR model performed better than the Itti model. Considering the influence of mental habits on the human visual region of interest, we introduced one of the most important cues in mental habit, fixation position, to improve the SR model. The results showed that the improved SR model better predicts the human visual region of interest in web pages.
Keywords: web page salience region, eye-tracker, spectral residual, visual salience
Procedia PDF Downloads 272
915 Improved Classification Procedure for Imbalanced and Overlapped Situations
Authors: Hankyu Lee, Seoung Bum Kim
Abstract:
The issues of imbalance and overlap in the class distribution are important in various data mining applications. An imbalanced dataset is a special case of a classification problem in which the number of observations of one class (the majority class) heavily exceeds the number of observations of the other class (the minority class). An overlapped dataset is one in which many observations are shared between the two classes. Imbalanced and overlapped data are frequently found in real applications, including fraud and abuse detection in healthcare, quality prediction in manufacturing, text classification, oil spill detection, and remote sensing. The class imbalance and overlap problem is challenging because it degrades the performance of most standard classification algorithms. In this study, we propose a classification procedure that can effectively handle imbalanced and overlapped datasets by splitting the data space into three parts, nonoverlapping, lightly overlapping, and severely overlapping, and applying a classification algorithm in each part. These three parts are determined based on the Hausdorff distance and the margin of a modified support vector machine. An experimental study was conducted to examine the properties of the proposed method and to compare it with other classification algorithms. The results showed that the proposed method outperformed its competitors under various imbalanced and overlapped situations. Moreover, the applicability of the proposed method was demonstrated through an experiment with real data.
Keywords: classification, imbalanced data with class overlap, split data space, support vector machine
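The Hausdorff distance used above to delimit the overlap regions is easy to compute directly. The sketch below is a generic NumPy implementation of the (symmetric) Hausdorff distance between two point sets; the three-way split of the data space and the modified SVM margin are the paper's contribution and are not reproduced here, and the two toy point sets are invented for illustration.

```python
import numpy as np

def directed_hausdorff(A, B):
    """max over a in A of the distance from a to its nearest point in B."""
    # Pairwise Euclidean distances via broadcasting, shape (|A|, |B|)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    return d.min(axis=1).max()

def hausdorff(A, B):
    """Symmetric Hausdorff distance between two point sets."""
    return max(directed_hausdorff(A, B), directed_hausdorff(B, A))

# Toy 2-D classes: the farther apart the sets, the larger the distance.
major = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
minor = np.array([[3.0, 0.0], [4.0, 0.0]])
h = hausdorff(major, minor)
```

A small inter-class Hausdorff distance relative to the class spreads would indicate overlap; a threshold on it could then route points to the nonoverlapping, light, or severe partitions.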
Procedia PDF Downloads 307
914 Patent on Brain: Brain Waves Stimulation
Authors: Jalil Qoulizadeh, Hasan Sadeghi
Abstract:
Brain waves are electrical wave patterns produced in the human brain. Understanding these waves and activating them can have a positive effect on brain function and ultimately help create an ideal life. The brain is able to produce waves from 0.1 Hz to above 65 Hz, and the Beta One device produces exactly this range, which is why the waves produced by the device are said to match the waves produced by the brain. The operation of the device is based on magnetic stimulation of the brain. The technology used in the design and production of the device strengthens and improves the frequencies of brain waves with a predefined algorithm, according to the type of function requested, so that the person can access the expected functions and perform better in daily activities. To evaluate the effect of the field created by the device on neurons and their stimulation, electroencephalography was conducted before and after stimulation, and the two baselines were compared by quantitative electroencephalography (qEEG) using a paired t-test in 39 subjects. The comparison confirms a significant effect of the field on the electrical activity recorded after 30 minutes of stimulation in all subjects. The Beta One device is thus able to induce the appropriate pattern of the expected functions softly and effectively, in accordance with the harmony of brain waves, moving brain activity first to a normal state and then to an enhanced one. The work also aims at the production of inexpensive neuroscience equipment (compared with existing rTMS equipment) for magnetic brain stimulation in clinics, homes, factories and companies, and professional sports clubs.
Keywords: stimulation, brain, waves, betaOne
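The paired t-test used to compare the pre- and post-stimulation qEEG baselines is a one-liner with most statistics packages; a minimal stdlib sketch is given below. The before/after values are made-up numbers for illustration, not the study's 39-subject measurements, and only the t statistic and degrees of freedom are computed (a p-value would additionally need the t-distribution CDF).

```python
import math

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for two matched samples."""
    n = len(before)
    diffs = [a - b for a, b in zip(after, before)]
    mean = sum(diffs) / n
    # Sample variance of the differences (ddof = 1)
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n), n - 1

# Hypothetical qEEG band-power values before/after stimulation (invented data):
before = [10.1, 9.8, 10.5, 10.0, 9.9, 10.2]
after = [10.9, 10.4, 11.2, 10.6, 10.3, 11.0]
t_stat, df = paired_t(before, after)
```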
Procedia PDF Downloads 79
913 Getting Out of the Box: Tangible Music Production in the Age of Virtual Technological Abundance
Authors: Tim Nikolsky
Abstract:
This paper explores the different ways in which music producers choose to embrace various levels of technology based on musical values, objectives, affordability, access, and workflow benefits. The current digital audio production workflow is questioned. Engineers and music producers today are increasingly divorced from the tangibility of music production: making music no longer requires you to reach over and turn a knob, and ideas of authenticity in music production are being redefined. Calculations from a mathematical algorithm with pretty pictures are increasingly chosen over hardware containing transformers and tubes. Are mouse clicks and movements equivalent or inferior to the master brush strokes we seek to conjure? We make audio production decisions visually, constantly looking at a screen rather than listening. Have we compromised our musical objectives and values by removing the hands-on nature of music making? DAW interfaces make musical decisions for us, and not necessarily in our best interests. Technological innovation has presented opportunities as well as challenges for education: what do music production students actually need to learn in a formalised education environment, and to what extent do they need to know it? In this brave new world of omnipresent music creation tools, do we still need tangibility in music production? Interviews with prominent Australian music producers working in a variety of fields are featured in this paper; they provide insight into these questions and move towards an understanding of how tangibility can be rediscovered in the next generation of music production.
Keywords: analogue, digital, digital audio workstation, music production, plugins, tangibility, technology, workflow
Procedia PDF Downloads 270
912 Increasing of Gain in Unstable Thin Disk Resonator
Authors: M. Asl. Dehghan, M. H. Daemi, S. Radmard, S. H. Nabavi
Abstract:
Thin disk lasers are engineered for efficient thermal cooling and exhibit superior performance in this respect. However, the disk thickness and the large pumped area make it difficult to use this gain format in a resonator when constructing a single-mode laser. Choosing an unstable resonator design is beneficial for this purpose. On the other hand, the low-gain medium restricts the application of unstable resonators to low magnifications and therefore to poor beam quality. A promising idea for enabling the application of unstable resonators to wide-aperture, low-gain lasers is to couple a fraction of the out-coupled radiation back into the resonator. The output coupling then depends on the ratio of the back reflection and can be adjusted independently of the magnification. The excitation of the converging wave can be achieved with an external reflector. The resonator performance is predicted numerically. First, the threshold condition of linear, V-shaped, and 2V-shaped resonators is investigated. The results show that the maximum magnification is 1.066, which is very low for high-quality purposes. Inserting an additional reflector compensates for the low gain. The reflectivity and the related magnification of a 350-micron Yb:YAG disk are calculated. The theoretical model is based on coupled Kirchhoff integrals and is solved numerically by the Fox and Li algorithm. The results show that with a back-reflection mechanism, combined with an increased number of beam incidences on the disk, high gain and high magnification can be achieved.
Keywords: unstable resonators, thin disk lasers, gain, external reflector
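The Fox and Li algorithm mentioned above is, at its core, a power iteration: a trial field is propagated back and forth between the mirrors with a Fresnel (Kirchhoff) kernel until it converges to the lowest-loss transverse mode. The sketch below is a generic 1-D strip-resonator version with an arbitrary Fresnel number of 1, not the paper's coupled-Kirchhoff-integral model of the V-shaped disk resonator; mirror half-width, wavelength, and separation are illustrative.

```python
import numpy as np

def fox_li_mode(N=200, a=1.0, wavelength=0.01, L=100.0, n_iter=300):
    """Fox-Li iteration for a symmetric strip resonator: repeatedly apply
    the discretized 1-D Fresnel kernel between two mirrors of half-width a,
    separated by L. The field converges to the lowest-loss transverse mode,
    and the Rayleigh quotient gamma to its round-trip eigenvalue."""
    x = np.linspace(-a, a, N)
    dx = x[1] - x[0]
    k = 2 * np.pi / wavelength
    # Fresnel kernel with midpoint quadrature; Fresnel number a^2/(lambda L) = 1
    K = (np.sqrt(1j / (wavelength * L))
         * np.exp(-1j * k * (x[:, None] - x[None, :]) ** 2 / (2 * L)) * dx)
    u = np.ones(N, dtype=complex)
    for _ in range(n_iter):
        v = K @ u
        gamma = np.vdot(u, v) / np.vdot(u, u)
        u = v / np.linalg.norm(v)
    return x, u, gamma

x, mode, gamma = fox_li_mode()
transit_power_loss = 1.0 - abs(gamma) ** 2  # diffraction loss per transit
```

The magnitude of the dominant eigenvalue gives the diffraction loss per transit; in the paper's setting the kernel additionally encodes the gain of the disk and the external back reflection.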
Procedia PDF Downloads 410
911 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding information relevant to their dynamic interests. Current research considers noise to be any data that does not form part of the main web page and proposes noise web data reduction tools that mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data that form part of the main web page are of interest to a user, and not all noise data are actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noise level in a web user profile but also a decrease in the loss of useful information, hence improving the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 263
910 Coding and Decoding versus Space Diversity for Rayleigh Fading Radio Frequency Channels
Authors: Ahmed Mahmoud Ahmed Abouelmagd
Abstract:
Diversity is the usual remedy for variations in transmitted signal level (fading phenomena) in radio frequency channels. Diversity techniques utilize two or more copies of a signal and combine them to combat fading. The basic concept of diversity is to transmit the signal via several independent diversity branches to obtain independent signal replicas in the time, frequency, space, and polarization diversity domains. Coding and decoding processes can be an alternative remedy for fading: they cannot increase the channel capacity, but they can improve the error performance. In this paper we propose the use of replication decoding with the BCH code class, and the Viterbi decoding algorithm with convolutional coding, as examples of coding and decoding processes. The results are compared with those obtained from two optimized selection space diversity techniques. The performance of the Rayleigh fading channel, the model considered for radio frequency channels, is evaluated for each case. The evaluation results show that the coding and decoding approaches, especially BCH coding with a replication decoding scheme, give better performance than the selection space diversity optimization approaches. An approach combining the coding and decoding diversity with space diversity is also considered; its main disadvantage is its complexity, but it yields good performance results.
Keywords: Rayleigh fading, diversity, BCH codes, replication decoding, convolutional coding, Viterbi decoding, space diversity
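The selection space diversity baseline compared against above can be illustrated with a short Monte-Carlo sketch: on a Rayleigh channel, each branch's instantaneous SNR is exponentially distributed, and the selection combiner keeps the strongest branch. This is a generic textbook illustration (sample counts, mean SNR, and outage threshold are arbitrary), not the paper's optimized schemes.

```python
import numpy as np

rng = np.random.default_rng(0)

def selection_diversity_snr(n_samples, n_branches, mean_snr=1.0):
    """Post-combining SNR of L-branch selection diversity over Rayleigh
    fading: each branch SNR ~ Exponential(mean_snr) (|h|^2 for complex
    Gaussian h); the combiner picks the strongest branch."""
    branch_snr = rng.exponential(mean_snr, size=(n_samples, n_branches))
    return branch_snr.max(axis=1)

single = selection_diversity_snr(200_000, 1)
dual = selection_diversity_snr(200_000, 2)

# Outage probability: fraction of time the SNR falls below a threshold.
threshold = 0.1
outage_single = np.mean(single < threshold)
outage_dual = np.mean(dual < threshold)
```

Analytically the outage of L-branch selection is (1 - exp(-threshold/mean_snr))^L, so the dual-branch outage should be roughly the square of the single-branch value.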
Procedia PDF Downloads 438
909 Development of Precise Ephemeris Generation Module for Thaichote Satellite Operations
Authors: Manop Aorpimai, Ponthep Navakitkanok
Abstract:
In this paper, the development of the ephemeris generation module used for the Thaichote satellite operations is presented. It is a vital part of the flight dynamics system, which comprises the orbit determination, orbit propagation, event prediction, and station-keeping maneuver modules. In the generation of the spacecraft ephemeris data, the estimated orbital state vector from the orbit determination module is used as the initial condition. The equations of motion are then integrated forward in time to predict the satellite states. The higher geopotential harmonics, as well as other disturbing forces, are taken into account to resemble the environment in low-earth orbit. Using a highly accurate numerical integrator based on the Bulirsch-Stoer algorithm, the ephemeris data can be generated for long-term predictions with a relatively small computational burden and short calculation time. Events occurring during the prediction that are related to mission operations, such as the satellite's rise/set as viewed from the ground station, Earth and Moon eclipses, the drift in the ground track, and the drift in the local solar time of the orbital plane, are all detected and reported. When combined with the other modules to form a flight dynamics system, this application is intended to be applied to the Thaichote satellite and successive Thailand Earth-observation missions.
Keywords: flight dynamics system, orbit propagation, satellite ephemeris, Thailand's Earth Observation Satellite
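The core of any orbit propagation module is numerical integration of the equations of motion. The sketch below is a deliberately simplified stand-in: point-mass (two-body) gravity only, integrated with fixed-step RK4 rather than the Bulirsch-Stoer scheme and geopotential harmonics the paper actually uses; the near-circular orbit radius is only loosely inspired by Thaichote's ~822 km altitude.

```python
import numpy as np

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def two_body(state):
    """Derivative of [x, y, z, vx, vy, vz] under point-mass gravity only."""
    r = state[:3]
    a = -MU * r / np.linalg.norm(r) ** 3
    return np.concatenate([state[3:], a])

def rk4_propagate(state, dt, n_steps):
    """Fixed-step 4th-order Runge-Kutta integration of the two-body motion."""
    for _ in range(n_steps):
        k1 = two_body(state)
        k2 = two_body(state + 0.5 * dt * k1)
        k3 = two_body(state + 0.5 * dt * k2)
        k4 = two_body(state + dt * k3)
        state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return state

# Circular low-earth orbit at radius 7200 km (illustrative initial state)
r0 = 7200.0
v0 = np.sqrt(MU / r0)
state0 = np.array([r0, 0.0, 0.0, 0.0, v0, 0.0])
state1 = rk4_propagate(state0, dt=10.0, n_steps=600)  # ~one orbital period
```

For a circular two-body orbit the radius and speed are conserved, which gives a quick sanity check on the integrator.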
Procedia PDF Downloads 375
908 An Assessment of Nodulation and Nitrogen Fixation of Lessertia Frutescens Plants Inoculated with Rhizobial Isolates from the Cape Fynbos
Authors: Mokgadi Miranda Hlongwane, Ntebogeng Sharon Mokgalaka, Felix Dapare Dakora
Abstract:
Lessertia (L.) frutescens (syn. Sutherlandia frutescens) is a leguminous medicinal plant indigenous to South Africa. Traditionally, L. frutescens has been used to treat cancer, diabetes, epilepsy, fever, HIV, stomach problems, wounds, and other ailments. This legume is endemic to the Cape fynbos, with large populations occurring wild and cultivated in the Cape Floristic Region. Its widespread distribution in the Western Cape, Northern Cape, Eastern Cape, and KwaZulu-Natal is linked to its increased use as a phytomedicine in the treatment of various diseases by traditional healers. The frequent harvesting of field plants for use as a medicine has made it necessary to undertake studies towards the conservation of Lessertia frutescens. As a legume, this species can form root nodules and fix atmospheric N₂ when in symbiosis with soil bacteria called rhizobia. So far, however, few studies (if any) have been done on the efficacy and diversity of native bacterial symbionts nodulating L. frutescens in South Africa. The aim of this project was to isolate and characterize L. frutescens-nodulating bacteria from five different locations in the Western Cape Province. This was done by trapping soil rhizobia, using a rhizosphere soil suspension to inoculate L. frutescens seedlings growing in sterilized sand and receiving sterile N-free Hoagland nutrient solution under glasshouse conditions. At 60 days after planting, root nodules were harvested from the L. frutescens plants, surface-sterilized, macerated, streaked on yeast mannitol agar (YMA) plates, and incubated at 28 ˚C for observation of bacterial growth. The majority of isolates were slow-growers that took 6-14 days to appear on YMA plates. However, seven isolates were fast-growers, taking 2-4 days to appear on YMA plates. Single-colony cultures of the isolates were assessed for their ability to nodulate L. frutescens as a homologous host under glasshouse conditions.
Of the 92 bacterial isolates tested, 63 elicited nodule formation on L. frutescens. Symbiotic effectiveness varied markedly between and among test isolates. There were also significant (p≤0.005) differences in nodulation, shoot biomass, photosynthetic rates, leaf transpiration and stomatal conductance of L. frutescens plants inoculated with the test isolates, which is an indication of their functional diversity.
Keywords: lessertia frutescens, nodulating, rhizobia, symbiotic effectiveness
Procedia PDF Downloads 191
907 Souk Waqif in Old Doha, Qatar: Cultural Heritage, Urban Regeneration, and Sustainability
Authors: Djamel Boussaa
Abstract:
Cultural heritage and tourism have become, during the last two decades, dynamic areas of development in the world. The idea of heritage is crucial to the critical decision-making process about how irreplaceable resources are to be utilized by people of the present or conserved for future generations in a fast-changing world. In view of the importance of heritage to the development of a tourist destination, the need to develop appropriate adaptive reuse strategies cannot be overemphasized. In October 1999, the 12th General Assembly of ICOMOS in Mexico stated that, in the context of sustainable development, two interrelated issues need urgent attention: cultural tourism and historic towns and cities. These two issues underscore the fact that historic resources are non-renewable and belong to all of humanity; without adequate adaptive reuse actions to ensure a sustainable future, these historic resources may vanish completely. The growth of tourism, and its role in opening cultural heritage to everyone, is developing rapidly. According to the World Tourism Organization, natural and cultural heritage resources are, and will remain, motivating factors for travel in the foreseeable future; according to the experts, people choose travel destinations where they can learn about traditional and distinct cultures in their historic context. Qatar's rich urban heritage is now being recognized as a valuable resource for future development. This paper discusses the role of cultural heritage and tourism in regenerating Souk Waqif and, consequently, the city of Doha. In order to use cultural heritage wisely, it will be necessary to position heritage as an essential element of sustainable development, giving particular attention to cultural heritage and tourism. The research methodology is based on an empirical survey of the situation, drawing on several visits, meetings, and interviews with the local heritage players.
The rehabilitation project initiated in 2004 is examined and assessed, and directions are proposed for a sustainable future for this historic landmark. Conservation for the sake of conservation appears to be an outdated concept: many irreplaceable natural and cultural sites are being compromised because local authorities give no economic consideration to the value of rehabilitating such sites. The question raised here is 'How can cultural heritage be used wisely for tourism without compromising its social sustainability within the emerging global world?'
Keywords: cultural heritage, tourism, regeneration, economy, social sustainability
Procedia PDF Downloads 418
906 Introduce a New Model of Anomaly Detection in Computer Networks Using Artificial Immune Systems
Authors: Mehrshad Khosraviani, Faramarz Abbaspour Leyl Abadi
Abstract:
Computer networks are a fundamental component of the modern information society, and they are generally connected to the Internet. Because the Internet was not originally designed with security in mind, these networks have in recent decades been subjected to many serious attacks. Today, different security tools and systems, including intrusion detection systems, are used to provide network security. In this paper, an anomaly detection system based on artificial immunity is designed and evaluated. The idea of using artificial immune methods to detect anomalies in computer networks is motivated by their properties, which match the needs of such systems: the ability to detect previously unseen anomalies and a variety of attacks, memory, the ability to learn, and the self-regulation characteristic of artificial immune algorithms. The proposed anomaly detection system requires only normal samples of network traffic for training; no additional data about the types of attacks are needed. In the proposed system, positive selection and negative selection processes are used to build a colony of detectors that distinguishes normal behavior from attacks. Evaluation on a real data collection indicates that the proposed system often achieves a low false alarm rate compared with other methods, while its detection rate is within the range of the compared methods.
Keywords: artificial immune system, abnormality detection, intrusion detection, computer networks
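The abstract gives little algorithmic detail, but the negative selection process it names has a well-known generic form: random candidate detectors that match any "self" (normal) sample are discarded, and the survivors flag anything they cover as anomalous. The sketch below is that textbook form with a toy 2-D feature space and invented thresholds, not the authors' system.

```python
import random

random.seed(42)

def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def generate_detectors(self_set, n_detectors, dim, radius):
    """Negative selection: random candidates that match any 'self'
    (normal) sample within `radius` are censored; survivors become
    anomaly detectors."""
    detectors = []
    while len(detectors) < n_detectors:
        cand = [random.random() for _ in range(dim)]
        if all(dist(cand, s) > radius for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, radius):
    """A sample is flagged when any detector covers it."""
    return any(dist(sample, d) <= radius for d in detectors)

# Toy "normal traffic" clustered near the origin of a 2-D feature space
normal = [[random.uniform(0.0, 0.3), random.uniform(0.0, 0.3)]
          for _ in range(50)]
detectors = generate_detectors(normal, n_detectors=30, dim=2, radius=0.15)
```

By construction no detector lies within the matching radius of any training sample, so normal traffic resembling the training set is not flagged, while samples far from the self region tend to fall under some detector.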
Procedia PDF Downloads 353
905 Multi-Analyte Indium Gallium Zinc Oxide-Based Dielectric Electrolyte-Insulator-Semiconductor Sensing Membranes
Authors: Chyuan Haur Kao, Hsiang Chen, Yu Sheng Tsai, Chen Hao Hung, Yu Shan Lee
Abstract:
Dielectric electrolyte-insulator-semiconductor (EIS) sensing-membrane-based biosensors have been intensively investigated because of their simple fabrication, low cost, and fast response. However, to enhance their sensing performance, it is worthwhile to explore alternative materials, distinct processes, and novel treatments. An ISFET can be viewed as a variation of a MOSFET with the dielectric oxide layer acting as the sensing membrane; the modulation of the gate work function caused by electrolytes at various ion concentrations can then be used to calculate those concentrations. Recently, owing to the advancement of CMOS technology, several high-dielectric-constant materials have been adopted as the sensing membranes of EIS structures. An EIS with a stacked SiO₂ layer between the sensing membrane and the silicon substrate exhibited a high pH sensitivity and good long-term stability. IGZO is a wide-bandgap (~3.15 eV) semiconductor of the III-VI semiconductor group with several preferable properties, including good transparency, high electron mobility, a wide band gap, and compatibility with CMOS technology. In this work, IGZO was deposited on a p-type silicon wafer by reactive radio frequency (RF) sputtering at various Ar:O₂ gas ratios and was treated with rapid thermal annealing (RTA) in an O₂ ambient. The sensing performance, including sensitivity, hysteresis, and drift rate, was measured, and XRD, XPS, and AFM analyses were used to study the material properties of the IGZO membrane. Moreover, IGZO was used as the sensing membrane in dielectric EIS biosensor structures. In addition to traditional pH sensing, detection of Na⁺, K⁺, urea, glucose, and creatinine concentrations was performed, and post-RTA treatment was confirmed to improve the material properties and enhance the multi-analyte sensing capability for various ions and chemicals in solution.
In this study, the IGZO sensing membrane annealed in an O₂ ambient exhibited higher sensitivity, higher linearity, higher H⁺ selectivity, lower hysteresis voltage, and a lower drift rate. The results indicate that the IGZO dielectric sensing membrane in the EIS structure is promising for future biomedical device applications.
Keywords: dielectric sensing membrane, IGZO, hydrogen ion, plasma, rapid thermal annealing
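The sensitivity and linearity figures of merit reported for such membranes are usually extracted as the least-squares slope (in mV/pH) and R² of the response voltage versus buffer pH. The sketch below shows that extraction on made-up voltage readings, not the paper's measurements; the ideal Nernstian response at room temperature is about 59.2 mV/pH for reference.

```python
import numpy as np

def ph_sensitivity(ph, voltage):
    """Least-squares slope (mV/pH) and R^2 of response voltage vs pH —
    the usual sensitivity and linearity figures for an EIS membrane."""
    slope, intercept = np.polyfit(ph, voltage, 1)
    fitted = slope * np.asarray(ph, dtype=float) + intercept
    ss_res = np.sum((np.asarray(voltage, dtype=float) - fitted) ** 2)
    ss_tot = np.sum((np.asarray(voltage, dtype=float) - np.mean(voltage)) ** 2)
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical response voltages (mV) for buffers pH 2 to 12 (invented data):
ph = [2, 4, 6, 8, 10, 12]
voltage = [100, 212, 330, 441, 549, 662]
slope, r2 = ph_sensitivity(ph, voltage)
```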
Procedia PDF Downloads 250
904 Comparison of Two Maintenance Policies for a Two-Unit Series System Considering General Repair
Authors: Seyedvahid Najafi, Viliam Makis
Abstract:
In recent years, maintenance optimization has attracted special attention due to the growth of industrial systems complexity. Maintenance costs are high for many systems, and preventive maintenance is effective when it increases operations' reliability and safety at a reduced cost. The novelty of this research is to consider general repair in the modeling of multi-unit series systems and solve the maintenance problem for such systems using the semi-Markov decision process (SMDP) framework. We propose an opportunistic maintenance policy for a series system composed of two main units. Unit 1, which is more expensive than unit 2, is subjected to condition monitoring, and its deterioration is modeled using a gamma process. Unit 1 hazard rate is estimated by the proportional hazards model (PHM), and two hazard rate control limits are considered as the thresholds of maintenance interventions for unit 1. Maintenance is performed on unit 2, considering an age control limit. The objective is to find the optimal control limits and minimize the long-run expected average cost per unit time. The proposed algorithm is applied to a numerical example to compare the effectiveness of the proposed policy (policy Ⅰ) with policy Ⅱ, which is similar to policy Ⅰ, but instead of general repair, replacement is performed. Results show that policy Ⅰ leads to lower average cost compared with policy Ⅱ.
Keywords: condition-based maintenance, proportional hazards model, semi-Markov decision process, two-unit series systems
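The gamma process used above to model unit 1's deterioration has a convenient simulation form: independent gamma-distributed increments, so sample paths are non-decreasing, as cumulative wear should be. The sketch below simulates such paths with arbitrary illustrative parameters (shape rate, scale, failure threshold); it is not the paper's SMDP policy optimization.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma_process_paths(n_paths, n_steps, dt, shape_rate, scale):
    """Stationary gamma process: independent Gamma(shape_rate*dt, scale)
    increments per time step, cumulated so every path is non-decreasing."""
    increments = rng.gamma(shape_rate * dt, scale, size=(n_paths, n_steps))
    return np.cumsum(increments, axis=1)

paths = gamma_process_paths(n_paths=1000, n_steps=50, dt=1.0,
                            shape_rate=0.8, scale=0.5)

# A (random) lifetime: number of steps each path stays below a failure level.
threshold = 10.0
lifetimes = (paths < threshold).sum(axis=1)
```

The mean degradation grows linearly, shape_rate × scale per unit time (0.4 here), which is the kind of closed-form property that makes the gamma process attractive for condition-based maintenance models.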
Procedia PDF Downloads 122
903 The Use of Remotely Sensed Data to Extract Wetlands Area in the Cultural Park of Ahaggar, South of Algeria
Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur
Abstract:
The Cultural Park of the Ahaggar, occupying a large area of Algeria, is characterized by rich wetlands that must be preserved and managed in both time and space. Managing such a large and complex area requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), and conventional, traditional methods are quite difficult to apply. Remote sensing, with its efficiency in environmental applications, has become an indispensable solution for this kind of study. Remotely sensed imaging data have enabled very interesting applications in the last decade; they can aid in several domains, such as the detection and identification of diverse wetland surface targets, topographical details, and geological features. In this work, we attempt to extract wetland areas automatically using multispectral remotely sensed data from the Earth Observing-1 (EO-1) and Landsat satellites. Both carry high-resolution multispectral imagers with a 30 m resolution, imaging an interesting surface area. We used images acquired over several areas of interest in the Cultural Park of the Ahaggar in the south of Algeria. An extraction algorithm is applied to several spectral indices, obtained from combinations of different spectral bands, to extract the wetland fraction of land use. The obtained results show the accuracy of distinguishing wetland areas from the other land-use themes using a fine exploitation of the spectral indices.
Keywords: multispectral data, EO1, landsat, wetlands, Ahaggar, Algeria
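One common spectral index for water and wetland extraction from Landsat-class imagery is the Normalized Difference Water Index, NDWI = (green − NIR)/(green + NIR), thresholded into a binary mask. The sketch below shows that generic index on a tiny made-up reflectance scene; the abstract does not specify which indices the authors combined, so this is only one plausible ingredient of their algorithm.

```python
import numpy as np

def ndwi(green, nir):
    """Normalized Difference Water Index: (green - NIR) / (green + NIR).
    Water and wet surfaces push NDWI towards positive values."""
    green = green.astype(float)
    nir = nir.astype(float)
    return (green - nir) / (green + nir + 1e-12)

def water_mask(green, nir, threshold=0.0):
    """Binary wetland/water mask from an NDWI threshold."""
    return ndwi(green, nir) > threshold

# Toy 2x2 scene: one wet pixel (high green, low NIR), three dry pixels.
green = np.array([[0.30, 0.10], [0.12, 0.08]])
nir = np.array([[0.05, 0.40], [0.35, 0.30]])
mask = water_mask(green, nir)
```

On real scenes the same two lines run unchanged over full 30 m band rasters, with the wetland fraction given by `mask.mean()`.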
Procedia PDF Downloads 375
902 Study of Durability of Porous Polymer Materials, Glass-Fiber-Reinforced Polyurethane Foam (R-PUF) in MarkIII Containment Membrane System
Authors: Florent Cerdan, Anne-Gaëlle Denay, Annette Roy, Jean-Claude Grandidier, Éric Laine
Abstract:
The insulation of the MarkIII membrane of Liquefied Natural Gas Carriers (LNGC) consists of a load-bearing system made of panels of glass-fiber-reinforced polyurethane foam (R-PUF). During shipping, the cargo containment is potentially subject to risk events, such as water leakage through the ballast tank wall. The aim of the present work is to further develop the understanding of water transfer mechanisms and of the effect of water on the properties of R-PUF; this multi-scale approach contributes to improving durability.
Macroscale/mesoscale: First, the gravimetric technique was used to determine, at room temperature, the water transfer mechanisms and diffusion kinetics in the R-PUF. The solubility follows a first, fast-growing kinetic stage connected to water absorption by the micro-porosity, and then evolves slowly and linearly; this second stage is connected to molecular diffusion and dissolution of water in the dense polyurethane membranes. Second, to improve the understanding of the transfer mechanism, the evolution of the buoyant force was studied, which allowed the effect of the balance of total and partial pressures of the gas mixture contained in the surface pores to be identified.
Mesoscale/microscale: Differential scanning calorimetry (DSC) and dynamic mechanical analysis (DMA) were used to investigate the hydration of the hard and soft segments of the polyurethane matrix, in order to identify the sensitivity of these two phases. It was shown that the glass transition temperatures shift towards lower temperatures as the water solubility increases; these observations point to a plasticization of the polymer matrix.
Microscale: A Fourier transform infrared (FTIR) study was used to characterize the functional groups at the edge, at the center, and midway through the sample according to the duration of submersion. The more water there is in the material, the more the water binds to the urethane groups and, more specifically, to the amide groups. The C=O urethane peak shifts quickly to lower frequencies before 24 hours of submersion and then evolves slowly, while the intensity of the peak decreases more gradually after that.
Keywords: porous materials, water sorption, glass transition temperature, DSC, DMA, FTIR, transfer mechanisms
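For gravimetric sorption data of this kind, the first (fast) uptake stage is commonly fitted with the Fickian plane-sheet solution, M(t)/M∞ = 1 − (8/π²) Σₙ exp(−(2n+1)²π²Dt/L²)/(2n+1)². The sketch below evaluates that series with purely illustrative D and L values; it is a generic fitting function, not the paper's two-stage model or its fitted parameters.

```python
import math

def fickian_uptake(t, D, L, n_terms=200):
    """Fractional mass uptake M(t)/M_inf for Fickian diffusion into a plane
    sheet of thickness L (both faces exposed), diffusivity D:
    M/M_inf = 1 - (8/pi^2) * sum_n exp(-(2n+1)^2 pi^2 D t / L^2) / (2n+1)^2
    """
    s = 0.0
    for n in range(n_terms):
        m = 2 * n + 1
        s += math.exp(-m * m * math.pi ** 2 * D * t / L ** 2) / (m * m)
    return 1.0 - (8.0 / math.pi ** 2) * s

# Illustrative numbers only: D in mm^2/h, L in mm, times in hours.
D, L = 1e-3, 10.0
times = (0.0, 24.0, 240.0, 2400.0, 24000.0)
uptake = [fickian_uptake(t, D, L) for t in times]
```

Fitting D to the early, √t-linear part of the measured mass gain, and attributing the later linear drift to the slow dissolution stage, is the usual way such two-stage curves are decomposed.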
Procedia PDF Downloads 527
901 A Pipeline for Detecting Copy Number Variation from Whole Exome Sequencing Using Comprehensive Tools
Authors: Cheng-Yang Lee, Petrus Tang, Tzu-Hao Chang
Abstract:
Copy number variations (CNVs) play an important role in many human diseases, such as autism, schizophrenia, and a number of cancers. Many disease variants are found in genome coding regions, and whole exome sequencing (WES) is a cost-effective and powerful technology for detecting variants that are enriched in exons, with potential applications in the clinical setting. Although several algorithms have been developed to detect CNVs using WES, and have been compared with other algorithms on their authors' own samples to find the most suitable methods, there have been no consistent datasets across most of the algorithms with which to evaluate their CNV detection ability. On the other hand, most of the algorithms use a command line interface, which may greatly limit the analysis capability of many laboratories. We create a series of simulated WES datasets from UCSC hg19 chromosome 22 and then evaluate the CNV detection ability of 19 algorithms from the OMICtools database using these simulated datasets, computing the sensitivity, specificity, and accuracy of each algorithm for validation of the exome-derived CNVs. After comparing the 19 algorithms, we construct a platform that installs all of them in a virtual machine, such as VirtualBox, which can conveniently be set up on local computers, and we create a simple script that makes it easy to detect CNVs using the algorithms selected by users. We also build a table elaborating many kinds of properties, such as input requirements and CNV detection ability, for all of the algorithms, providing users with a specification for choosing the optimum algorithms.
Keywords: whole exome sequencing, copy number variations, omictools, pipeline
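The sensitivity, specificity, and accuracy used above to score each CNV caller against the simulated truth set reduce to confusion-matrix counts over intervals. The sketch below shows that computation on invented binary calls; real evaluations must additionally decide what counts as a hit (e.g. reciprocal overlap between called and simulated CNV intervals), which is glossed over here.

```python
def confusion_metrics(truth, predicted):
    """Sensitivity, specificity, and accuracy from binary call lists
    (1 = CNV present in an interval, 0 = absent)."""
    tp = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(truth, predicted) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(truth, predicted) if t == 1 and p == 0)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(truth)
    return sensitivity, specificity, accuracy

# Toy calls over 10 simulated intervals (illustrative numbers only):
truth     = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
predicted = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
sens, spec, acc = confusion_metrics(truth, predicted)
```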
Procedia PDF Downloads 317900 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators
Authors: Andrea Bellucci, Martina Tofi
Abstract:
The aim of this paper is to analyze business models of bancassurance in Italy for the life business. The life insurance business is highly developed in the Italian market, where bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by leveraging its strength in the distribution channel, while the market share of independent agents is decreasing. Starting with the main business models of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market through balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by the Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is based on a bottom-up analysis, starting with variables and indicators to define the classification of business models. Ward's statistical classification algorithm is employed to design business model profiles. The result of the analysis is a representation of the main business models characterized by their indicator profiles. In this way, an unsupervised analysis is developed; it is limited by its judgmental dimension based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.Keywords: bancassurance, business model, non life bancassurance, insurance business value drivers
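Ward's agglomerative criterion merges, at each step, the pair of clusters whose union least increases the within-cluster sum of squares. A minimal sketch on a single indicator is given below; the indicator values are invented for illustration and are not from the ANIA database.

```python
def ward_cost(a, b):
    """Ward's merge cost: increase in within-cluster sum of squares."""
    na, nb = len(a), len(b)
    ma = sum(a) / na
    mb = sum(b) / nb
    return na * nb / (na + nb) * (ma - mb) ** 2

def ward_cluster(points, k):
    """Agglomerate 1-D observations into k clusters with Ward's criterion."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        # Merge the pair of clusters whose union least increases the variance.
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: ward_cost(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# Hypothetical balance sheet indicator for six bancassurance companies.
values = [0.10, 0.12, 0.11, 0.55, 0.60, 0.58]
groups = ward_cluster(values, 2)
```

In practice the clustering runs on the full vector of balance sheet indicators per company; this one-dimensional version only shows the merge criterion.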
Procedia PDF Downloads 296899 Full-Field Estimation of Cyclic Threshold Shear Strain
Authors: E. E. S. Uy, T. Noda, K. Nakai, J. R. Dungca
Abstract:
Cyclic threshold shear strain is the cyclic shear strain amplitude that serves as an indicator of the development of pore water pressure. The parameter can be obtained by performing a cyclic triaxial test, a shaking table test, a cyclic simple shear test, or a resonant column test. In a cyclic triaxial test, other researchers install measuring devices in close proximity to the soil to measure the parameter. In this study, an attempt was made to estimate the cyclic threshold shear strain using a full-field measurement technique. The technique uses a camera to monitor and measure the movement of the soil. For this study, the technique was incorporated into a strain-controlled consolidated undrained cyclic triaxial test. Calibration of the camera was first performed to ensure that the camera could properly measure the deformation under cyclic loading. Its capacity to measure deformation was also investigated using a cylindrical rubber dummy. Two-dimensional image processing was implemented, and the Lucas-Kanade optical flow algorithm was applied to track the movement of the soil particles. Results from the full-field measurement technique were compared with results from a linear variable displacement transducer. A range of values was determined from the estimation, owing to the nonhomogeneous deformation of the soil observed during cyclic loading. The minimum values were on the order of 10⁻²% in some areas of the specimen.Keywords: cyclic loading, cyclic threshold shear strain, full-field measurement, optical flow
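The Lucas-Kanade step can be sketched for a single window as below: it solves a 2x2 least-squares system built from image gradients to recover one displacement vector. The synthetic Gaussian pattern and the half-pixel shift are illustrative only, not the study's specimen images.

```python
import numpy as np

def lucas_kanade_window(I0, I1):
    """Estimate one (vx, vy) displacement for a whole window by least squares."""
    Iy, Ix = np.gradient(I0)          # spatial gradients
    It = I1 - I0                      # temporal gradient
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)      # [vx, vy]

# Synthetic smooth pattern shifted by 0.5 px in x between the two frames.
y, x = np.mgrid[0:32, 0:32].astype(float)
blob = lambda dx: np.exp(-((x - 16 - dx) ** 2 + (y - 16) ** 2) / 40.0)
vx, vy = lucas_kanade_window(blob(0.0), blob(0.5))
```

In the full technique this estimate is computed per interrogation window across the specimen surface, which is what exposes the nonhomogeneous deformation noted above.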
Procedia PDF Downloads 233898 Multi-Objective Optimization of a Solar-Powered Triple-Effect Absorption Chiller for Air-Conditioning Applications
Authors: Ali Shirazi, Robert A. Taylor, Stephen D. White, Graham L. Morrison
Abstract:
In this paper, a detailed simulation model of a solar-powered triple-effect LiBr-H2O absorption chiller is developed to supply both the cooling and heating demand of a large-scale building, aiming to reduce fossil fuel consumption and greenhouse gas emissions in the building sector. TRNSYS 17 is used to simulate the performance of the system over a typical year. A combined energetic-economic-environmental analysis is conducted to determine the system's annual primary energy consumption and total cost, which are considered as two conflicting objectives. A multi-objective optimization of the system is performed using a genetic algorithm to minimize these objectives simultaneously. The optimization results show that the final optimal design of the proposed plant has a solar fraction of 72% and leads to annual primary energy savings of 0.69 GWh and an annual CO2 emissions reduction of ~166 tonnes, compared to a conventional HVAC system. The economics of this design, however, are not appealing without public funding, which is often the case for many renewable energy systems. The results show that a good funding policy is required for these technologies to achieve satisfactory payback periods within the lifetime of the plant.Keywords: economic, environmental, multi-objective optimization, solar air-conditioning, triple-effect absorption chiller
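With two conflicting objectives, the genetic algorithm returns a set of non-dominated trade-off designs rather than a single optimum. The dominance filter at the heart of that idea can be sketched as follows; the design points (primary energy, total cost) are invented, not results from the TRNSYS model.

```python
def pareto_front(designs):
    """Keep designs not dominated by any other (both objectives minimized)."""
    front = []
    for i, (e1, c1) in enumerate(designs):
        dominated = any(
            (e2 <= e1 and c2 <= c1) and (e2 < e1 or c2 < c1)
            for j, (e2, c2) in enumerate(designs) if j != i
        )
        if not dominated:
            front.append((e1, c1))
    return front

# Hypothetical (primary energy [GWh], total cost [M$]) design candidates.
candidates = [(1.2, 3.0), (0.9, 3.5), (1.5, 2.5), (1.3, 3.2), (0.8, 4.0)]
front = pareto_front(candidates)
```

A final design such as the 72% solar fraction plant is then picked from this front according to the decision maker's weighting of energy versus cost.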
Procedia PDF Downloads 238897 Mugil cephalus Presents a Feasible Alternative To Lates calcarifer Farming in Brackishwater: Evidence From Grey Mullet Mugil Cephalus Farming in Bangladesh
Authors: Asif Hasan
Abstract:
Among the reported suitable mariculture species in Bangladesh, seabass and mullet are the two most popular candidates due to their high market values. Although several field studies have been conducted on the culture of seabass in Bangladesh, it still remains a challenge to grow this species commercially due to its exclusively carnivorous nature. In contrast, the grey mullet (M. cephalus) is a fast-growing, omnivorous, euryhaline fish that has shown excellent growth in many areas, including South Asia. The choice of a sustainable aquaculture technique must consider productivity and yield as well as environmental suitability. This study was designed to elucidate the ecologically suitable culture technique for M. cephalus in brackishwater ponds by comparing the biotic and abiotic components of the pond ecosystem. In addition to growth parameters (yield, ADG, SGR, weight gain, FCR), physicochemical parameters (temperature, DO, pH, salinity, TDS, transparency, ammonia, and chlorophyll-a concentration) and biological community composition (phytoplankton, zooplankton, and benthic macroinvertebrates) were investigated in ponds under semi-intensive, improved extensive, and traditional culture systems. While temperatures were similar in the three culture types, ponds under the improved extensive system showed better environmental conditions, with significantly higher mean DO and transparency and lower TDS and chlorophyll-a. The abundance of zooplankton, phytoplankton, and benthic macroinvertebrates was apparently higher in semi-intensive ponds. The Analysis of Similarity (ANOSIM) suggested a moderate difference in planktonic community composition. While the fish growth parameters of M. cephalus and total yield did not differ significantly between the three systems, M. cephalus yield (kg/decimal) was apparently higher in semi-intensive ponds due to high stocking density and intensive feeding. The results suggested that the differences between the three systems were due to more efficient utilization of nutrients in improved extensive ponds, which affected fish growth through trophic cascades. This study suggests that the improved extensive culture system for M. cephalus is an alternative and more beneficial method owing to its ecological and economic benefits in brackishwater ponds.Keywords: Mugil cephalus, pond ecosystem, mariculture, fisheries management
Procedia PDF Downloads 71896 Glaucoma Detection in Retinal Tomography Using the Vision Transformer
Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan
Abstract:
Glaucoma is a chronic eye condition that causes irreversible vision loss. Early detection and treatment are critical to prevent vision loss because the disease can be asymptomatic. Multiple deep learning algorithms are used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire highly expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. These observations inspire this thesis to look at transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Using retinal fundus images of the optic nerve head to develop a viable algorithm to assess the severity of glaucoma necessitates a large number of well-curated images. Initially, data are generated by augmenting the ocular images. After that, the ocular images are pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies input images as normal or glaucomatous based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this task, as its self-attention mechanism supports structural modeling of the whole image. Extensive experiments are run on the common dataset, and the results are thoroughly validated and visualized.Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning
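The scaled dot-product self-attention at the core of the ViT can be sketched in NumPy as below; the tiny patch count and embedding dimension are illustrative only, not the thesis configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of patch embeddings."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # pairwise patch affinities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
n_patches, d = 4, 8
X = rng.standard_normal((n_patches, d))       # stand-in patch embeddings
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

Because every patch attends to every other patch, each output embedding mixes information from the whole fundus image, which is exactly the long-range dependency that convolutions struggle to capture.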
Procedia PDF Downloads 189895 Efficient Frequent Itemset Mining Methods over Real-Time Spatial Big Data
Authors: Hamdi Sana, Emna Bouazizi, Sami Faiz
Abstract:
In recent years, there has been a huge increase in the use of spatio-temporal applications where data and queries are continuously moving. As a result, the need to process real-time spatio-temporal data is clear, and real-time stream data management has become a hot topic. The sliding window model and frequent itemset mining over dynamic data are among the most important problems in the context of data mining. The sliding window model is widely used for frequent itemset mining over data streams due to its emphasis on recent data and its bounded memory requirement. Existing methods use the traditional transaction-based sliding window model, where the window size is defined by a fixed number of transactions. This model assumes that all transactions arrive at a constant rate, which is not suited to real-time applications, and its use in such applications endangers their performance. Based on these observations, this paper relaxes the notion of window size and proposes the use of a timestamp-based sliding window model. In our proposed frequent itemset mining algorithm, support conditions are used to differentiate frequent from infrequent patterns. Thereafter, a tree is developed to incrementally maintain the essential information. We evaluate our contribution, and the preliminary results are quite promising.Keywords: real-time spatial big data, frequent itemset, transaction-based sliding window model, timestamp-based sliding window model, weighted frequent patterns, tree, stream query
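The difference between the two window models can be sketched as follows: a timestamp-based window expires transactions by age rather than keeping a fixed count. The item names, time span, and support threshold below are illustrative, and this naive re-count stands in for the paper's incremental tree structure.

```python
from collections import deque
from itertools import combinations

class TimestampWindow:
    """Frequent itemsets over transactions within the last `span` time units."""
    def __init__(self, span, min_support):
        self.span = span
        self.min_support = min_support
        self.window = deque()            # (timestamp, transaction) pairs

    def add(self, ts, transaction):
        self.window.append((ts, frozenset(transaction)))
        # Expire by age (not by a fixed transaction count).
        while self.window and self.window[0][0] <= ts - self.span:
            self.window.popleft()

    def frequent_itemsets(self, size):
        counts = {}
        for _, t in self.window:
            for combo in combinations(sorted(t), size):
                counts[combo] = counts.get(combo, 0) + 1
        n = len(self.window)
        return {c: k for c, k in counts.items() if k / n >= self.min_support}

w = TimestampWindow(span=10, min_support=0.5)
for ts, t in [(1, "ab"), (3, "abc"), (5, "bc"), (14, "ab")]:
    w.add(ts, t)
# At ts=14 the transactions from ts=1 and ts=3 have expired; ts 5 and 14 remain.
freq = w.frequent_itemsets(2)
```

A burst of transactions thus widens the effective count while a lull shrinks it, which is the behaviour a fixed-count window cannot express.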
Procedia PDF Downloads 160894 Depth-Averaged Modelling of Erosion and Sediment Transport in Free-Surface Flows
Authors: Thomas Rowan, Mohammed Seaid
Abstract:
A fast finite volume solver for multi-layered shallow water flows with mass exchange and an erodible bed is developed. This enables the user to solve a number of complex sediment-based problems, including (but not limited to) dam break over an erodible bed, recirculation currents, and bed evolution, as well as levee and dyke failure. This research develops methodologies crucial to the understanding of multi-sediment fluvial mechanics and waterway design. In this model, mass exchange between the layers is allowed and, in contrast to previous models, sediment and fluid are able to transfer between layers. In the current study, we use a two-step finite volume method to avoid the solution of the Riemann problem. Entrainment and deposition rates are calculated for the first time in a model of this nature. In the first step, the governing equations are rewritten in a non-conservative form and the intermediate solutions are calculated using the method of characteristics. In the second stage, the numerical fluxes are reconstructed in conservative form and used to calculate a solution that satisfies the conservation property. This method is found to be considerably faster than comparable finite volume methods, and it also exhibits good shock capturing. For most entrainment and deposition equations, a bed level concentration factor is used; this leads to inaccuracies in both near-bed concentration and total scour. To account for diffusion, as no vertical velocities are calculated, a capacity-limited diffusion coefficient is used. An additional advantage of this multilayer approach is the variation (absent from single layer models) in bottom layer fluid velocity: this dramatically reduces erosion, which is often overestimated in simulations of this nature using single layer flows. The model is used to simulate a standard dam break.
In the dam break simulation, as expected, the number of fluid layers utilised creates variation in the resultant bed profile, with more layers giving a higher deviation in fluid velocity. These results showed a marked variation in erosion profiles compared with standard models. Overall, the model provides new insight into the problems presented at minimal computational cost.Keywords: erosion, finite volume method, sediment transport, shallow water equations
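The structure of a conservative finite volume update can be illustrated with a single-layer 1-D shallow water step using a Lax-Friedrichs flux. This is deliberately far simpler than the authors' two-step, multilayer, characteristics-based scheme with mass exchange; it only shows the conservation property the second stage is built to satisfy.

```python
import numpy as np

g = 9.81  # gravitational acceleration [m/s^2]

def flux(h, hu):
    """Physical flux of the 1-D shallow water equations."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * g * h ** 2])

def lax_friedrichs_step(h, hu, dx, dt):
    """One conservative update on a periodic grid."""
    U = np.array([h, hu])
    F = flux(h, hu)
    Up, Um = np.roll(U, -1, axis=1), np.roll(U, 1, axis=1)   # U_{i+1}, U_{i-1}
    Fp, Fm = np.roll(F, -1, axis=1), np.roll(F, 1, axis=1)
    Un = 0.5 * (Up + Um) - dt / (2 * dx) * (Fp - Fm)
    return Un[0], Un[1]

# Dam-break-like initial state: deep water on the left half of the channel.
n = 100
h = np.where(np.arange(n) < n // 2, 2.0, 1.0)
hu = np.zeros(n)
h1, hu1 = lax_friedrichs_step(h, hu, dx=0.1, dt=0.01)
```

Because the update is written as flux differences, the total water volume is conserved to machine precision, which is the property the paper's reconstructed conservative fluxes guarantee for the multilayer system.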
Procedia PDF Downloads 216893 Finding the Longest Common Subsequence in Normal DNA and Disease Affected Human DNA Using Self Organizing Map
Authors: G. Tamilpavai, C. Vishnuppriya
Abstract:
Bioinformatics is an active research area that combines biological research with computer science. The longest common subsequence (LCSS) is one of the major challenges in various bioinformatics applications. The computation of the LCSS plays a vital role in biomedicine, and it is an essential task in DNA sequence analysis in genetics, covering a wide range of disease-diagnosing steps. The objective of the proposed system is to find the longest common subsequence present in normal and various disease-affected human DNA sequences using a Self Organizing Map (SOM) and LCSS. The human DNA sequences are collected from the National Center for Biotechnology Information (NCBI) database. Initially, each human DNA sequence is separated into k-mers using a k-mer separation rule. Mean and median values are calculated from each separated k-mer. These calculated values are fed as input to the Self Organizing Map for clustering. The obtained clusters are then given to the longest common subsequence (LCSS) algorithm to find the common subsequences present in each cluster. It returns n×(n−1)/2 subsequences for each cluster, where n is the number of k-mers in a specific cluster. Experimental outcomes of the proposed system produce the possible longest common subsequences of normal and disease-affected DNA data. Thus, the proposed system will be a good initial aid for finding disease-causing sequences. Finally, a performance analysis is carried out for different DNA sequences. The obtained values show that the retrieval of the LCSS is done in a shorter time than in the existing system.Keywords: clustering, k-mers, longest common subsequence, SOM
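The classic dynamic programming recurrence for the longest common subsequence, applied pairwise within each cluster, can be sketched as follows; the DNA strings are toy examples, not NCBI data.

```python
def lcs(a, b):
    """Longest common subsequence via O(len(a)*len(b)) dynamic programming."""
    m, n = len(a), len(b)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    # Backtrack through the table to recover one longest subsequence.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif L[i - 1][j] >= L[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

common = lcs("ACCGGTCG", "ACGGATCG")
```

Running this over every pair of k-mers in a cluster of size n yields the n×(n−1)/2 subsequences mentioned above.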
Procedia PDF Downloads 265892 The Tourism in the Regional Development of South Caucasus
Authors: Giorgi Sulashvili, Vladimer Kekenadze, Olga Khutsishvili, Bela Khotenashvili, Tsiuri Phkhakadze, Besarion Tsikhelashvili
Abstract:
The article deals with the South Caucasus as a complex economic policy consisting of several strands: the process of deepening economic integration within the South Caucasus region; deepening economic integration with the EU in the framework of the European Neighbourhood Policy and in line with the Maastricht criteria; the development of bilateral trade and economic relations with many countries of the world community; and the development of sufficient conditions for the integration of the South Caucasus region into the world market. According to the authors, to determine the place of Georgia in the regional policy of the South Caucasus, it is necessary to consider two views of Georgia: the first views Georgia as part of global economic and political processes, and the second views Georgia as a country located in the geo-economic and geopolitical space of the South Caucasus. These approaches reveal the place of Georgia in two dimensions: in the global and regional economies. In the countries of the South Caucasus, tourism has been developing fast and has great social and economic importance. Tourism deeply influences the social and economic growth of a country's regions: tourism development creates thousands of new jobs, strengthens the positions of small and medium businesses, and supports the development of the education and culture of the population. In the countries of the South Caucasus, the tourist industry can be characterized as an intersectoral complex consisting of travel transport and its technical service network, tourist enterprises specialized in various types of activity, and a wide network of services, all of which tourists have a chance to enjoy. At the transitional stage of shifting to the market economy, tourism is among the priorities in the development of the national economy of our country.
It is true that Georgian tourism faces a range of problems at present, but its recognition and the necessity for its development may be considered a fact. Moreover, the revitalization of Georgian tourism is not only a question of time. This sector can bring many benefits to private firms as well as to specific countries. Tourism also has negative effects, and fundamental research and studies have been conducted to consider both the positive and negative impacts of tourism. In the future, decisions should be taken that bring maximum benefit at minimum cost; in order for tourism to take its place in Georgia, it is necessary to understand the role of the tourism sector in the economic structure.Keywords: transitional stage, national economy, Georgian tourism, positive and negative impacts
Procedia PDF Downloads 393891 Extraction of Scandium (Sc) from an Ore with Functionalized Nanoporous Silicon Adsorbent
Authors: Arezoo Rahmani, Rinez Thapa, Juha-Matti Aalto, Petri Turhanen, Jouko Vepsalainen, Vesa-PekkaLehto, Joakim Riikonen
Abstract:
Production of scandium (Sc) is a complicated process because Sc is found only in low concentrations in ores, and its concentration is very low compared with other metals. Therefore, the utilization of typical extraction processes such as solvent extraction is problematic for scandium extraction. The adsorption/desorption method can be used, but it is challenging to prepare materials that have good selectivity, high adsorption capacity, and high stability. Efficient and environmentally friendly methods for Sc extraction are therefore needed. In this study, a nanoporous composite material was developed for extracting Sc from an Sc ore. The nanoporous composite material offers several advantageous properties, such as a large surface area, high chemical and mechanical stability, fast diffusion of metals in the material, and the possibility of constructing a filter out of the material with good flow-through properties. The nanoporous silicon material was produced by first stabilizing the surfaces with a silicon carbide layer and then functionalizing the surface with bisphosphonates that act as metal chelators. The surface area and porosity of the material were characterized by N₂ adsorption, and the morphology was studied by scanning electron microscopy (SEM). The bisphosphonate content of the material was studied by thermogravimetric analysis (TGA). The concentration of metal ions in the adsorption/desorption experiments was measured with inductively coupled plasma mass spectrometry (ICP-MS). The maximum capacity of the material, obtained from the adsorption isotherm, was 25 µmol/g Sc at pH 1 and 45 µmol/g Sc at pH 3. The selectivity of the material towards Sc in artificial solutions containing several metal ions was studied at pH 1 and pH 3; the results show good selectivity of the nanoporous composite towards adsorption of Sc.
Scandium was less efficiently adsorbed from the solution leached from the Sc ore because of excessive amounts of iron (Fe), aluminum (Al), and titanium (Ti), which disturbed the adsorption process. For example, the concentration of Fe was more than 4500 ppm, while the concentration of Sc was only 3 ppm, approximately 1500 times lower. Precipitation methods were developed to lower the concentrations of the metals other than Sc. The optimal pH for precipitation was found to be pH 4, at which the concentrations of Fe, Al, and Ti were decreased by 99%, 70%, and 99.6%, respectively, while the concentration of Sc decreased by only 22%. Despite the large reduction in the concentrations of the other metals, more work is needed to further increase the relative concentration of Sc so that it can be efficiently extracted using the developed nanoporous composite material. Nevertheless, the developed material may provide an affordable, efficient, and environmentally friendly method to extract Sc on a large scale.Keywords: adsorption, nanoporous silicon, ore solution, scandium
Procedia PDF Downloads 143890 Prospective Validation of the FibroTest Score in Assessing Liver Fibrosis in Hepatitis C Infection with Genotype 4
Authors: G. Shiha, S. Seif, W. Samir, K. Zalata
Abstract:
FibroTest (FT) is a non-invasive score of liver fibrosis that combines the quantitative results of five serum biochemical markers (alpha-2-macroglobulin, haptoglobin, apolipoprotein A1, gamma-glutamyl transpeptidase (GGT), and bilirubin), adjusted for the patient's age and sex in a patented algorithm, to generate a measure of fibrosis. FT has been validated in patients with chronic hepatitis C (CHC) (Halfon et al., Gastroenterol. Clin. Biol. (2008), 32 6 suppl 1, 22-39). The validation of FibroTest (FT) in genotype 4 is not well studied. Our aim was to evaluate the performance of FibroTest in an independent prospective cohort of hepatitis C patients with genotype 4. Subjects were 122 patients with CHC. All liver biopsies were scored using the METAVIR system. The fibrosis score (FT) was measured, and the performance of the cut-off score was assessed using a ROC curve. Among patients with advanced fibrosis, the FT matched the liver biopsy exactly in 18.6% of cases, overestimated the stage of fibrosis in 44.2%, and underestimated the stage of fibrosis in 37.7% of cases. In patients with no/mild fibrosis, exact matching was detected in 39.2% of cases, with overestimation in 48.1% and underestimation in 12.7%. Thus, the overall results of the test were exact matching, overestimation, and underestimation in 32%, 46.7%, and 21.3% of cases, respectively. Using the ROC curve, it was found that FT at the cut-off point of 0.555 could discriminate early from advanced stages of fibrosis with an area under the ROC curve (AUC) of 0.72, sensitivity of 65%, specificity of 69%, PPV of 68%, NPV of 66%, and accuracy of 67%. As the FibroTest score overestimates the stage of advanced fibrosis, it should not be considered a reliable surrogate for liver biopsy in hepatitis C infection with genotype 4.Keywords: fibrotest, chronic Hepatitis C, genotype 4, liver biopsy
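The cut-off evaluation reported above (sensitivity, specificity, PPV, NPV, accuracy at FT ≥ 0.555) can be sketched as follows; the FT scores and biopsy labels below are invented for illustration, not the cohort data.

```python
def diagnostic_metrics(scores, labels, cutoff):
    """Performance of the rule `score >= cutoff` for predicting advanced
    fibrosis, where label 1 = advanced fibrosis on biopsy (METAVIR)."""
    tp = sum(s >= cutoff and y == 1 for s, y in zip(scores, labels))
    fp = sum(s >= cutoff and y == 0 for s, y in zip(scores, labels))
    fn = sum(s < cutoff and y == 1 for s, y in zip(scores, labels))
    tn = sum(s < cutoff and y == 0 for s, y in zip(scores, labels))
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / len(labels),
    }

# Hypothetical FibroTest scores and biopsy-based labels for eight patients.
scores = [0.20, 0.40, 0.60, 0.70, 0.30, 0.80, 0.58, 0.45]
labels = [0, 0, 1, 1, 0, 1, 0, 1]
m = diagnostic_metrics(scores, labels, cutoff=0.555)
```

Sweeping the cutoff over all observed scores and plotting sensitivity against 1 − specificity traces the ROC curve from which the 0.555 operating point and the AUC of 0.72 were read.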
Procedia PDF Downloads 413889 Optimal 3D Deployment and Path Planning of Multiple Uavs for Maximum Coverage and Autonomy
Authors: Indu Chandran, Shubham Sharma, Rohan Mehta, Vipin Kizheppatt
Abstract:
Unmanned aerial vehicles (UAVs) are increasingly being explored as the most promising solution for disaster monitoring, assessment, and recovery. Current relief operations rely heavily on intelligent robot swarms to capture the damage caused, provide timely rescue, and create road maps for the victims. To perform these time-critical missions, efficient path planning that ensures quick coverage of the area is vital. This study aims to develop a technically balanced approach that provides maximum coverage of the affected area in minimum time using the optimal number of UAVs. A coverage trajectory is designed through area decomposition and task assignment. To perform an efficient and autonomous coverage mission, a solution to a TSP-based optimization problem using meta-heuristic approaches is designed to allocate waypoints to UAVs of different flight capacities. The study exploits multi-agent simulations such as PX4-SITL and QGroundControl through the ROS framework and visualizes the dynamics of UAV deployment along different search paths in a 3D Gazebo environment. Through detailed theoretical analysis and simulation tests, we illustrate the optimality and efficiency of the proposed methodologies.Keywords: area coverage, coverage path planning, heuristic algorithm, mission monitoring, optimization, task assignment, unmanned aerial vehicles
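A common baseline for the TSP-style waypoint ordering behind such coverage planners is a greedy nearest-neighbour tour, which meta-heuristics then improve upon. A minimal sketch follows; the waypoint coordinates are illustrative only, not the study's decomposed survey area.

```python
import math

def nearest_neighbour_tour(waypoints, start=0):
    """Greedy TSP heuristic: always fly to the closest unvisited waypoint."""
    unvisited = set(range(len(waypoints))) - {start}
    tour = [start]
    while unvisited:
        last = waypoints[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, waypoints[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(waypoints, tour):
    return sum(math.dist(waypoints[tour[i]], waypoints[tour[i + 1]])
               for i in range(len(tour) - 1))

# Hypothetical survey waypoints (x, y) in metres over the affected area.
pts = [(0, 0), (0, 10), (10, 10), (10, 0), (5, 5)]
route = nearest_neighbour_tour(pts)
```

In a multi-UAV setting the waypoint set is first partitioned by flight capacity, and a tour like this (refined by the meta-heuristic) is computed per vehicle.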
Procedia PDF Downloads 213888 Health Trajectory Clustering Using Deep Belief Networks
Authors: Farshid Hajati, Federico Girosi, Shima Ghassempour
Abstract:
We present a Deep Belief Network (DBN) method for clustering health trajectories. A Deep Belief Network is a deep architecture that consists of a stack of Restricted Boltzmann Machines (RBMs). In a deep architecture, each layer learns more complex features than the preceding layers. The proposed method relies on the DBN for clustering without using the backpropagation learning algorithm. The proposed DBN performs better than a deep neural network due to the initialization of the connecting weights. We use the Contrastive Divergence (CD) method for training the RBMs, which increases the performance of the network. The performance of the proposed method is evaluated extensively on the Health and Retirement Study (HRS) database. The University of Michigan Health and Retirement Study is a nationally representative longitudinal study that has surveyed more than 27,000 elderly and near-elderly Americans since its inception in 1992. Participants are interviewed every two years, and data are collected on physical and mental health, insurance coverage, financial status, family support systems, labor market status, and retirement planning. The dataset is publicly available, and we use the RAND HRS version L, which is an easy-to-use and cleaned-up version of the data. The size of the sample dataset is 268, and the length of the trajectories is 10. The trajectories do not stop when a patient dies and represent 10 different interviews of live patients. Compared to state-of-the-art benchmarks, the experimental results show the effectiveness and superiority of the proposed method in clustering health trajectories.Keywords: health trajectory, clustering, deep learning, DBN
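A single CD-1 training update for one RBM layer can be sketched in NumPy as below; the layer sizes, batch, and learning rate are illustrative, not the HRS configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v0, W, b, c, lr=0.1):
    """One Contrastive Divergence (CD-1) step for a binary RBM.
    v0: batch of visible vectors; W: weights; b: visible bias; c: hidden bias."""
    # Positive phase: hidden probabilities and a binary sample given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one-step reconstruction of visibles, then hiddens.
    pv1 = sigmoid(h0 @ W.T + b)
    ph1 = sigmoid(pv1 @ W + c)
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / n
    b += lr * (v0 - pv1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c

n_visible, n_hidden = 6, 3
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b = np.zeros(n_visible)
c = np.zeros(n_hidden)
batch = (rng.random((8, n_visible)) < 0.5).astype(float)  # toy binary data
W, b, c = cd1_update(batch, W, b, c)
```

Stacking RBMs trained this way layer by layer, with each layer's hidden activations feeding the next, gives the greedily pre-trained DBN whose top-layer representation is used for clustering without backpropagation.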
Procedia PDF Downloads 368887 Hepatoprotective Action of Emblica officinalis Linn. against Radiation and Lead Induced Changes in Swiss Albino Mice
Authors: R. K. Purohit
Abstract:
Ionizing radiation induces cellular damage through direct ionization of DNA and other cellular targets, and indirectly via reactive oxygen species, which may include effects from epigenetic changes. The need of the hour is therefore to search for an ideal radioprotector that could minimize the deleterious and damaging effects caused by ionizing radiation. Radioprotectors are agents that reduce the effects of radiation on cells when applied prior to radiation exposure. The aim of this study was to assess the efficacy of Emblica officinalis in reducing radiation- and lead-induced changes in mouse liver. For the present experiment, healthy male Swiss albino mice (6-8 weeks) were selected and maintained under standard conditions of temperature and light. Fruit extract of Emblica was fed orally at a dose of 0.01 ml/animal/day. The animals were divided into seven groups according to the treatment: lead acetate solution as drinking water (group II), exposure to 3.5 or 7.0 Gy gamma radiation (group III), or combined treatment with radiation and lead acetate (group IV). The animals of the experimental groups were administered Emblica extract for seven days prior to the radiation or lead acetate treatment (groups V, VI, and VII, respectively). Animals from all groups were sacrificed by cervical dislocation at post-treatment intervals of 1, 2, 4, 7, 14, and 28 days. After sacrificing the animals, pieces of liver were taken out and some were kept at -20°C for different biochemical parameters. The histopathological changes included cytoplasmic degranulation, vacuolation, hyperaemia, and pycnotic and crenated nuclei. The changes observed in the control groups were compared with the respective experimental groups. An increase in the values of total proteins, glycogen, acid phosphatase and alkaline phosphatase activity, and RNA was observed up to day 14 in the non-drug-treated groups and day 7 in the Emblica-treated groups; thereafter the values declined up to day 28 without reaching normal.
The values of cholesterol and DNA showed a decreasing trend up to day 14 in non-drug-treated groups and day 7 in drug-treated groups; thereafter the values rose up to day 28. The biochemical parameters were observed in the form of increases or decreases in the values, and the changes were found to be dose dependent. After combined treatment with radiation and lead acetate, synergistic effects were observed. The liver of Emblica-treated animals exhibited less severe damage than that of non-drug-treated animals at all corresponding intervals. An early and fast recovery was also noticed in Emblica-pretreated animals. Thus, it appears that Emblica is potent enough to check lead- and radiation-induced hepatic lesions in Swiss albino mice.Keywords: radiation, lead, emblica, mice, liver
Procedia PDF Downloads 319