Search results for: feature method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20002

16582 A Firefly Based Optimization Technique for Optimal Planning of Voltage Controlled Distributed Generators

Authors: M. M. Othman, Walid El-Khattam, Y. G. Hegazy, A. Y. Abdelaziz

Abstract:

This paper presents a method for finding the optimal location and capacity of dispatchable DGs connected to distribution feeders, planning for a specified power loss without violating the system's practical constraints. The distributed generation units in the proposed algorithm are modeled as voltage-controlled nodes, with the flexibility to be converted to constant-power nodes in case of reactive power limit violation. The proposed algorithm is implemented in MATLAB and tested on the IEEE 37-node feeder. The results, validated by comparison with those obtained from other competing methods, show the effectiveness, accuracy and speed of the proposed method.
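The abstract does not spell out the algorithm's internals, so the following is a minimal sketch of a generic firefly optimization loop, assuming the usual attraction-plus-random-walk update. The two decision variables and the quadratic "loss" function are hypothetical stand-ins for DG siting/sizing and feeder power loss, which in the paper would come from a load-flow calculation rather than a closed-form expression.

```python
import math
import random

def firefly_minimize(f, bounds, n=15, iters=60, beta0=1.0, gamma=1.0, alpha=0.2, seed=1):
    """Generic firefly algorithm: dimmer fireflies move toward brighter ones."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    for _ in range(iters):
        # brightness values are refreshed once per sweep (a common simplification)
        vals = [f(x) for x in pop]
        for i in range(n):
            for j in range(n):
                if vals[j] < vals[i]:  # firefly j is brighter (lower loss)
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness decays with distance
                    for d in range(dim):
                        lo, hi = bounds[d]
                        step = beta * (pop[j][d] - pop[i][d]) + alpha * (rng.random() - 0.5)
                        pop[i][d] = min(hi, max(lo, pop[i][d] + step))
    best = min(pop, key=f)
    return best, f(best)

# toy stand-in for the feeder loss surface: minimal when the normalized
# DG location/size variables hit (0.4, 0.6)
loss = lambda x: (x[0] - 0.4) ** 2 + (x[1] - 0.6) ** 2
best, val = firefly_minimize(loss, [(0.0, 1.0), (0.0, 1.0)])
```

In the actual planning problem, `loss` would be replaced by a power-flow evaluation of the IEEE 37-node feeder with constraint-violation penalties.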

Keywords: distributed generators, firefly technique, optimization, power loss

Procedia PDF Downloads 533
16581 Digital Phase Shifting Holography in a Non-Linear Interferometer using Undetected Photons

Authors: Sebastian Töpfer, Marta Gilaberte Basset, Jorge Fuenzalida, Fabian Steinlechner, Juan P. Torres, Markus Gräfe

Abstract:

This work introduces a combination of digital phase-shifting holography with a non-linear interferometer using undetected photons. Non-linear interferometers can be used in combination with a measurement scheme called quantum imaging with undetected photons, which allows the wavelength used for sampling an object to be separated from the wavelength detected at the imaging sensor. This method has recently attracted increasing attention, as it allows the use of exotic wavelengths (e.g., mid-infrared, ultraviolet) for object interaction while keeping the detection in spectral regions with highly developed, comparably low-cost imaging sensors. The object information, including its transmission and phase influence, is recorded in the form of an interferometric pattern. To collect this pattern, this work combines quantum imaging with undetected photons with digital phase-shifting holography using a minimal sampling of the interference. This extends the measurement capabilities of the quantum imaging scheme and brings it one step closer to application. Quantum imaging with undetected photons uses correlated photons generated by spontaneous parametric down-conversion in a non-linear interferometer to create indistinguishable photon pairs, which leads to an effect called induced coherence without induced emission. Placing an object inside the interferometer changes the interferometric pattern depending on the object's properties. Digital phase-shifting holography records multiple images of the interference at determined phase shifts to reconstruct the complete interference shape, which can afterward be used to analyze the changes introduced by the object and infer its properties. An extensive characterization of this method was performed using a proof-of-principle setup. The measured spatial resolution, phase accuracy, and transmission accuracy are compared for different combinations of camera exposure times and numbers of interference sampling steps. The current limits of this method are identified, pointing the way to further improvements. To summarize, this work presents an alternative holographic measurement method using non-linear interferometers in combination with quantum imaging, enabling new ways of measuring and motivating continuing research.
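The phase-shifting reconstruction described above can be illustrated with the standard four-step relations (shifts of 0, π/2, π, 3π/2), where the object phase is recovered as φ = atan2(I₄ − I₂, I₁ − I₃). This is a generic textbook sketch, not the authors' minimal-sampling scheme, and the bias/modulation constants are arbitrary.

```python
import math

def intensity(phi_obj, delta, a=1.0, b=0.5):
    # interferogram intensity at one pixel: I = a + b*cos(phi + delta)
    return a + b * math.cos(phi_obj + delta)

def recover_phase(i1, i2, i3, i4):
    # four-step phase shifting with shifts 0, pi/2, pi, 3*pi/2:
    # i1 - i3 = 2b*cos(phi), i4 - i2 = 2b*sin(phi)
    return math.atan2(i4 - i2, i1 - i3)

phi_true = 0.8
frames = [intensity(phi_true, k * math.pi / 2) for k in range(4)]
phi_est = recover_phase(*frames)
```

The same arctangent also yields the modulation amplitude (object transmission) from the quadrature sums, which is how both transmission and phase influence can be read off one set of frames.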

Keywords: digital holography, quantum imaging, quantum holography, quantum metrology

Procedia PDF Downloads 92
16580 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine

Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang

Abstract:

Cocaine is the most frequently used illegal drug globally, with an annual prevalence of 0.3% to 0.4% of the adult population aged 15-64 years. The growing consumption of cocaine and the associated drug crime are a great concern; urine testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present there in high concentrations with relatively long detection windows. However, direct analysis of urine samples is not feasible, because the complex urine medium often causes low sensitivity and selectivity in the determination. In addition, the low doses of analytes present in urine make an extraction and pretreatment step important before determination; in cases involving group drug use, the pretreatment step becomes even more tedious and time-consuming. Developing a sensitive, rapid and high-throughput method for detecting COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinyl benzene and vinyl pyrrolidone, giving them the ability to specifically adsorb COCs. This kind of magnetic particle simplifies the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 min.
The preparation procedure for the magnetic nanoparticles was optimized, and their performance was characterized by scanning electron microscopy, vibrating sample magnetometry and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was validated. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL⁻¹, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective approach for trace analysis of cocaine and its metabolites in human urine.

Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing

Procedia PDF Downloads 204
16579 Literature and Extremism: A Case Study and Qualitative Analysis of the Impact of Literature on Extremism in Afghanistan

Authors: Mohibullah Zegham

Abstract:

In this case study of the impact of literature on extremism and fundamentalism in Afghanistan, the author uses a qualitative research method. The paper first reviews the history of extremism and fundamentalism in Afghanistan, along with its major causes and predisposing factors, and then qualitatively analyzes the impact of literature upon it. The study relies on moral engagement theory to reveal how some Islamist extremists abandon the ideological interpretation of Islam and return to normal life after reading certain literary works. The goal of the case study is to help fight extremism and fundamentalism through literature. The research showed that literary works are useful in this regard, with several instances of demonstrated effectiveness.

Keywords: extremism, fundamentalism, communist, jihad, madrasa, literature

Procedia PDF Downloads 274
16578 Solubility Enhancement of Poorly Soluble Anticancer Drug, Docetaxel Using a Novel Polymer, Soluplus via Solid Dispersion Technique

Authors: Adinarayana Gorajana, Venkata Srikanth Meka, Sanjay Garg, Lim Sue May

Abstract:

This study was designed to evaluate and enhance the solubility of the poorly soluble drug docetaxel through a solid dispersion (SD) technique prepared by freeze drying. Docetaxel solid dispersions were formulated with Soluplus in different weight ratios using the freeze drying method. The solubility of each solid dispersion was evaluated, and the optimized drug-solubilizer ratio systems were characterized with analytical methods such as differential scanning calorimetry (DSC), scanning electron microscopy (SEM) and Fourier transform infrared spectroscopy (FTIR) to confirm the formation of complexes between drug and solubilizers. The solubility data revealed an overall improvement in solubility for all SD formulations. The ternary combination 1:5:2 gave the highest increase in solubility, approximately three-fold over the pure drug, suggesting the optimum drug-solubilizer ratio system. These data correspond with the DSC and SEM analyses, which demonstrate the presence of the drug in an amorphous state and its dispersion in the solubilizers at the molecular level. In summary, the solubility of the poorly soluble drug docetaxel was enhanced through solid dispersion formulations employing freeze drying, and solid dispersions with a multiple-carrier system showed better solubility than those with a single-carrier system.

Keywords: docetaxel, freeze drying, soluplus, solid dispersion technique

Procedia PDF Downloads 503
16577 Feasibility Study of Wind Energy Potential in Turkey: Case Study of Catalca District in Istanbul

Authors: Mohammed Wadi, Bedri Kekezoglu, Mustafa Baysal, Mehmet Rida Tur, Abdulfetah Shobole

Abstract:

This paper presents a technical evaluation of wind potential for present and future investments in Turkey, taking into account the feasibility of sites, installation, operation, and maintenance. The evaluation is based on hourly wind speed data measured at 30 m height over the three years 2008-2010 for the Çatalca district. These data, obtained from the national meteorology station in Istanbul, Republic of Turkey, are analyzed in order to evaluate the feasibility of wind power potential and to support optimal selection of wind turbines for the area of interest. Furthermore, the data are extrapolated to 60 m and 80 m, accounting for the variability of the roughness factor. The bi-parameter Weibull probability function is used to approximate monthly and annual wind potential and power density, with its parameters estimated by three methods: the approximated method, the graphical method, and the energy pattern factor method. The annual mean wind power densities were found to be 400.31, 540.08 and 611.02 W/m² at 30, 60, and 80 m heights, respectively. The results indicate that the analyzed area is an appropriate place for constructing large-scale wind farms.
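Of the three estimation methods named, the energy pattern factor method is easy to sketch: the shape parameter k follows from the ratio of the mean cubed speed to the cubed mean speed, and the scale parameter c from the mean speed via the gamma function, with the mean power density then given in closed form. The synthetic speed series and air density below are illustrative, not the Çatalca measurements.

```python
import math
import random

def weibull_epf(speeds):
    """Energy pattern factor method for Weibull k (shape) and c (scale, m/s)."""
    vbar = sum(speeds) / len(speeds)
    v3bar = sum(v ** 3 for v in speeds) / len(speeds)
    epf = v3bar / vbar ** 3                  # energy pattern factor
    k = 1.0 + 3.69 / epf ** 2                # empirical shape estimate
    c = vbar / math.gamma(1.0 + 1.0 / k)     # scale from the mean speed
    return k, c

def power_density(k, c, rho=1.225):
    # mean wind power density (W/m^2) implied by the Weibull fit
    return 0.5 * rho * c ** 3 * math.gamma(1.0 + 3.0 / k)

# synthetic "hourly" speeds (m/s) standing in for the measured series
rng = random.Random(0)
speeds = [rng.weibullvariate(7.0, 2.0) for _ in range(5000)]
k, c = weibull_epf(speeds)
pd = power_density(k, c)
```

Run on a sample drawn from a Weibull(c = 7, k = 2) distribution, the estimator recovers parameters close to the true ones, which is the basic sanity check one would apply before fitting the measured series.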

Keywords: wind potential in Turkey, Weibull bi-parameter probability function, the approximated method, the graphical method, the energy pattern factor method, capacity factor

Procedia PDF Downloads 259
16576 Review and Evaluation of Trending Canonical Correlation Analyses-Based Brain Computer Interface Methods

Authors: Bayar Shahab

Abstract:

The fast development of technology, advancing neuroscience and human interaction with computers, has enabled solutions to problems of this new era like no other time in history. The brain-computer interface (BCI) has opened the door to several new research areas and has provided solutions to critical issues such as enabling a paralyzed patient to interact with the outside world, controlling a robot arm, playing games in VR with the brain, or driving a wheelchair or even a car; neurotechnology has also enabled the rehabilitation of lost memory. This review presents the state-of-the-art methods and improvements of canonical correlation analysis (CCA), an SSVEP-based BCI method. These methods are used to extract the EEG signal features of interest in the analysis. Each method, from oldest to newest, is discussed while comparing its advantages and disadvantages. This provides context and helps researchers understand the most advanced methods available in this field, with their pros and cons along with their mathematical representations and usage. This work makes a vital contribution to the existing field of study and differs from similar recently published works by (1) presenting most of the prominent methods used in this field hierarchically, (2) explaining the pros, cons, and performance of each method, and (3) identifying the gaps left by each method, which can open doors to new research and improvements.
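For a single EEG channel and a sine/cosine reference pair, the canonical correlation reduces to the combined Pearson correlation with the two (mutually orthogonal) references, which is enough to sketch how CCA-based SSVEP detection scores candidate stimulus frequencies; full multichannel CCA requires solving a generalized eigenproblem and is not shown. The synthetic signal, sampling rate and frequency set are illustrative assumptions.

```python
import math
import random

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def ssvep_score(signal, freq, fs):
    # correlation of one channel with sin/cos references at freq; for
    # orthogonal references this equals the single-channel canonical correlation
    t = [i / fs for i in range(len(signal))]
    s = [math.sin(2 * math.pi * freq * ti) for ti in t]
    c = [math.cos(2 * math.pi * freq * ti) for ti in t]
    return math.sqrt(pearson(signal, s) ** 2 + pearson(signal, c) ** 2)

# synthetic 2 s recording: a 10 Hz SSVEP response buried in Gaussian noise
fs, n = 250, 500
rng = random.Random(3)
sig = [math.sin(2 * math.pi * 10 * i / fs + 0.7) + 0.3 * rng.gauss(0, 1)
       for i in range(n)]
scores = {f: ssvep_score(sig, f, fs) for f in (8.0, 10.0, 13.0)}
detected = max(scores, key=scores.get)
```

The detected frequency is the one whose references correlate best with the recording, exactly the decision rule the CCA family of SSVEP methods refines.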

Keywords: BCI, CCA, SSVEP, EEG

Procedia PDF Downloads 145
16575 Combination Method of Cold Plasma and Liquid Threads

Authors: Nino Tsamalaidze

Abstract:

Cold plasma is an ionized neutral gas with a temperature of 30-40 °C; its impact includes not only the gas itself but also active molecules, charged particles, heat, and low-power UV radiation. The main goal of the technology we describe is to trigger the skin's natural regeneration and improve internal metabolism, which produces a strong rejuvenation effect: eliminating fine mimic wrinkles; reducing wrinkles around the mouth (purse-string wrinkles); reducing overhang of the upper eyelid; eliminating bags under the eyes; providing a lifting effect on the oval of the face; reducing stretch marks; shrinking pores; evening out the skin and reducing the appearance of acne and scars; and removing pigmentation. The major findings of the study are based on current patient practice. The method combines cold plasma with liquid threads. The advantage of cold plasma is undoubtedly its efficiency: the result can be compared with that of a surgical facelift, even though the procedure is non-invasive and the risks are minimized. Another advantage is that the technique can be applied to the most sensitive skin of the face, the eyelids and the area around the eyes. Cold plasma is one of the few techniques that eliminates bags under the eyes and overhanging eyelids without violating the integrity of the tissues. In addition to rejuvenation and lifting, the benefits of cold plasma include the reduction of scars, couperose, stretch marks and other skin defects; plasma also helps with acne, seborrhea and skin fungus, and even heals ulcers. The cold plasma method makes it possible to achieve a result similar to blepharoplasty: carried out on the skin of the eyelids, the procedure allows non-surgical correction of the eyelid line in 3-4 sessions. A further advantage of this method is the short rehabilitation and rapid healing of the skin.

Keywords: wrinkles, telangiectasia, pigmentation, pore closing

Procedia PDF Downloads 84
16574 A New Approach to the Digital Implementation of Analog Controllers for a Power System Control

Authors: G. Shabib, Esam H. Abd-Elhameed, G. Magdy

Abstract:

In this paper, a comparison of discrete-time PID and PSS controllers is presented through small-signal stability analysis of a power system comprising one machine connected to an infinite bus. The comparison is achieved using a new discretization approach that converts the s-domain model of the analog controllers to a z-domain model so as to enhance the damping of the single-machine power system. The new method utilizes the Plant Input Mapping (PIM) algorithm. The proposed algorithm is stable for any sampling rate and takes the closed-loop characteristics into consideration. In contrast, traditional discretization methods such as Tustin's method produce satisfactory results only when the sampling period is sufficiently short.
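As a point of comparison for the PIM approach, Tustin's (bilinear) substitution s → (2/T)(z − 1)/(z + 1) applied to a simple first-order element can be sketched as follows. The element G(s) = a/(s + a) is illustrative only, not the paper's PSS or PID transfer function.

```python
import math

def tustin_first_order(a, T):
    # G(s) = a / (s + a) under s -> (2/T)(z-1)/(z+1) gives the recurrence
    # y[k] = b*(u[k] + u[k-1]) + c*y[k-1]
    b = a * T / (2 + a * T)
    c = (2 - a * T) / (2 + a * T)
    return b, c

def step_response(a, T, steps):
    b, c = tustin_first_order(a, T)
    y, yprev, uprev = [], 0.0, 0.0
    for _ in range(steps):
        u = 1.0                       # unit step input
        yk = b * (u + uprev) + c * yprev
        y.append(yk)
        yprev, uprev = yk, u
    return y

a, T = 2.0, 0.01                      # pole at 2 rad/s, 10 ms sampling
y = step_response(a, T, 200)
exact = 1.0 - math.exp(-a * 200 * T)  # continuous-time step response near t = 2 s
```

With a sampling period much shorter than the plant time constant the discrete step response tracks the continuous one closely; the paper's point is that as T grows, this agreement degrades for Tustin's method, whereas PIM preserves the closed-loop characteristics.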

Keywords: power system stabilizer (PSS), proportional-integral-derivative (PID), plant input mapping (PIM)

Procedia PDF Downloads 505
16573 Screening of Ionic Liquids for Hydrogen Sulfide Removal Using COSMO-RS

Authors: Zulaika Mohd Khasiran

Abstract:

The versatility of ionic liquids (ILs) in various applications has attracted many researchers. ILs have the potential to be developed as "green" solvents for gas separation, especially of H2S. In this work, we attempt to predict the solubility of hydrogen sulfide (H2S) in ILs by the COSMO-RS method. Since H2S is a toxic pollutant that is difficult to handle in the laboratory, an appropriate model is necessary before experimental work. The COSMO-RS method is implemented to predict the Henry's law constants and activity coefficients of H2S in 140 ILs with various combinations of cations and anions. The screening shows that more H2S can be absorbed in ILs with the [Cl] and [Ac] anions. The solubility of H2S is not much affected by the alkyl chain length of the cation, and the type of cation only slightly influences the H2S capture capacity. Even so, the effectiveness of the cation must still be considered in other respects: the predictions reflect only physical absorption, and chemical absorption of H2S must also be considered to reach high absorption capacities.

Keywords: H2S, hydrogen sulfide, ionic liquids, COSMO-RS

Procedia PDF Downloads 139
16572 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements

Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria

Abstract:

The challenge of this research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design-office engineers make intensive use of plate modelling, for which automatic post-processing is available. Plate models of massive areas, however, produce conservative results, and the theoretical foundations of automatic reinforcement post-processing tools are those of reinforced concrete beam sections. As long as there is no suitable alternative for automatic post-processing of plates, optimal modelling and a significant improvement in the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its outcome remains highly dependent on the judgment of the design engineer. The tool developed here supports engineers in their choice of structure. The implemented method defines a ground structure built from the principal stresses resulting from an elastic analysis of the structure and then optimizes this structure according to the fully stressed design method. The first results yield a coherent initial network of struts and ties, consistent with the cases reported in the literature. Future development of the tool will adapt the obtained lattice to the cracking states resulting from the loads applied during the life of the structure, including cyclic and dynamic loads. In addition, to satisfy constructability constraints, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.
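The fully stressed design iteration can be sketched on a toy set of ground-structure members: each member's area is rescaled by the ratio of its working stress to the allowable stress, and members whose areas converge below a minimum are pruned from the lattice. The member forces, allowable stress and pruning threshold below are hypothetical; in a real (statically indeterminate) structure the member forces would have to be recomputed by a new elastic analysis after every area update.

```python
# hypothetical ground-structure members with axial forces (N) from an
# elastic analysis; negative = compression (strut), positive = tension (tie)
forces = {"m1": 120e3, "m2": -45e3, "m3": 2e3, "m4": 80e3}
sigma_allow = 200e6    # allowable stress (Pa)
a_min = 5e-5           # members thinner than this are pruned (m^2)

areas = {m: 1e-3 for m in forces}          # initial area guess (m^2)
for _ in range(10):                        # fully stressed design iteration
    for m, f in forces.items():
        stress = abs(f) / areas[m]
        areas[m] *= stress / sigma_allow   # drive every member to sigma_allow

# members that stay above the minimum area form the retained strut-and-tie net
kept = sorted(m for m, a in areas.items() if a > a_min)
```

With constant member forces the update converges immediately to |F|/σ_allow; the lightly loaded member m3 falls below the threshold and drops out, which is exactly how the ground structure is thinned into a coherent strut-and-tie network.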

Keywords: strut and tie, optimization, reinforcement, massive structure

Procedia PDF Downloads 141
16571 The Effectiveness of Teaching Emotional Intelligence on Reducing Marital Conflicts and Marital Adjustment in Married Students of Tehran University

Authors: Elham Jafari

Abstract:

The aim of this study was to evaluate the effectiveness of emotional intelligence training in reducing marital conflict and improving marital adjustment in married students of the University of Tehran. The research is applied in purpose and uses a quasi-experimental pre-test/post-test design with a control group and a follow-up test. The statistical population consisted of all married students of the University of Tehran, from which 30 married students were selected by convenience sampling; 15 were randomly assigned to the experimental group and 15 to the control group. Data were collected through field and library methods, the field instruments being two questionnaires on marital conflict and marital adjustment. At the descriptive level, the demographic characteristics of the sample were summarized using SPSS; at the inferential level, analysis of covariance was applied. The results showed that the effect of the independent variable, emotional intelligence, on the reduction of marital conflict was statistically significant: emotional intelligence training reduced the marital conflicts of the married students in the experimental group compared to the control group. The effect of emotional intelligence on marital adjustment was also statistically significant, indicating that the training improved the marital adjustment of the experimental group compared to the control group.

Keywords: emotional intelligence, marital conflicts, marital compatibility, married students

Procedia PDF Downloads 251
16570 Detecting Geographically Dispersed Overlay Communities Using Community Networks

Authors: Madhushi Bandara, Dharshana Kasthurirathna, Danaja Maldeniya, Mahendra Piraveenan

Abstract:

Community detection is an extremely useful technique for understanding the structure and function of a social network. The Louvain algorithm, which is based on the Newman-Girvan modularity optimization technique, is extensively used as a computationally efficient method to extract communities in social networks. It has been suggested that nodes in close geographical proximity have a higher tendency to form communities. Variants of the Newman-Girvan modularity measure, such as dist-modularity, try to normalize out the effect of geographical proximity in order to extract geographically dispersed communities, at the expense of losing information about the geographically proximate communities. In this work, we propose a method to extract geographically dispersed communities while preserving the information about the geographically proximate communities, by analyzing the "community network", in which the centroids of communities are treated as network nodes. We suggest that the inter-community link strengths, normalized over the community sizes, may be used to identify and extract "overlay communities": communities with relatively high link strengths despite being relatively far apart in their spatial distribution. We apply this method to the Gowalla online social network, which contains the geographical signatures of its users, and identify the overlay communities within it.
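The normalized inter-community link strength described above can be sketched on toy data. The community sizes, inter-community edge counts and threshold below are all hypothetical; in practice the communities would come from a Louvain run and the edge counts from the original graph.

```python
# communities (e.g. from Louvain) with their member counts, and counts of
# edges running between each pair of communities in the original graph
sizes = {"A": 40, "B": 35, "C": 10, "D": 12}
inter_edges = {("A", "B"): 30, ("A", "C"): 25, ("B", "D"): 4, ("C", "D"): 28}

def normalized_strength(pair, edges, sizes):
    # inter-community link strength normalized over the community sizes:
    # edges(ci, cj) / (|ci| * |cj|)
    a, b = pair
    return edges[pair] / (sizes[a] * sizes[b])

strengths = {p: normalized_strength(p, inter_edges, sizes) for p in inter_edges}

# pairs whose normalized strength exceeds a chosen threshold are candidate
# 'overlay communities', regardless of how far apart they sit spatially
threshold = 0.05
overlay = sorted(p for p, s in strengths.items() if s > threshold)
```

Note how the small communities C and D, despite having fewer raw edges than the A-B pair, dominate after size normalization: that is the effect the method exploits to surface dispersed overlays.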

Keywords: social networks, community detection, modularity optimization, geographically dispersed communities

Procedia PDF Downloads 235
16569 Surface Roughness of Al-Si/10% AlN MMC Material in Milling Operation Using the Taguchi Method

Authors: M. S. Said, J. A. Ghani, Izzati Osman, Z. A. Latiff, S. A. F. Syed Mohd

Abstract:

Metal matrix composites (MMCs) are in demand as lightweight structural and functional materials. MMCs offer improvements in strength, rigidity, temperature stability, wear resistance, reliability and control of physical properties such as density and coefficient of thermal expansion, thereby providing improved engineering performance compared to the un-reinforced matrix. Experiments were conducted at various cutting speeds and feed rates and with different cutting tools according to the Taguchi method, using a standard L9 orthogonal array. The volume fraction of AlN reinforcement particles in the MMC was 10%. The milling process was carried out under dry cutting conditions using uncoated carbide, TiN and TiCN tool inserts. The cutting speeds were 230, 300 and 370 m/min and the feed rates 0.4, 0.6 and 0.8 mm/tooth, with a constant depth of cut of 0.3 mm and a tool diameter of 20 mm. The workpiece was an MMC block 150 mm long, 100 mm wide and 30 mm thick. Surface roughness was measured in detail using a Mitutoyo Surftest SJ-310 portable surface roughness tester. Analysis of the S/N ratios showed that a combination of low cutting speed, medium feed rate and the uncoated insert gives a remarkable surface finish. The ANOVA results showed that feed rate was the major contributing factor (43.76%), followed by the type of insert (40.89%).
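The S/N ranking underlying a Taguchi analysis of surface roughness can be sketched with the smaller-the-better formula S/N = −10·log10(mean(y²)): for each factor level, the responses are pooled and the level with the highest S/N is preferred. The roughness values and factor levels below are hypothetical, not the paper's measurements.

```python
import math

def sn_smaller_better(values):
    # Taguchi signal-to-noise ratio for a 'smaller is better' response (dB)
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# hypothetical Ra measurements (um) pooled by the three levels of one factor
ra_by_level = {1: [0.42, 0.45], 2: [0.55, 0.60], 3: [0.38, 0.36]}
sn = {lvl: sn_smaller_better(vals) for lvl, vals in ra_by_level.items()}
best_level = max(sn, key=sn.get)  # highest S/N = best surface finish
```

Repeating this per factor of the L9 array gives the main-effects ranking, while ANOVA on the same responses apportions the percent contributions quoted in the abstract.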

Keywords: MMC, milling operation, surface roughness, Taguchi method

Procedia PDF Downloads 529
16568 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing at larger volumes. As components are integrated, devices are tested for their full functionality using advanced software tools, and benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, they are custom built for every product and remain unusable for other variants; a majority of the tests go undocumented, are not updated, and become unusable once the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all of these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, AutoTest, KernelCI, etc.) designed for testing embedded devices, each with several unique strengths, but no single tool or framework satisfies all the testing needs of embedded systems; hence the case for an extensible framework integrating a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods develop test libraries and support components for every new hardware platform in the same domain with identical hardware architecture. This approach has drawbacks: platform-specific libraries cannot be reused, source infrastructure must be maintained for each individual hardware platform, and, most importantly, test cases must be re-developed for new hardware platforms. These limitations create challenges in test environment setup, scalability, and maintenance.
A desirable strategy is one that maximizes reusability and continuous integration and leverages artifacts across the complete development cycle, across phases of testing and across a family of products. To overcome the stated challenges of the conventional method and deliver the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed that can be deployed in embedded products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time to market: it accelerates board bring-up with prepackaged test suites supporting all necessary peripherals, speeding up the design and development stages (board bring-up, manufacturing and device drivers); (2) reusability: framework components are isolated from platform-specific hardware initialization and configuration, making adaptation of test cases across platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integration with Jenkins enables continuous testing and automated software updates. Applying the embedded test framework accelerator throughout the design and development phases enables well-tested systems before functional verification and improves time to market to a large extent.

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 145
16567 Post-Occupancy Evaluation of Greenways Based on Multi-Source Data: A Case Study of Jincheng Greenway in Chengdu

Authors: Qin Zhu

Abstract:

Under the Park City development concept, the Tianfu greenway system, as a basic, pre-configured element of Chengdu's global park construction, connects urban open space with linear and circular structures and carries the ecological, cultural and recreational functions of the park system. Chengdu's greenway construction is in full swing. In the process of greenway planning and construction, however, the landscape effect of greenways on urban quality is valued more highly, while the long-term impact of the crowd experience on the sustainable development of greenways is often ignored. It is therefore very important to test the effectiveness of greenway construction from the users' perspective. Taking the Jincheng Greenway in Chengdu as an example, this paper introduces multi-source data to construct a post-occupancy evaluation model for greenways and adopts behavior mapping, questionnaire surveys, web text analysis and IPA (importance-performance analysis) to comprehensively evaluate users' behavioral characteristics and satisfaction. The evaluation results capture users' actual behavior patterns and overall needs, so that the experience of built greenways can be fed back in time, providing guidance for the optimization of existing greenways and the planning and construction of future ones.

Keywords: multi-source data, greenway, IPA analysis, post-occupancy evaluation (POE)

Procedia PDF Downloads 61
16566 The Development Trend of Islamic Inscriptions in the Building Portals of Dezfoul City

Authors: Mahnoush Mahmoudi, Ali Chaeedeh

Abstract:

In the architecture of traditional Iranian houses, the ornamentation of the inscriptions on house entrance portals expresses the identity of the architects and the character of the owners, rooted in their religious and national beliefs. The main hypothesis of this research is that the form and the use of religious content in the epigraphs of house entrance portals changed in accordance with the thoughts and beliefs of the people of the historical city of Dezfoul. The objective of this study is to review the development of the texts, concepts and form of the inscriptions, and to analyze the factors affecting their quality and diversity. This is an applied study using a descriptive-analytical method, with data collected through library and survey studies. The population includes historical houses, houses damaged in the Iran-Iraq war and later renovated, and newly built houses in Dezfoul, from the Qajar era to the present. Random sampling was applied, with the dispersal area covering the whole city, and the data were analyzed both qualitatively and quantitatively. The results indicate that the inscriptions on the entrance portals of houses in Dezfoul today are very simple and of lower aesthetic value compared to those of the Qajar and Pahlavi eras. One cause of this formal and contextual gap between the inscriptions appears to be the war and the renovations during and after the destruction.

Keywords: architecture, Islamic architecture, reconstruction, epigraph, inscription, entrance portal, Dezfoul

Procedia PDF Downloads 245
16565 Electronically Controlled Motorized Steering System (E-Mo Steer)

Authors: M. Prasanth, V. Nithin, R. Keerthana, S. Kalyani

Abstract:

In the current scenario, the steering system in automobiles transfers motion from the steering wheel to the driving wheels through mechanical linkages. In this paper, we propose a steering mechanism that uses servomotors to turn the wheels instead of linkages. In this method, a steering angle sensor senses the turn angle of the steering wheel, and its output is processed by an electronic control module (ECM). The ECM compares the angle value with a standard value from a look-up database and then gives the appropriate input power and turning duration to the motors. Correspondingly, the motors turn the wheels by means of bevel gears welded to the motor output shafts and the wheel hubs. Thus, the wheels are turned without a complicated framework of linkages, reducing the driver's effort and fatigue considerably.
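The ECM's look-up step described above can be sketched as a simple table interpolation: a sensed steering-wheel angle is mapped to a target road-wheel angle for the servomotors. The table values below are illustrative assumptions, not figures from the paper.

```python
# Steer-by-wire look-up sketch: linearly interpolate the sensed
# steering-wheel angle (degrees) into a target road-wheel angle that
# the ECM would command the servomotors to reach.

LOOKUP = [(-180.0, -30.0), (-90.0, -14.0), (0.0, 0.0), (90.0, 14.0), (180.0, 30.0)]

def target_wheel_angle(steering_deg):
    pts = LOOKUP
    if steering_deg <= pts[0][0]:      # clamp below the table
        return pts[0][1]
    if steering_deg >= pts[-1][0]:     # clamp above the table
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= steering_deg <= x1:
            t = (steering_deg - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)  # linear interpolation in the segment

print(target_wheel_angle(45.0))
```

A production ECM would additionally rate-limit the command and close the loop on a wheel-angle feedback sensor.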

Keywords: electronic control unit, linkage-less steering, servomotors, E-Mo Steer

Procedia PDF Downloads 263
16564 Robotic Arm Control with Neural Networks Using Genetic Algorithm Optimization Approach

Authors: Arbnor Pajaziti, Hasan Cana

Abstract:

In this paper, a structural genetic algorithm is used to optimize a neural network that controls the joint movements of a robotic arm. The robotic arm has also been modeled in 3D and simulated in real time in MATLAB. Neural networks are found to provide a simple and effective way to control the robot's tasks, and computer simulation examples are given to illustrate the significance of the method. By combining genetic algorithm optimization with neural networks for the given robotic arm with 5 D.O.F., the results show that the overshoot time of the base joint movement was about 0.5 seconds without a controller and about 0.2 seconds with the neural network controller optimized by the genetic algorithm; a population size of 150 gave the best results.
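The core idea of evolving a neural controller with a genetic algorithm can be shown in miniature: below, a GA evolves the weights of a tiny 1-3-1 tanh network so that it approximates a target function. This is an illustrative stand-in under simplified assumptions (function fitting rather than joint control, fixed topology rather than the paper's structural GA), not the authors' code.

```python
import math, random
random.seed(0)

XS = [i / 10 - 1 for i in range(21)]     # inputs in [-1, 1]
TARGET = [x * x for x in XS]             # toy target function to approximate

def predict(w, x):
    # w packs: 3 hidden weights, 3 hidden biases, 3 output weights, 1 output bias
    h = [math.tanh(w[i] * x + w[3 + i]) for i in range(3)]
    return sum(w[6 + i] * h[i] for i in range(3)) + w[9]

def fitness(w):
    """Sum of squared errors over the grid (lower is better)."""
    return sum((predict(w, x) - t) ** 2 for x, t in zip(XS, TARGET))

def evolve(pop_size=60, gens=80, sigma=0.3):
    pop = [[random.uniform(-2, 2) for _ in range(10)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, 10)
            child = a[:cut] + b[cut:]            # one-point crossover
            child = [g + random.gauss(0, sigma) if random.random() < 0.2 else g
                     for g in child]             # Gaussian mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve()
print(round(fitness(best), 3))
```

In the paper's setting the fitness would instead be a control-performance measure such as overshoot time of the simulated joints.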

Keywords: robotic arm, neural network, genetic algorithm, optimization

Procedia PDF Downloads 523
16563 Numerical Solution to Coupled Heat and Moisture Diffusion in Bio-Sourced Composite Materials

Authors: Mnasri Faiza, El Ganaoui Mohammed, Khelifa Mourad, Gabsi Slimane

Abstract:

The main objective of this paper is to describe the hygrothermal behavior of a porous construction material subjected to a temperature gradient. The proposed construction is a bi-layer structure composed of two different materials: the first is a bio-sourced panel named IBS-AKU (inertia building system), and the second is the Neopor material. This system (IBS-AKU Neopor) is developed by a Belgian company (Isohabitat). The study considers a multi-layer structure of the IBS-AKU panel in one dimension. A numerical method is then proposed, using the finite element method with a mesh refined in areas of strong gradients. The evolution of the temperature field and the moisture content is computed and discussed.
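The coupled transport problem can be illustrated with a much simpler scheme than the paper's refined finite element model: a 1-D explicit finite-difference sketch of heat and moisture diffusion through a wall layer, with a weak coupling term. All coefficients, dimensions, and boundary values below are illustrative assumptions.

```python
# 1-D explicit finite-difference sketch of coupled heat and moisture
# diffusion (a simplification of the paper's FEM approach; illustrative
# coefficients only, not IBS-AKU material data).

N, L = 21, 0.1                 # grid nodes, wall thickness [m]
dx = L / (N - 1)
dt = 0.05                      # time step [s] (satisfies explicit stability)
a_T, a_w = 1e-4, 5e-5          # thermal / moisture diffusivities [m^2/s]
k_Tw = 1e-6                    # weak coupling: moisture gradient drives heat

T = [20.0] * N                 # initial temperature [degC]
w = [0.05] * N                 # initial moisture content [kg/kg]
T[0], T[-1] = 40.0, 20.0       # hot inner face, cool outer face (fixed)
w[0], w[-1] = 0.12, 0.05       # moist inner face (fixed)

for _ in range(2000):          # march 100 s of simulated time
    lap = lambda u, i: (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2
    T_new, w_new = T[:], w[:]
    for i in range(1, N - 1):
        T_new[i] = T[i] + dt * (a_T * lap(T, i) + k_Tw * lap(w, i))
        w_new[i] = w[i] + dt * a_w * lap(w, i)
    T, w = T_new, w_new

print(round(T[N // 2], 2), round(w[N // 2], 4))
```

A finite element formulation replaces the uniform stencil with element-wise integrals, which is what allows the local mesh refinement the paper relies on near strong gradients.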

Keywords: heat transfer, moisture diffusion, porous media, composite IBS-AKU, simulation

Procedia PDF Downloads 506
16562 Plastic Pipe Defect Detection Using Nonlinear Acoustic Modulation

Authors: Gigih Priyandoko, Mohd Fairusham Ghazali, Tan Siew Fun

Abstract:

This paper discusses the defect detection of plastic pipes using the nonlinear acoustic wave modulation method. It is a sensitive damage detection method based on the propagation of high-frequency acoustic waves in a plastic pipe under low-frequency excitation. The plastic pipe is excited simultaneously with a slowly amplitude-modulated low-frequency pumping wave and a constant-amplitude high-frequency probing wave. The frequencies of both excitation signals coincide with resonances of the plastic pipe. A PVC pipe is used as the specimen, as it is commonly used for the conveyance of liquids in many fields. The results show that uncracked and cracked specimens can be clearly distinguished.
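The signature behind this method is that a crack makes the probing wave mix with the pumping wave, producing sidebands at f_probe ± f_pump in the response spectrum. The sketch below synthesizes that effect and measures one sideband with a single-bin DFT; the signal model and amplitudes are synthetic, not measured data from the paper.

```python
import math

fs, n = 8000, 8000                 # sample rate [Hz], samples (1 s record)
f_pump, f_probe = 30.0, 1000.0     # low-frequency pump, high-frequency probe

def amplitude(signal, f):
    """Normalised single-bin DFT magnitude at frequency f."""
    re = sum(s * math.cos(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / len(signal)

def response(damage):
    """damage=0: linear pipe; damage>0: the crack amplitude-modulates the probe."""
    out = []
    for i in range(n):
        t = i / fs
        pump = math.sin(2 * math.pi * f_pump * t)
        probe = (1 + damage * pump) * math.sin(2 * math.pi * f_probe * t)
        out.append(0.2 * pump + probe)
    return out

for damage in (0.0, 0.3):
    sideband = amplitude(response(damage), f_probe + f_pump)
    print(damage, round(sideband, 3))
```

A practical detector would track the sideband-to-carrier ratio, which grows with crack severity.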

Keywords: plastic pipe, defect detection, nonlinear acoustic modulation, excitation

Procedia PDF Downloads 451
16561 An Intelligent Search and Retrieval System for Mining Clinical Data Repositories Based on Computational Imaging Markers and Genomic Expression Signatures for Investigative Research and Decision Support

Authors: David J. Foran, Nhan Do, Samuel Ajjarapu, Wenjin Chen, Tahsin Kurc, Joel H. Saltz

Abstract:

The large-scale data and computational requirements of investigators throughout the clinical and research communities demand an informatics infrastructure that supports both existing and new investigative and translational projects in a robust, secure environment. In some subspecialties of medicine and research, the capacity to generate data has outpaced the methods and technology used to aggregate, organize, access, and reliably retrieve this information. Leading health care centers now recognize the utility of establishing an enterprise-wide clinical data warehouse. The primary benefits that can be realized through such efforts include cost savings, efficient tracking of outcomes, advanced clinical decision support, improved prognostic accuracy, and more reliable clinical trial matching. The overarching objective of the work presented here is the development and implementation of a flexible Intelligent Retrieval and Interrogation System (IRIS) that exploits the combined use of computational imaging, genomics, and data-mining capabilities to facilitate clinical assessments and translational research in oncology. The proposed System includes a multi-modal Clinical & Research Data Warehouse (CRDW) that is tightly integrated with a suite of computational and machine-learning tools to provide insight into underlying tumor characteristics that are not apparent from human inspection alone. A key distinguishing feature of the System is a configurable Extract, Transform and Load (ETL) interface that enables it to adapt to different clinical and research data environments. This project is motivated by the growing emphasis on establishing Learning Health Systems in which cyclical hypothesis generation and evidence evaluation become integral to improving the quality of patient care.
To facilitate iterative prototyping and optimization of the algorithms and workflows for the System, the team has already implemented a fully functional Warehouse that can reliably aggregate information originating from multiple data sources including EHRs, Clinical Trial Management Systems, Tumor Registries, Biospecimen Repositories, Radiology PACS, Digital Pathology archives, unstructured clinical documents, and Next Generation Sequencing services. The System enables physicians to systematically mine and review the molecular, genomic, image-based, and correlated clinical information about patient tumors individually or as part of large cohorts to identify patterns that may influence treatment decisions and outcomes. The CRDW core system has facilitated peer-reviewed publications and funded projects, including an NIH-sponsored collaboration to enhance the cancer registries in Georgia, Kentucky, New Jersey, and New York with machine-learning-based classifications and quantitative pathomics feature sets. The CRDW has also resulted in a collaboration with the Massachusetts Veterans Epidemiology Research and Information Center (MAVERIC) at the U.S. Department of Veterans Affairs to develop algorithms and workflows to automate the analysis of lung adenocarcinoma. Those studies showed that combining computational nuclear signatures with traditional WHO criteria through the use of deep convolutional neural networks (CNNs) led to improved discrimination among tumor growth patterns. The team has also leveraged the Warehouse to support studies investigating the potential of utilizing a combination of genomic and computational imaging signatures to characterize prostate cancer. The results of those studies show that integrating image biomarkers with genomic pathway scores is more strongly correlated with disease recurrence than using standard clinical markers.
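A configurable ETL interface of the kind the abstract describes can be sketched as a declarative field mapping: a small config adapts records from different source systems to one warehouse schema. All field names and transforms below are hypothetical, invented for illustration.

```python
# Config-driven ETL mapping sketch: each warehouse field declares which
# source field it comes from and how to normalise it. Adapting to a new
# data source means swapping the config, not the code.

CONFIG = {
    "mrn":       {"source": "patient_id", "transform": str.strip},
    "diagnosis": {"source": "dx_code",    "transform": str.upper},
    "age":       {"source": "age_years",  "transform": int},
}

def load_record(raw, config=CONFIG):
    """Apply the mapping to one raw source record."""
    return {target: spec["transform"](raw[spec["source"]])
            for target, spec in config.items()}

row = {"patient_id": " 00123 ", "dx_code": "c61", "age_years": "67"}
print(load_record(row))
```

Real ETL layers add validation, terminology mapping, and provenance tracking on top of this basic pattern.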

Keywords: clinical data warehouse, decision support, data-mining, intelligent databases, machine-learning

Procedia PDF Downloads 127
16560 An Overbooking Model for Car Rental Service with Different Types of Cars

Authors: Naragain Phumchusri, Kittitach Pongpairoj

Abstract:

Overbooking is a very useful revenue management technique that can help reduce the costs caused by either undersales or oversales. In this paper, we propose an overbooking model for two types of cars that minimizes the total cost of a car rental service. With two types of cars, there is the possibility of upgrading from the lower type to the upper type, which makes the model more complex than the single-type scenario. We have found that convexity can be proved in this case. Sensitivity analysis is conducted to observe the effects of the relevant parameters on the optimal solution. A model simplification is proposed using multiple linear regression analysis, which can estimate the optimal overbooking level from appropriate independent variables. The results show that the overbooking level from the multiple linear regression model is relatively close to the optimal solution (with an adjusted R-squared value of at least 72.8%). To evaluate the performance of the proposed model, the total cost was compared with the case where the decision maker uses a naïve method to set the overbooking level. It was found that the total cost of the optimal solution is on average only 0.5 to 1 percent lower than the cost from the regression model, while it is approximately 67% lower than the cost obtained by the naïve method. This indicates that the proposed regression-based simplification performs effectively in estimating the overbooking level.
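The trade-off the model balances can be shown in the single-car-type case: choose the booking limit that minimizes expected cost when each reservation shows up with some probability, with one unit cost for idle cars (undersale) and another for denied customers (oversale). The capacity, costs, and show-up probability below are illustrative assumptions; the paper's two-type model with upgrades is more involved.

```python
import math

def expected_cost(b, capacity=20, p=0.8, c_under=100.0, c_over=250.0):
    """Expected cost of accepting b bookings when each shows up w.p. p."""
    cost = 0.0
    for k in range(b + 1):                               # k customers show up
        prob = math.comb(b, k) * p ** k * (1 - p) ** (b - k)
        cost += prob * (c_under * max(capacity - k, 0)   # idle cars
                        + c_over * max(k - capacity, 0)) # denied customers
    return cost

best = min(range(20, 31), key=expected_cost)             # search booking limits
print(best, round(expected_cost(best), 2))
```

The paper's regression simplification would fit the resulting optimal levels against parameters such as p and the cost ratio, so the limit can be estimated without re-solving the model.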

Keywords: overbooking, car rental industry, revenue management, stochastic model

Procedia PDF Downloads 172
16559 Separation of Composites for Recycling: Measurement of Electrostatic Charge of Carbon and Glass Fiber Particles

Authors: J. Thirunavukkarasu, M. Poulet, T. Turner, S. Pickering

Abstract:

Composite waste from manufacturing can consist of different fiber materials, including blends of different fibers. Commercially, the recycling of composite waste is currently limited to carbon fiber waste; recycling glass fiber waste is not economically viable due to the low cost of virgin glass fiber and the reduced mechanical properties of the recovered fibers. For this reason, hybrid fiber materials, where carbon fiber is combined with a proportion of glass fiber, cannot be processed economically, and a separation method is required to remove the glass fiber during the recycling process. An electrostatic separation method is chosen for this work because of the significant difference between the electrical properties of carbon and glass fibers. In this study, an experimental rig has been developed to measure the electrostatic charge achieved as the materials pass through a tube. A range of particle lengths (80-100 µm, 6 mm and 12 mm), surface state conditions (0%SA, 2%SA and 6%SA), and several tube wall materials have been studied. A polytetrafluoroethylene (PTFE) tube with recycled fibers free of sizing agent was identified as the most suitable combination for the electrostatic separation method. It was also found that shorter fiber lengths helped to encourage particle flow and attain higher charge values. These findings can be used to develop a separation process enabling the cost-effective recycling of hybrid fiber composite waste.

Keywords: electrostatic charging, hybrid fiber composites, recycling, short fiber composites

Procedia PDF Downloads 129
16558 Query Task Modulator: A Computerized Experimentation System to Study Media-Multitasking Behavior

Authors: Premjit K. Sanjram, Gagan Jakhotiya, Apoorv Goyal, Shanu Shukla

Abstract:

In psychological research, laboratory experiments often face a trade-off between experimental control and mundane realism. With the advent of Immersive Virtual Environment Technology (IVET), this issue seems to be at bay. However, there is a growing challenge within IVET itself to design and develop systems or software that capture the psychological phenomena of everyday life. One such phenomenon of growing interest is 'media-multitasking'. To aid laboratory research on media-multitasking, this paper introduces the Query Task Modulator (QTM), a computerized experimentation system to study media-multitasking behavior in a controlled laboratory environment. The system provides a computerized platform on which experimenters can study media-multitasking while participants are engaged in a query task. The system has instant messaging, e-mail, and voice call features. The answers to queries are provided in an information panel on the left-hand side, where participants have to search for them and feed the information into the respective communication media blocks as fast as possible. Overall, the system collects multitasking behavioral data. To analyze performance, a separate output table records the reaction times and responses of each participant. The information panel and all media blocks appear in a single window in order to ensure multi-modality in media-multitasking and equal emphasis on all tasks (thus avoiding prioritization of a particular task). The paper discusses the development of QTM in the light of current techniques for studying media-multitasking.

Keywords: experimentation system, human performance, media-multitasking, query-task

Procedia PDF Downloads 557
16557 Vibration of Nonhomogeneous Timoshenko Nanobeam Resting on Winkler-Pasternak Foundation

Authors: Somnath Karmakar, S. Chakraverty

Abstract:

This work investigates the vibration of a nonhomogeneous Timoshenko nanobeam resting on a Winkler-Pasternak foundation. Eringen's nonlocal theory has been used to capture small-scale effects. The differential quadrature method is used to obtain the frequency parameters under various classical boundary conditions. A nonhomogeneous beam model is considered, in which Young's modulus and the density of the beam material vary linearly and quadratically. Convergence of the frequency parameters is also discussed. The influence of mechanical properties and scaling parameters on the vibration frequencies is investigated for different boundary conditions.
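The core step of the differential quadrature method is to express derivatives at grid points as weighted sums of the function values, D @ f. The sketch below builds the first-order weighting matrix on a Chebyshev-Gauss-Lobatto grid using the standard Lagrange-polynomial formulas and checks it on a cubic; it illustrates the method only, not the full beam eigenvalue solver.

```python
import math

N = 9
# Chebyshev-Gauss-Lobatto grid mapped to [0, 1]
x = [0.5 * (1 - math.cos(math.pi * i / (N - 1))) for i in range(N)]

def dq_matrix(x):
    """Off-diagonal weights a_ij = M'(x_i) / ((x_i - x_j) M'(x_j)),
    where M'(x_i) = prod_{k != i} (x_i - x_k); each diagonal term makes
    its row sum to zero (the derivative of a constant is zero)."""
    n = len(x)
    Mp = [math.prod(x[i] - x[k] for k in range(n) if k != i) for i in range(n)]
    D = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i][j] = Mp[i] / ((x[i] - x[j]) * Mp[j])
        D[i][i] = -sum(D[i][j] for j in range(n) if j != i)
    return D

D = dq_matrix(x)
f = [xi ** 3 for xi in x]                 # sample f(x) = x^3 on the grid
df = [sum(D[i][j] * f[j] for j in range(N)) for i in range(N)]
print(max(abs(df[i] - 3 * x[i] ** 2) for i in range(N)))  # near machine precision
```

In the beam problem, powers of this matrix discretize the governing equations and boundary conditions, turning the free-vibration problem into an algebraic eigenvalue problem for the frequency parameters.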

Keywords: Timoshenko beam, Eringen's nonlocal theory, differential quadrature method, nonhomogeneous nanobeam

Procedia PDF Downloads 115
16556 Temperature-Responsive Shape Memory Polymer Filament Integrated Smart Polyester Knitted Fabric Featuring Memory Behavior

Authors: Priyanka Gupta, Bipin Kumar

Abstract:

Recent developments in smart materials motivate researchers to create novel textile products for innovative and functional applications, with several potential uses beyond the conventional. This study investigates the memory behavior of shape memory filaments integrated into a knitted textile structure, advancing knowledge of how these intelligent materials respond within textile structures. This integration may also open new avenues for developing smart fabrics with unique sensing and actuation capabilities. A shape memory filament and polyester yarn were knitted together to produce a shape memory knitted fabric (SMF). A thermo-mechanical tensile test was carried out to quantify the memory behavior of the SMF under different conditions. The experimental findings demonstrate excellent shape recovery (100%) and shape fixity up to 88% at different strains (20% and 60%) and temperatures (30 ℃ and 50 ℃). The results also reveal that the memory filament behaves differently in a fabric structure than in its pristine condition at various temperatures and strains. Cyclic testing of the SMF under different thermo-mechanical conditions indicated complete shape recovery with an increase in shape fixity, so a fully recoverable textile structure was achieved after a few initial cycles. These intelligent textiles are beneficial for the development of novel, innovative, and functional fabrics such as elegant curtains, pressure garments, and compression stockings. In addition to fashion and medical uses, this unique feature may also be leveraged to build textile-based sensors and actuators.
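The two figures of merit quoted above are conventionally defined as ratios of strains over a programming cycle. The sketch below states those standard definitions; the strain values are invented to echo the reported ~88% fixity and 100% recovery, not measured data.

```python
# Standard shape memory cycle metrics:
#   shape fixity   R_f = (strain retained after unloading) / (programmed strain)
#   shape recovery R_r = (strain recovered on reheating)   / (programmed strain)

def shape_fixity(fixed_strain, programmed_strain):
    return 100.0 * fixed_strain / programmed_strain

def shape_recovery(programmed_strain, residual_strain):
    return 100.0 * (programmed_strain - residual_strain) / programmed_strain

print(round(shape_fixity(17.6, 20.0), 1))   # hypothetical: 17.6% retained of 20%
print(round(shape_recovery(20.0, 0.0), 1))  # no residual strain -> full recovery
```

Tracking both ratios over repeated thermo-mechanical cycles is what reveals the stabilization the study reports after the first few cycles.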

Keywords: knitting, memory filament, shape memory, smart textiles, thermo-mechanical cycle

Procedia PDF Downloads 89
16555 Assessing the Effects of Entrepreneurship Education and Moderating Variables on Venture Creation Intention of Undergraduate Students in Ghana

Authors: Daniel K. Gameti

Abstract:

The paper explored the effects of active and passive entrepreneurship education methods on the venture creation intention of undergraduate students in Ghana. The study also examined the moderating effects of gender and negative personal characteristics (risk tolerance, stress tolerance, and fear of failure) on students' venture creation intention. A deductive approach was used to collect quantitative data from 555 business students at one public university and one private university through self-administered questionnaires. Descriptive statistics were used to determine the dominant method of entrepreneurship education used in Ghana, and a structural equation model was used to test four hypotheses. The results show that the dominant method of education used in Ghana was lectures, and the least used was field trips. The study further revealed that passive methods of education are less effective than active methods, which had a statistically significant effect on venture creation intention among students. There was also a statistically significant difference between male and female students' venture creation intention, with the effect stronger among male students. Finally, the only personal characteristic that influenced students' intention was stress tolerance; risk tolerance and fear of failure were statistically insignificant.

Keywords: entrepreneurship education, Ghana, moderating variables, venture creation intention, undergraduate students

Procedia PDF Downloads 453
16554 Effect of Alkaline Activator, Water, Superplasticiser and Slag Contents on the Compressive Strength and Workability of Slag-Fly Ash Based Geopolymer Mortar Cured under Ambient Temperature

Authors: M. Al-Majidi, A. Lampropoulos, A. Cundy

Abstract:

Geopolymer (cement-free) concrete is the most promising green alternative to ordinary Portland cement concrete and other cementitious materials. While a range of different geopolymer concretes have been produced, a common feature of these concretes is a heat curing treatment, which is essential in order to provide sufficient mechanical properties at an early age. However, there are several practical issues with the application of heat curing in large-scale structures. The purpose of this study is to develop cement-free concrete without heat curing. Experimental investigations were carried out in two phases. In the first phase (Phase A), the optimum contents of water, polycarboxylate-based superplasticizer, and potassium silicate activator in the mix were determined. In the second phase (Phase B), the effect of ground granulated blast furnace slag (GGBFS) incorporation on the compressive strength of fly ash (FA) and slag based geopolymer mixtures was evaluated. Setting time and workability tests were also conducted alongside the compressive tests. The results showed that as the slag content increased, the setting time was reduced while the compressive strength improved; the obtained compressive strength was in the range of 40-50 MPa for the 50% slag replacement mixtures. Furthermore, increasing the water and superplasticizer contents resulted in retardation of the setting time and a slight reduction in compressive strength, while the compressive strength of the examined mixes increased considerably as the potassium silicate content was increased.

Keywords: fly ash, geopolymer, potassium silicate, slag

Procedia PDF Downloads 223
16553 The Application of the Taguchi Method to Optimize Pellet Quality in Broiler Feeds

Authors: Reza Vakili

Abstract:

The aim of this experiment was to optimize the effects of moisture, production rate, grain particle size, and steam conditioning temperature on pellet quality in broiler feed using the Taguchi method, and a 4³ fractional factorial arrangement was conducted. Production rate, steam conditioning temperature, particle size, and moisture content were varied. Sampling was done during the production process, and the pellet durability index (PDI) and hardness were then evaluated for broiler grower and finisher feeds. The processing parameters had a significant effect on PDI and hardness. Based on the results of this experiment, the Taguchi method can be used to find the best combination of factors for optimal pellet quality.
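The Taguchi analysis behind such an optimization ranks factor levels by a signal-to-noise (S/N) ratio; for a quality measure like PDI, where higher is better, the "larger-the-better" form applies. The sketch below computes it for two hypothetical conditioning temperatures; the PDI replicates are made up for illustration, not data from the experiment.

```python
import math

def sn_larger_better(ys):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y_i^2))."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

# hypothetical PDI replicates at two steam conditioning temperatures
pdi_at_75C = [88.0, 90.0, 89.0]
pdi_at_85C = [93.0, 94.0, 92.5]

for label, ys in (("75C", pdi_at_75C), ("85C", pdi_at_85C)):
    print(label, round(sn_larger_better(ys), 2))
```

The factor level with the highest mean S/N across the orthogonal-array runs is selected, which is how the best parameter combination for pellet quality is identified.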

Keywords: broiler, feed physical quality, hardness, processing parameters, PDI

Procedia PDF Downloads 187