Search results for: multiple P/D point
9226 Women's Contemporary Dystopias: Feminist Protagonists Taking Back Control
Authors: Natalia Fontes De Oliveira
Abstract:
The Canadian author Margaret Atwood deconstructs the tainted dichotomies between women and men by embracing disorder throughout her dystopias. In Atwood’s The Testaments, nature serves as a background to the story as well as a metaphorical expression of the characters’ states of mind; nevertheless, the protagonists’ nature writing conveys a curiosity that questions the pre-established sanctions of a docile garden, viewing nature as an autonomous entity, especially when they are away from the confines of Gilead’s regime. The three narrating protagonists, Agnes, Aunt Lydia, and Nicole, use nature writing subversively as a form of rebellion. This paper investigates how the three protagonists narrate nature from an intimist point of view, with the sensibility to observe the multiple relationships among humanity, nature, and the impositions of a theocratic, ultra-conservative patriarchal society.
Keywords: contemporary literature, dystopias, feminism, women’s writing
Procedia PDF Downloads 169
9225 Long Term Evolution Multiple-Input Multiple-Output Network in Unmanned Air Vehicles Platform
Authors: Ashagrie Getnet Flattie
Abstract:
Line-of-sight (LOS) availability, data rates, quality of service, and network flexibility are limited by the fact that, for the duration of any given connection, links experience severe variation in signal strength due to fading and path loss. Wireless systems face major challenges in achieving wide coverage and high capacity without degrading performance, while providing data access everywhere, all the time. In this paper, the cell coverage and edge rate of different multiple-input multiple-output (MIMO) schemes in a 20 MHz Long Term Evolution (LTE) system on an unmanned air vehicle (UAV) platform are investigated. After some background on the enormous potential of UAV, MIMO, and LTE in wireless links, the paper presents a system model that attempts to realize the various benefits of incorporating MIMO into a UAV platform. The performance of three MIMO LTE schemes is compared with that of 4x4 MIMO LTE on the UAV platform to evaluate the improvement in cell radius, bit error rate (BER), and data throughput of the system in different morphologies. The results show that significant performance gains in BER, data rate, and coverage can be achieved using the presented scenario.
Keywords: LTE, MIMO, path loss, UAV
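The capacity gain that motivates adding MIMO to such a link can be illustrated with the standard equal-power Shannon capacity formula. This is a generic sketch, not the abstract's simulation model; the channel matrices, SNR, and antenna counts below are assumptions for illustration only.

```python
import numpy as np

def mimo_capacity_bps_hz(h, snr_linear):
    """Shannon capacity (bit/s/Hz) of a MIMO channel H with equal
    power allocation across the nt transmit antennas:
    C = log2 det(I + (SNR/nt) * H H^H)."""
    nr, nt = h.shape
    m = np.eye(nr) + (snr_linear / nt) * (h @ h.conj().T)
    # slogdet is numerically stabler than log2(det(m)) directly
    _, logdet = np.linalg.slogdet(m)
    return logdet / np.log(2)

rng = np.random.default_rng(0)
snr = 10 ** (10 / 10)  # 10 dB
# Rayleigh-fading draws: 1x1 (SISO) vs 4x4 MIMO
h1 = (rng.standard_normal((1, 1)) + 1j * rng.standard_normal((1, 1))) / np.sqrt(2)
h4 = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
c1 = mimo_capacity_bps_hz(h1, snr)
c4 = mimo_capacity_bps_hz(h4, snr)
print(f"SISO: {c1:.2f} bit/s/Hz, 4x4 MIMO: {c4:.2f} bit/s/Hz")
```

At a fixed SNR, the 4x4 channel supports several parallel spatial streams, which is the effect the abstract's 4x4 MIMO LTE scheme exploits.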
Procedia PDF Downloads 279
9224 Supply Chain Optimisation through Geographical Network Modeling
Authors: Cyrillus Prabandana
Abstract:
Supply chain optimisation involves multiple factors as considerations or constraints, including but not limited to demand forecasting, raw material fulfilment, production capacity, inventory level, facility locations, transportation means, and manpower availability. By knowing all manageable factors involved and modelling uncertainty with pre-defined percentage factors, an integrated supply chain model can be developed to manage various business scenarios. This paper analyses the use of a geographical point of view to develop an integrated supply chain network model that optimises the distribution of finished products according to forecasted demand and available supply. The supply chain optimisation model shows that a small change in one supply chain constraint can largely impact other constraints, and the new information from the model should support the decision-making process. The model focused on three areas: raw material fulfilment, production capacity, and finished product transportation. To validate its suitability, the model was implemented in a project aimed at optimising the concrete supply chain at a mining location. The high level of operational complexity and the involvement of multiple stakeholders in the concrete supply chain are believed to be sufficient to illustrate the larger scope. The implementation of this geographical supply chain network modelling resulted in an optimised concrete supply chain from raw material fulfilment to finished product distribution to each customer, as indicated by a lower percentage of missed concrete order fulfilments.
Keywords: decision making, geographical supply chain modeling, supply chain optimisation, supply chain
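A minimal sketch of the geographical ingredient of such a model is assigning each delivery point to its nearest facility by great-circle distance. The plant and customer names and coordinates below are hypothetical, and real models would add capacity and cost constraints on top of this.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def assign_to_nearest(customers, plants):
    """Assign each customer to the geographically nearest batching plant."""
    assignment = {}
    for name, (clat, clon) in customers.items():
        best = min(plants, key=lambda p: haversine_km(clat, clon, *plants[p]))
        assignment[name] = best
    return assignment

# Hypothetical coordinates for two batching plants and three delivery points
plants = {"plant_A": (-2.50, 115.50), "plant_B": (-2.90, 115.90)}
customers = {"pit_1": (-2.52, 115.48), "camp_2": (-2.88, 115.93), "port_3": (-2.60, 115.60)}
print(assign_to_nearest(customers, plants))
```

Changing one constraint (say, closing plant_A) re-routes every assignment, which mirrors the abstract's observation that a small change in one constraint can largely impact the others.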
Procedia PDF Downloads 346
9223 A Spatial Point Pattern Analysis to Recognize Fail Bit Patterns in Semiconductor Manufacturing
Authors: Youngji Yoo, Seung Hwan Park, Daewoong An, Sung-Shick Kim, Jun-Geol Baek
Abstract:
The yield management system is very important for producing high-quality semiconductor chips in the semiconductor manufacturing process. In order to improve the quality of semiconductors, various tests are conducted in the post-fabrication (FAB) process. During the test process, large amounts of data are collected, and the data include a lot of information about defects. In general, defects on the wafer are the main cause of yield loss; therefore, analyzing the defect data is necessary to improve the performance of yield prediction. The wafer bin map (WBM) is one of the datasets collected in the test process and includes defect information such as fail bit patterns. Fail bits have the characteristics of spatial point patterns; therefore, this paper proposes a feature extraction method using spatial point pattern analysis. Actual data obtained from a semiconductor process are used for experiments, and the experimental results show that the proposed method recognizes the fail bit patterns more accurately.
Keywords: semiconductor, wafer bin map, feature extraction, spatial point patterns, contour map
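One classic spatial point pattern feature of the kind the abstract alludes to is the mean nearest-neighbour distance, which separates clustered fail bits from dispersed ones. This is a generic illustration with toy wafer coordinates, not the paper's actual feature set.

```python
from math import hypot

def mean_nn_distance(points):
    """Mean nearest-neighbour distance of a 2D point pattern: low values
    indicate clustered fail bits, high values a dispersed pattern."""
    dists = []
    for i, (x1, y1) in enumerate(points):
        nn = min(hypot(x1 - x2, y1 - y2)
                 for j, (x2, y2) in enumerate(points) if i != j)
        dists.append(nn)
    return sum(dists) / len(dists)

# Toy wafer coordinates: a tight cluster of fail bits vs. a regular grid
clustered = [(10 + dx, 10 + dy) for dx in (0, 1, 2) for dy in (0, 1, 2)]
regular = [(10 * i, 10 * j) for i in range(3) for j in range(3)]
print(mean_nn_distance(clustered), mean_nn_distance(regular))
```

A classifier can then use such scalar features to discriminate systematic cluster-type defects (e.g. scratches, edge rings) from random particle defects.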
Procedia PDF Downloads 384
9222 Development of a Practical Screening Measure for the Prediction of Low Birth Weight and Neonatal Mortality in Upper Egypt
Authors: Prof. Ammal Mokhtar Metwally, Samia M. Sami, Nihad A. Ibrahim, Fatma A. Shaaban, Iman I. Salama
Abstract:
Objectives: Reducing neonatal mortality by 2030 is still a challenging goal in developing countries. Low birth weight (LBW) is a significant contributor to this, especially where weighing newborns is not routinely possible. The present study aimed to determine simple, easy, reliable anthropometric measure(s) that can predict LBW and neonatal mortality. Methods: A prospective cohort study of 570 babies born in districts of El Menia governorate, Egypt (where most deliveries occurred at home) was examined at birth. Newborn weight, length, and head, chest, mid-arm, and thigh circumferences were measured. The examined neonates were followed up during their first four weeks of life to record any mortality. The most predictive anthropometric measures were determined using SPSS, and multiple logistic regression analysis was performed. Results: Head and chest circumferences with cut-off points < 33 cm and ≤ 31.5 cm, respectively, were the significant predictors of LBW. They carried the best combination of the highest sensitivity (89.8% and 86.4%) and the lowest false negative predictive value (1.4% and 1.7%). Chest circumference with a cut-off point ≤ 31.5 cm was the significant predictor of neonatal mortality, with 83.3% sensitivity and a 0.43% false negative predictive value. Conclusion: Using chest circumference with a cut-off point ≤ 31.5 cm is recommended as a single, simple anthropometric measurement for the prediction of both LBW and neonatal mortality. This measure could substitute for weighing newborns in communities where scales are not routinely available.
Keywords: low birth weight, neonatal mortality, anthropometric measures, practical screening
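The two statistics the abstract reports for each cut-off, sensitivity and false negative predictive value, can be computed from a 2x2 screening table. The chest circumference values and outcomes below are invented for illustration and are not the study's data.

```python
def screening_stats(measurements, outcomes, cutoff):
    """Sensitivity and false negative predictive value for the screening
    rule 'measurement <= cutoff predicts the adverse outcome'.
    outcomes: True where the adverse outcome (e.g. LBW) occurred."""
    tp = sum(1 for m, o in zip(measurements, outcomes) if m <= cutoff and o)
    fn = sum(1 for m, o in zip(measurements, outcomes) if m > cutoff and o)
    tn = sum(1 for m, o in zip(measurements, outcomes) if m > cutoff and not o)
    sensitivity = tp / (tp + fn)
    # false negative predictive value: share of screen-negative babies
    # who nevertheless had the outcome
    fnpv = fn / (fn + tn)
    return sensitivity, fnpv

# Hypothetical chest circumferences (cm) and LBW outcomes
chest = [30.0, 31.0, 31.5, 32.0, 33.0, 34.0, 31.2, 33.5]
lbw = [True, True, True, False, False, False, True, True]
sens, fnpv = screening_stats(chest, lbw, cutoff=31.5)
print(f"sensitivity={sens:.2f}, false negative predictive value={fnpv:.2f}")
```

A practical screening cut-off trades these two quantities off: lowering the cutoff raises sensitivity but flags more healthy newborns.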
Procedia PDF Downloads 99
9221 Evaluation of Photovoltaic System with Different Research Methods of Maximum Power Point Tracking
Authors: Mehdi Ameur, Ahmed Essadki, Tamou Nasser
Abstract:
The purpose of this paper is the evaluation of a photovoltaic system with MPPT techniques. The system is developed by combining models of an established solar module and a DC-DC converter with the perturb and observe (P&O), incremental conductance (INC), and fuzzy logic controller (FLC) algorithms. The system is simulated under different climate conditions and MPPT algorithms to determine the influence of these conditions on the power-voltage characteristic of the PV system. According to comparisons of the simulation results, the photovoltaic system can extract the maximum power with precision and rapidity using the MPPT algorithms discussed in this paper.
Keywords: fuzzy logic controller, FLC, hill climbing, HC, incremental conductance, INC, perturb and observe, P&O, maximum power point, MPP, maximum power point tracking, MPPT
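Of the three algorithms compared, incremental conductance can be sketched in a few lines: at the MPP, dP/dV = 0, which for P = VI is equivalent to dI/dV = -I/V. The single-diode-style PV model and all its parameters below are assumptions for illustration, not the paper's module model.

```python
import math

def pv_current(v, iph=5.0, i0=1e-9, vt=1.0):
    """Illustrative PV string: I = Iph - I0*(exp(V/Vt) - 1).
    Parameters are invented, not from the paper."""
    return iph - i0 * (math.exp(v / vt) - 1.0)

def inc_mppt(v0=10.0, step=0.05, iters=400):
    """Incremental conductance: step V until dI/dV ~= -I/V."""
    v = v0
    i = pv_current(v)
    for _ in range(iters):
        v_new = v + step
        i_new = pv_current(v_new)
        di, dv = i_new - i, v_new - v
        # Sign of (dI/dV + I/V) tells which side of the MPP we are on
        if di / dv + i_new / v_new > 0:
            step = abs(step)    # left of the MPP: keep increasing V
        else:
            step = -abs(step)   # right of the MPP: decrease V
        v, i = v_new, i_new
    return v

v_mpp = inc_mppt()
# Brute-force reference MPP on a fine voltage grid, for comparison
v_ref = max((k / 100 for k in range(1, 2300)), key=lambda v: v * pv_current(v))
print(v_mpp, v_ref)
```

The tracker settles into a small oscillation around the true MPP; the fixed step size governs the usual speed/accuracy trade-off the paper's comparison explores.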
Procedia PDF Downloads 511
9220 Transfer Knowledge From Multiple Source Problems to a Target Problem in Genetic Algorithm
Authors: Terence Soule, Tami Al Ghamdi
Abstract:
To study how to transfer knowledge from multiple source problems to a target problem, we modeled the transfer learning (TL) process using genetic algorithms (GA) as the model solver. TL is the process of transferring learned data from one problem to another, with the aim of helping machine learning (ML) algorithms find a solution. GAs give researchers access to the information we have about how the old problems were solved. In this paper, we have five different source problems and transfer their knowledge to the target problem, studying different scenarios of the target problem. The results showed that combining knowledge from multiple source problems improves GA performance. Also, the process of combining knowledge from several problems promotes diversity in the transferred population.
Keywords: transfer learning, genetic algorithm, evolutionary computation, source and target
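The transfer step can be sketched as seeding part of the GA's initial population with individuals evolved on source problems. This is a toy one-max GA, not the authors' five source problems or their target scenarios; population sizes, rates, and the seed individuals are assumptions.

```python
import random

def ga(target_len=30, pop_size=40, gens=60, seed_pop=None, rng=None):
    """Simple elitist one-max GA; seed_pop injects individuals evolved
    on source problems (the transfer-learning step)."""
    rng = rng or random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(target_len)] for _ in range(pop_size)]
    if seed_pop:  # transfer: replace part of the random initial population
        pop[:len(seed_pop)] = [list(ind) for ind in seed_pop]
    for _ in range(gens):
        pop.sort(key=sum, reverse=True)
        survivors = pop[:pop_size // 2]          # elitism: best half survives
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, target_len)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.05:              # point mutation
                i = rng.randrange(target_len)
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(sum(ind) for ind in pop)

# "Source knowledge": partially optimized bit strings from five source runs
source_knowledge = [[1] * 20 + [0] * 10 for _ in range(5)]
best_transfer = ga(seed_pop=source_knowledge, rng=random.Random(1))
best_scratch = ga(rng=random.Random(1))
print(best_scratch, best_transfer)
```

Because the seeded individuals start with fitness 20 and elitism never discards the best individual, the transferred run is guaranteed a floor the from-scratch run lacks; mixing seeds from several sources also keeps the transferred population diverse.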
Procedia PDF Downloads 140
9219 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry
Authors: Dhanuj M. Gandikota
Abstract:
Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning/visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist in current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but its use is limited in both coverage and cost, requiring manual deployment to map out large forested areas. While aerial laser scanning (ALS) remains a reliable avenue of LIDAR active remote sensing, ALS is also cost-restrictive in its deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability for the accurate construction of vegetation 3D point clouds. It provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called deep learning (DL) that shows promise in recent research on 3D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing, through learned patterns of tree species' fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained from TLS, we deconstructed the point clouds of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions.
We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate the full 3D tree point clouds (the truth values are the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the measurement of error from the original TLS point clouds and through the error in extraction of key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results of this research additionally demonstrate the supplemental performance gain of using minimal locally sourced bio-inventory metric information as an input in ML systems to reach specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry
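The deconstruction step, deriving coarser ALS- and satellite-like clouds from a dense TLS cloud, can be approximated by voxel-grid downsampling. This is one plausible simulation of sensor resolution, not necessarily the authors' procedure; the voxel sizes and synthetic cloud below are assumptions.

```python
def voxel_downsample(points, voxel_size):
    """Keep one representative (centroid) point per occupied voxel,
    mimicking how a coarser sensor sees a dense TLS point cloud."""
    buckets = {}
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets.setdefault(key, []).append((x, y, z))
    return [tuple(sum(c) / len(pts) for c in zip(*pts))
            for pts in buckets.values()]

# Dense synthetic "TLS" cloud: a slanted stem plus a horizontal branch
tls = [(0.01 * i, 0.0, 0.05 * i) for i in range(200)] + \
      [(0.5, 0.02 * i, 5.0) for i in range(100)]
als_like = voxel_downsample(tls, voxel_size=0.5)   # aerial-scan resolution
sat_like = voxel_downsample(tls, voxel_size=2.0)   # satellite resolution
print(len(tls), len(als_like), len(sat_like))
```

Pairing each downsampled cloud with its full-resolution TLS original yields exactly the (input, truth) training pairs the reconstruction network needs.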
Procedia PDF Downloads 102
9218 Deep Learning Approach for Colorectal Cancer’s Automatic Tumor Grading on Whole Slide Images
Authors: Shenlun Chen, Leonard Wee
Abstract:
Tumor grading is an essential reference for colorectal cancer (CRC) staging and survival prognostication. The widely used World Health Organization (WHO) grading system defines the histological grade of CRC adenocarcinoma based on the density of glandular formation on whole slide images (WSI). Tumors are classified as well-, moderately-, poorly- or un-differentiated depending on the percentage of the tumor that is gland-forming: >95%, 50-95%, 5-50% and <5%, respectively. However, manually grading WSIs is a time-consuming process and can cause observer error due to subjective judgment and unnoticed regions. Furthermore, pathologists’ grading is usually coarse, while a finer, continuous differentiation grade may help to stratify CRC patients better. In this study, a deep learning based automatic differentiation grading algorithm was developed and evaluated by survival analysis. Firstly, a gland segmentation model was developed for segmenting gland structures. Gland regions of WSIs were delineated and used for differentiation annotation. Tumor regions were annotated by experienced pathologists as high-, medium-, low-differentiation and normal tissue, corresponding to tumor with clear, unclear, and no gland structure, and non-tumor, respectively. A differentiation prediction model was then developed on these human annotations. Finally, all enrolled WSIs were processed by the gland segmentation model and the differentiation prediction model. The differentiation grade can be calculated from the deep learning models’ predictions of tumor regions and tumor differentiation status according to the WHO definitions. If a patient had multiple WSIs, the highest differentiation grade was chosen. Additionally, the differentiation grade was normalized to a scale between 0 and 1. The Cancer Genome Atlas colon adenocarcinoma (TCGA-COAD) cohort was enrolled in this study.
For the gland segmentation model, the area under the receiver operating characteristic curve (ROC) reached 0.981 and accuracy reached 0.932 in the validation set. For the differentiation prediction model, ROC reached 0.983, 0.963, 0.963, 0.981 and accuracy reached 0.880, 0.923, 0.668, 0.881 for the low-, medium-, high-differentiation and normal tissue groups in the validation set. Four hundred and one patients were selected after removing WSIs without gland regions and patients without follow-up data. The concordance index reached 0.609. An optimized cut-off point of 51% was found by the “maxstat” method, which is almost the same as the WHO system’s cut-off point of 50%. Both the WHO system’s cut-off point and the optimized cut-off point performed impressively in Kaplan-Meier curves, and the log-rank test p values for both were below 0.005. In this study, the gland structure of WSIs and the differentiation status of tumor regions were proven to be predictable through deep learning methods. A finer, continuous differentiation grade can also be automatically calculated through the above models. The differentiation grade was proven to stratify CRC patients well in survival analysis, and its optimized cut-off point was almost the same as that of the WHO tumor grading system. A tool that automatically calculates differentiation grade may show potential in the fields of therapy decision making and personalized treatment.
Keywords: colorectal cancer, differentiation, survival analysis, tumor grading
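The WHO thresholds quoted in the abstract map directly to a small lookup, and one natural 0-1 normalization of the continuous grade is shown alongside. The categorical thresholds come from the abstract; the `normalized_grade` formula is an assumption, since the paper's exact normalization is not stated.

```python
def who_grade(gland_forming_pct):
    """WHO differentiation grade of CRC adenocarcinoma from the
    percentage of tumor that is gland-forming (thresholds from the
    abstract: >95, 50-95, 5-50, <5)."""
    if gland_forming_pct > 95:
        return "well-differentiated"
    if gland_forming_pct >= 50:
        return "moderately-differentiated"
    if gland_forming_pct >= 5:
        return "poorly-differentiated"
    return "undifferentiated"

def normalized_grade(gland_forming_pct):
    """Continuous grade on a 0-1 scale (0 = fully gland-forming,
    1 = no gland formation); one plausible normalization, assumed here."""
    return 1.0 - gland_forming_pct / 100.0

for pct in (98, 70, 20, 3):
    print(pct, who_grade(pct), round(normalized_grade(pct), 2))
```

A survival analysis can then threshold the continuous grade (e.g. at the 51% maxstat cut-off) to form the two Kaplan-Meier groups.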
Procedia PDF Downloads 134
9217 Yawning and Cortisol as a Potential Biomarker for Early Detection of Multiple Sclerosis
Authors: Simon B. N. Thompson
Abstract:
Cortisol is essential to the regulation of the immune system, and yawning is a pathological symptom of multiple sclerosis (MS). Electromyography (EMG) activity in the jaw muscles typically rises when the muscles are moved and, with yawning, is highly correlated with cortisol levels in healthy people. Saliva samples from 59 participants were collected at the start and after yawning, or at the end of the presentation of yawning-provoking stimuli in the absence of a yawn, together with EMG data and questionnaire data: Hospital Anxiety and Depression Scale, Yawning Susceptibility Scale, General Health Questionnaire, and demographic and health details. Exclusion criteria were: chronic fatigue, diabetes, fibromyalgia, heart condition, high blood pressure, hormone replacement therapy, multiple sclerosis, and stroke. Significant differences were found between the saliva cortisol samples for the yawners, t(23) = -4.263, p < 0.001, whereas the difference for the non-yawners between rest and post-stimuli was non-significant. Significant evidence was found to support the Thompson Cortisol Hypothesis, suggesting that rises in cortisol levels are associated with yawning. Further research is exploring the use of cortisol as an early diagnostic tool for MS. Ethics approval was granted, and professional code of conduct, confidentiality, and safety issues are addressed therein.
Keywords: cortisol, multiple sclerosis, yawning, Thompson Cortisol Hypothesis
Procedia PDF Downloads 376
9216 Temporally Coherent 3D Animation Reconstruction from RGB-D Video Data
Authors: Salam Khalifa, Naveed Ahmed
Abstract:
We present a new method to reconstruct a temporally coherent 3D animation from single- or multi-view RGB-D video data using unbiased feature point sampling. Given RGB-D video data in the form of a 3D point cloud sequence, our method first extracts feature points using both color and depth information. In subsequent steps, these feature points are used to match two 3D point clouds in consecutive frames, independent of their resolution. Our new motion-vector-based dynamic alignment method then fully reconstructs a spatio-temporally coherent 3D animation. We perform extensive quantitative validation using novel error functions to analyze the results. We show that, despite the limiting factors of temporal and spatial noise associated with RGB-D data, it is possible to exploit temporal coherence to faithfully reconstruct a temporally coherent 3D animation from RGB-D video data.
Keywords: 3D video, 3D animation, RGB-D video, temporally coherent 3D animation
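The frame-to-frame matching step can be sketched as nearest-neighbour search over a joint color + depth descriptor, which is resolution-independent in the way the abstract describes. The descriptor, weights, and toy frames below are illustrative assumptions, not the paper's actual features.

```python
from math import sqrt

def match_features(frame_a, frame_b, w_color=1.0, w_depth=2.0):
    """Match each feature point in frame_a to its nearest neighbour in
    frame_b using a weighted color + depth descriptor distance."""
    def dist(p, q):
        dc = sqrt(sum((a - b) ** 2 for a, b in zip(p["rgb"], q["rgb"])))
        dd = abs(p["depth"] - q["depth"])
        return w_color * dc + w_depth * dd
    return {i: min(range(len(frame_b)), key=lambda j: dist(p, frame_b[j]))
            for i, p in enumerate(frame_a)}

# Two consecutive toy frames: the same three surface points, slightly moved
frame1 = [{"rgb": (200, 30, 30), "depth": 1.10},
          {"rgb": (30, 200, 30), "depth": 2.50},
          {"rgb": (30, 30, 200), "depth": 0.80}]
frame2 = [{"rgb": (32, 198, 31), "depth": 2.48},   # green point, moved a bit
          {"rgb": (198, 33, 29), "depth": 1.12},   # red point
          {"rgb": (29, 31, 202), "depth": 0.79}]   # blue point
print(match_features(frame1, frame2))
```

Each matched pair yields a motion vector (the 3D displacement between frames), which is the raw material for the dynamic alignment step.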
Procedia PDF Downloads 373
9215 Estimating 3D-Position of a Stationary Random Acoustic Source Using Bispectral Analysis of 4-Point Detected Signals
Authors: Katsumi Hirata
Abstract:
To develop a useful acoustic environment recognition system, a method of estimating the 3D position of a stationary random acoustic source using bispectral analysis of signals detected at four points is proposed. The method uses information about amplitude attenuation and propagation delay extracted from the amplitude ratios and angles of the auto- and cross-bispectra of the detected signals. Bispectral analysis is expected to be less affected by Gaussian noise than conventional power spectral analysis. In this paper, the basic principle of the method is presented first, and its validity and features are considered based on the results of fundamental experiments under assumed ideal circumstances.
Keywords: 4-point detection, stationary random acoustic source, auto- and cross-bispectra, estimation of 3D position
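The Gaussian-noise suppression the abstract relies on can be illustrated with a direct, segment-averaged bispectrum estimate: the bispectrum of Gaussian noise averages toward zero, while phase-coupled components leave a strong peak. The signals, segment length, and frequencies below are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def bispectrum(x, seg_len=64):
    """Segment-averaged direct estimate of the bispectrum
    B(f1, f2) = E[X(f1) X(f2) conj(X(f1 + f2))]."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    n = seg_len // 2  # keep non-negative frequencies below Nyquist
    acc = np.zeros((n, n), dtype=complex)
    for s in segs:
        xf = np.fft.fft(s)
        for f1 in range(n):
            for f2 in range(n):
                acc[f1, f2] += xf[f1] * xf[f2] * np.conj(xf[f1 + f2])
    return acc / len(segs)

rng = np.random.default_rng(0)
t = np.arange(4096)
# Quadratically phase-coupled tones at bins 8, 12 and 8+12=20 of a
# 64-sample segment, vs. pure Gaussian noise
coupled = (np.cos(2 * np.pi * 8 / 64 * t) + np.cos(2 * np.pi * 12 / 64 * t)
           + np.cos(2 * np.pi * 20 / 64 * t))
noise = rng.standard_normal(t.size)
b_sig = np.abs(bispectrum(coupled)).max()
b_noise = np.abs(bispectrum(noise)).max()
print(b_sig, b_noise)
```

The amplitude ratios and phase angles of such bispectral peaks across the four sensors are what carry the attenuation and delay information used for 3D localization.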
Procedia PDF Downloads 359
9214 Applied Transdisciplinary Undergraduate Research in Costa Rica: Five Weeks Faculty-Led Study Abroad Model
Authors: Sara Shuger Fox, Oscar Reynaga
Abstract:
This session explains the process and lessons learned as Central College (USA) faculty and staff developed undergraduate research opportunities within the model of a short-term faculty-led study abroad program in Costa Rica. The program in Costa Rica increases access to research opportunities across the disciplines and was developed by faculty from English, Biology, and Exercise Science. Session attendees will benefit from learning how faculty and staff navigated the program proposal process at a small liberal arts college and, in particular, how the program was built to be inclusive of departments with lower enrollment, like those currently seen in the humanities. Vital to this last point, presenters will explain how they negotiated issues of research supervision and disciplinary authority in such a way that the program is open to students from multiple disciplines without forcing the program budget to absorb costs for multiple faculty supervisors traveling and living in-country. Additionally, session attendees will learn how scouting laid the groundwork for mutually beneficial relationships between the program and the communities with which it collaborates. Presenters will explain how they built a coalition of students, faculty advisors, study abroad staff, and local research hosts to support the development of research questions that are of value not just to the students but to the community in which the research takes place. This program also incorporates principles of fair-trade learning by intentionally reporting research findings to local community members, as well as encouraging students to proactively share their research as a way to connect with local people.
Keywords: Costa Rica, research, sustainability, transdisciplinary
Procedia PDF Downloads 1060
9213 Studies on the Physicochemical Properties of Biolubricants Obtained from Vegetable Oils and Their Oxidative Stability
Authors: Expedito J. S. Parente Jr., Italo C. Rios, Joao Paulo C. Marques, Rosana M. A. Saboya, F. Murilo T. Luna, Célio L. Cavalcante Jr.
Abstract:
Increasing constraints from environmental regulation around the world have led to higher demand for biodegradable products. Vegetable oils present some properties that may favor their use as biolubricants; however, there are others, such as resistance to oxidation and pour point, which limit possible commercial applications. In this study, the physicochemical properties of biolubricants synthesized from different vegetable oils were evaluated and compared with a petroleum-based lubricant and pure vegetable oil. Chemical modifications applied to the original vegetable oils significantly improved their oxidative stability and pour point. The addition of commercial antioxidants to the bio-based lubricants was evaluated, yielding oxidative stability values close to those of mineral basestock oil.
Keywords: biolubricant, vegetable oil, oxidative stability, pour point, antioxidants
Procedia PDF Downloads 312
9212 Fault Prognostic and Prediction Based on the Importance Degree of Test Point
Authors: Junfeng Yan, Wenkui Hou
Abstract:
Prognostics and Health Management (PHM) is a technology for monitoring equipment status and predicting impending faults. It is used to predict potential faults, provide fault information, and track trends of system degradation by capturing characteristic signals, so detecting characteristic signals is very important. The selection of test points plays a very important role in detecting characteristic signals. Traditionally, a dependency model is used to select the test points containing the most detection information. However, for large, complicated systems, the dependency model is sometimes not easily built, and the greater difficulty lies in calculating the matrix. On this premise, this paper provides a highly effective method of selecting test points without a dependency model. The signal flow model is a diagnosis model based on failure modes, which focuses on the system’s failure modes and the dependency relationships between test points and faults. In the signal flow model, fault information can flow from the beginning to the end. According to the signal flow model, we can find the location and structure information of every test point and module. We break the signal flow model up into serial and parallel parts to obtain the final relationship function between the system’s testability or prediction metrics and the test points. Further, through partial differentiation, we obtain every test point’s importance degree in determining the testability metrics, such as the undetected rate, false alarm rate, and untrusted rate. This contributes to placing test points according to real requirements and also provides a solid foundation for Prognostics and Health Management. Judging by its effect in practical engineering applications, the method is very efficient.
Keywords: false alarm rate, importance degree, signal flow model, undetected rate, untrusted rate
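For the serial part of such a decomposition, the partial-derivative idea can be sketched directly: if a fault escapes only when every test point along the path misses it, the undetected rate is a product, and each test point's importance degree is the magnitude of the partial derivative with respect to its detection probability. This is a simplified stand-in for the paper's relationship functions; the detection probabilities are hypothetical.

```python
def undetected_rate(detect_probs):
    """Serial chain: a fault goes undetected only if every test point
    along the path misses it, so UR = prod_k (1 - d_k)."""
    ur = 1.0
    for d in detect_probs:
        ur *= (1.0 - d)
    return ur

def importance_degrees(detect_probs):
    """Importance of test point k = |dUR/dd_k| = prod_{j != k} (1 - d_j):
    how strongly improving that test point lowers the undetected rate."""
    return [undetected_rate(detect_probs[:k] + detect_probs[k + 1:])
            for k in range(len(detect_probs))]

# Hypothetical detection probabilities of three candidate test points
probs = [0.9, 0.5, 0.2]
print(undetected_rate(probs))     # = 0.1 * 0.5 * 0.8
print(importance_degrees(probs))  # [(1-.5)(1-.2), (1-.9)(1-.2), (1-.9)(1-.5)]
```

Ranking test points by importance degree then tells the engineer where an additional or better sensor pays off most, without ever forming the full dependency matrix.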
Procedia PDF Downloads 377
9211 The Roman Fora in North Africa Towards a Supportive Protocol to the Decision for the Morphological Restitution
Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda
Abstract:
This research delves into the fundamental question of the morphological restitution of built archaeology in order to place it in its paradigmatic context and to seek answers to it. Indeed, understanding the object of study, analysing it, and devising a methodology for solving the morphological problem posed are manageable only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in the field, ranging from the use of new tools to the integration of archaeological information systems, where urbanization involves the interplay of several disciplines. The built archaeological object is also an architectural and morphological object: a set of articulated elementary data whose understanding can be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the exchange between the different disciplines uses the capacity of each to frame reflection on the incomplete elements of a given architecture, or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base, which contains the set of rules for the architectural construction of the object. The knowledge base, built from the archaeological literature, also provides a reference that enters into the search for forms and articulations. The choice of the Roman forum in North Africa is justified by the great urban and architectural characteristics of this entity. Research on the forum involves a fairly large knowledge base and also provides the researcher with material to study, from a morphological and architectural point of view, from the scale of the city down to the architectural detail.
The experimentation of the knowledge deduced at the paradigmatic level, as well as the deduction of an analysis model, is then carried out on the basis of a well-defined context, which frames the experimentation from the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence has allowed us first to question the aspects already known in order to measure the credibility of our system, which remains above all a decision support tool for the morphological restitution of Roman fora in North Africa. This paper presents a first experimentation of the model elaborated during this research, a model framed by a paradigmatic discussion, thus positioning the research in relation to existing paradigmatic and experimental knowledge on the issue.
Keywords: classical reasoning, logicist reasoning, archaeology, architecture, roman forum, morphology, calculation
Procedia PDF Downloads 147
9210 Improvement Perturb and Observe for a Fast Response MPPT Applied to Photovoltaic Panel
Authors: Labar Hocine, Kelaiaia Mounia Samira, Mesbah Tarek, Kelaiaia Samia
Abstract:
Maximum power point tracking (MPPT) techniques are used in photovoltaic (PV) systems to maximize the PV array output power by continuously tracking the maximum power point (MPP), which depends on panel temperature and irradiance conditions. The main drawback of P&O is that the operating point oscillates around the MPP, giving rise to the waste of some amount of available energy; moreover, it is well known that the P&O algorithm can be confused during time intervals characterized by rapidly changing atmospheric conditions. In this paper, it is shown that, in order to limit the negative effects associated with the above drawbacks, the P&O MPPT parameters must be customized to the dynamic behavior of the specific converter adopted. A theoretical analysis allowing the optimal choice of such initial parameters is also carried out. The fast convergence of the proposal is proven.
Keywords: P&O, Taylor’s series, MPPT, photovoltaic panel
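The baseline P&O loop being improved here, and the oscillation around the MPP it is criticized for, can be sketched in a few lines. The PV curve and its parameters are illustrative assumptions, not the paper's panel or converter model, and no converter dynamics are simulated.

```python
import math

def pv_power(v, iph=5.0, i0=1e-9, vt=1.0):
    """Illustrative PV panel: P = V * (Iph - I0*(exp(V/Vt) - 1)).
    Parameters are invented, not from the paper."""
    return v * (iph - i0 * (math.exp(v / vt) - 1.0))

def perturb_and_observe(v=10.0, step=0.1, iters=300):
    """Classic P&O: perturb the operating voltage, keep the direction
    that increased power, otherwise reverse it."""
    p_prev = pv_power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power dropped: we stepped past the MPP
            direction = -direction
        p_prev = p
    return v

v_po = perturb_and_observe()
# Brute-force reference MPP on a fine voltage grid, for comparison
v_ref = max((k / 100 for k in range(1, 2300)), key=pv_power)
print(f"P&O settled near {v_po:.2f} V, reference MPP at {v_ref:.2f} V")
```

The tracker never settles exactly: it hops back and forth around the MPP by about the perturbation step, which is the energy-wasting oscillation the paper's customized parameter choice is designed to limit.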
Procedia PDF Downloads 587
9209 Holographic Visualisation of 3D Point Clouds in Real-time Measurements: A Proof of Concept Study
Authors: Henrique Fernandes, Sofia Catalucci, Richard Leach, Kapil Sugand
Abstract:
Background: Holograms are 3D images formed by the interference of light beams from a laser or other coherent light source. Pepper’s ghost is a form of holographic illusion conceptualised in the 19th century. Combining holographic visualisation with metrology measuring techniques, by displaying measurements taken in real time in holographic form, can assist research and education. New structural designs such as the Plexiglass Stand and the Hologram Box can optimise the holographic experience. Method: The equipment used included: (i) Zeiss’s ATOS Core 300 optical coordinate measuring instrument, which scanned real-world objects; (ii) CloudCompare, open-source software used for point cloud processing; and (iii) the Hologram Box, designed and manufactured during this research to provide the blackout environment needed to display 3D point clouds of real-time measurements in holographic format, in addition to giving holograms a degree of portability. The equipment was tailored to realise the goal of displaying measurements with an innovative technique and to improve on conventional methods. Three test scans were completed before the holographic conversion. Results: The outcome was a precise recreation of the original object in holographic form, presented with dense point clouds and surface density features in a colour map. Conclusion: This work establishes a way to visualise data from a point cloud system. To our understanding, this has never been attempted before. This achievement provides an advancement in holographic visualisation. The Hologram Box could be used as a feedback tool for measurement quality control and verification in future smart factories.
Keywords: holography, 3D scans, hologram box, metrology, point cloud
Procedia PDF Downloads 89
9208 Finite Element Modeling of Ultrasonic Shot Peening Process Using Multiple Pin Impacts
Authors: Chao-xun Liu, Shi-hong Lu
Abstract:
In spite of its importance to the aerospace and automobile industries, little or no attention has been devoted to accurate modeling of the ultrasonic shot peening (USP) process. The purpose of this study is therefore to conduct a finite element analysis of the process using a realistic multiple-pin-impact model with the explicit solver of ABAQUS. In this paper, the effects of several key parameters on the residual stress distribution within the target are investigated, including impact velocity, incident angle, friction coefficient between pins and target, and number of impacts. The results reveal that the impact velocity and the number of impacts have an obvious effect, and that impacting vertically produces the most favourable residual stress distribution. The results are then compared with data from USP experiments to verify the accuracy of the model. The analysis of the multiple-pin-impact data reveals the relationships between peening process parameters and peening quality, which are useful for identifying the parameters that need to be controlled and regulated in order to produce a more beneficial compressive residual stress distribution within the target.
Keywords: ultrasonic shot peening, finite element, multiple pins, residual stress, numerical simulation
Procedia PDF Downloads 448
9207 Modification of Newton Method in Two Point Block Backward Differentiation Formulas
Authors: Khairil I. Othman, Nur N. Kamal, Zarina B. Ibrahim
Abstract:
In this paper, we present a modified Newton method as a new strategy for improving the efficiency of Two Point Block Backward Differentiation Formulas (BBDF) when solving stiff systems of ordinary differential equations (ODEs). These methods are constructed to produce two approximate solutions simultaneously at each iteration. The detailed implementation of the predictor-corrector BBDF in PE(CE)2 mode with the modified Newton method is discussed. The proposed modification of BBDF is validated through numerical results on some standard problems found in the literature, and comparisons are made with the existing Block Backward Differentiation Formula. Numerical results show the advantage of using the new strategy for solving stiff ODEs in improving the accuracy of the solution.
Keywords: Newton method, two point, block, accuracy
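The key idea of a modified Newton iteration, freezing the iteration slope (Jacobian) instead of re-evaluating it at every iterate, can be sketched on the simplest implicit method, backward Euler, applied to a scalar stiff ODE. This is a hedged stand-in for the paper's two-point BBDF corrector, not its actual formulas.

```python
def solve_step_modified_newton(f, dfdy, y_prev, h, tol=1e-12, max_iter=50):
    """One backward-Euler step y = y_prev + h*f(y), solved with a
    *modified* Newton iteration: the Jacobian is evaluated once at the
    predictor and frozen, rather than recomputed every iteration.
    Backward Euler stands in for the block BDF corrector."""
    y = y_prev                      # predictor: previous solution value
    slope = 1.0 - h * dfdy(y)       # frozen derivative of the residual
    for _ in range(max_iter):
        residual = y - y_prev - h * f(y)
        delta = residual / slope
        y -= delta
        if abs(delta) < tol:
            break
    return y

# stiff test problem y' = -50*y, integrated from y(0) = 1 to t = 0.1
f = lambda y: -50.0 * y
dfdy = lambda y: -50.0
y, h = 1.0, 0.01
for _ in range(10):
    y = solve_step_modified_newton(f, dfdy, y, h)
```

For this linear test problem the frozen slope is exact, so each step converges in essentially one iteration; the saving of the modified scheme shows up on nonlinear systems, where re-factorising the Jacobian every iteration dominates the cost.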
Procedia PDF Downloads 357
9206 Classification of Regional Innovation Types and Region-Based Innovation Policies
Authors: Seongho Han, Dongkwan Kim
Abstract:
The focus of regional innovation policies is shifting from the central government to local governments. The central government demands that regions enforce autonomous and responsible regional innovation policies, and that regional governments seek innovation policies fit for regional characteristics. However, the central government and local governments have not yet arrived at a conclusion on what innovation policies are appropriate for regional circumstances. In particular, even when a local government tries to find regional innovation strategies based on the needs of its region, its strategies turn out to be similar to those of other regions. This leads to inefficiency not only at the national level, but also at the regional level. Existing research on regional innovation types points out that there are remarkable differences in the types and characteristics of innovation among the regions of a nation. In addition, it implies that no innovation output can be expected when policies are enforced in disregard of such differences. This means that it is undesirable to enforce regional innovation policies under a single standard. Given this problem, this research aims to identify the characteristics of and differences in innovation types among the regions of Korea, and suggests appropriate policy implications by classifying these characteristics and differences. To this end, regions were classified using the various indicators of innovation suggested by existing related research, and policies were illustrated based on the resulting characteristics and differences. The research used recent data, mainly from 2012, and as a methodology, clustering analysis based on multiple factor analysis was applied.
Supplementary research on dynamically analyzing the stability of regional innovation types, establishing systematic indicators based on regional innovation theory, and developing additional indicators is needed in the future.
Keywords: regional innovation policy, regional innovation type, region-based innovation, multiple factor analysis, clustering analysis
Procedia PDF Downloads 475
9205 Evolution of Relations among Multiple Institutional Logics: A Case Study from a Higher Education Institution
Authors: Ye Jiang
Abstract:
To examine how the relationships among multiple institutional logics vary over time, and the factors that may impact this process, we conducted a 15-year in-depth longitudinal case study of a higher education institution, examining its exploration in college student management. By employing constructivist grounded theory, we developed a four-stage process model, comprising separation, formalization, selective bridging, and embeddedness, that shows how two contradictory logics become complementary and finally merge into a new hybridized logic. We argue that selective bridging is an important step in changing inter-logic relations. We also found that ambidextrous leadership and situational sensemaking are two key factors that drive this process. Our contribution to the literature is threefold. First, we enhance the literature on the changing relationships among multiple institutional logics, and our findings advance the understanding of relationships between multiple logics through a dynamic view. While most studies have tended to assume that the relationship among logics is static and persistently contentious, we contend that the relationships among multiple institutional logics can change over time: competing logics can become complementary, and a new hybridized logic can emerge therefrom. The four-stage logic hybridization process model offers insights into the logic hybridization process, which is underexplored in the literature. Second, our research reveals that selective bridging is important in making conflicting logics compatible, and thus constitutes a key step in creating new hybridized logic dynamics. Our findings suggest that the relations between multiple logics are manageable and can thus be shaped for organizational innovation.
Finally, the factors influencing the variations in inter-logic relations enrich the understanding of the antecedents of these dynamics.
Keywords: institutional theory, institutional logics, ambidextrous leadership, situational sensemaking
Procedia PDF Downloads 165
9204 Multiple-Channel Coulter Counter for Cell Sizing and Enumeration
Authors: Yu Chen, Seong-Jin Kim, Jaehoon Chung
Abstract:
High-throughput cell counting and sizing are often required for biomedical applications. Here we report the design, fabrication, and validation of a micro-machined Coulter counter device with multiple channels that realizes such applications at low cost. Multiple vertical through-holes were fabricated on a silicon chip and combined with a PDMS microfluidic channel that serves as the sensing channel. To avoid the crosstalk introduced by the shared electrical connection, the potential of each channel is monitored instead of measuring the current passing through it, which makes high throughput possible. A peak in the output potential is captured when a cell or particle passes through the microhole. The device was validated by counting and sizing polystyrene beads with diameters of 6 μm, 10 μm, and 15 μm. With the sampling frequency set at 100 kHz, up to 5000 counts/sec can be realized for each channel. The counting and enumeration of MCF7 cancer cells are also demonstrated.
Keywords: Coulter counter, cell enumeration, high throughput, cell sizing
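The per-channel signal processing described here, a peak in the output potential for each particle transit, amounts to simple pulse detection. A minimal sketch follows, with a synthetic trace standing in for real electrode data; the baseline, threshold, and pulse shapes are assumptions, not the device's measured values.

```python
import random

def detect_pulses(signal, baseline, threshold):
    """Count and size particles from one channel's potential trace.
    Each excursion above baseline + threshold is one particle; the
    pulse height scales with particle volume, so it can be mapped to
    a size bin."""
    pulses, in_pulse, peak = [], False, 0.0
    for sample in signal:
        height = sample - baseline
        if height > threshold:
            in_pulse, peak = True, max(peak, height)
        elif in_pulse:            # pulse just ended: record its height
            pulses.append(peak)
            in_pulse, peak = False, 0.0
    return pulses

# synthetic trace: baseline noise plus three pulses of growing height,
# mimicking three bead sizes passing the microhole
rng = random.Random(0)
trace = [0.5 + rng.uniform(-0.05, 0.05) for _ in range(300)]
for start, height in [(50, 1.0), (150, 2.0), (250, 3.0)]:
    for i in range(start, start + 5):
        trace[i] += height

pulses = detect_pulses(trace, baseline=0.5, threshold=0.5)
```

At the reported 100 kHz sampling rate, a loop of this kind per channel is cheap enough to sustain thousands of counts per second.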
Procedia PDF Downloads 610
9203 Correlation between Potential Intelligence Explanatory Study in the Perspective of Multiple Intelligence Theory by Using Dermatoglyphics and Culture Approaches
Authors: Efnie Indrianie
Abstract:
Potential intelligence constitutes one essential factor in every individual. This intelligence can form a basis for the development of performance intelligence if it is supported by the surrounding environment. Fingerprint analysis is a method for recognizing this potential intelligence. The method is grounded on the pattern and number of fingerprint ridge lines, which are assumed to correspond to the number of nerves in the brain, where each area has its own function. These brain functions are then mapped onto intelligence components in accordance with the theory of multiple intelligences. This research tested the correlation between potential intelligence and the components of performance intelligence. Statistical tests using Pearson correlation showed that five components of potential intelligence correlated with performance intelligence: logical-mathematical, linguistic, musical, kinesthetic, and intrapersonal. The research also indicated that cultural factors play a large role in shaping intelligence.
Keywords: potential intelligence, performance intelligence, multiple intelligences, fingerprint, environment, brain
Procedia PDF Downloads 535
9202 The Orthodontic Management of Multiple Tooth Agenesis with Macroglossia in Adult Patient: Case Report
Authors: Yanuarti Retnaningrum, Cendrawasih A. Farmasyanti, Kuswahyuning
Abstract:
Orthodontists face challenges in treating patients with macroglossia and multiple tooth agenesis because of the difficulty in determining the causes, formulating a diagnosis, and managing the potential for relapse after treatment. Macroglossia is defined as tongue enlargement due to muscle hypertrophy, tumor, or an endocrine disturbance. Macroglossia may cause many problems, such as anterior proclination of the upper and lower incisors, development of generalized diastema, and anterior and/or posterior open bite. Treatment for patients with multiple tooth agenesis and macroglossia can be complex and must consider orthodontic and/or surgical interventions. This article discusses a non-surgical orthodontic approach to a patient with generalized diastema in both the maxilla and the mandible, associated with multiple tooth agenesis and macroglossia. Fixed orthodontic therapy with a straightwire appliance was used for space closure in the anterior region of the maxilla and mandible, and to create space suitable for future prosthetic restoration. After 12 months of treatment, a stable and functional occlusal relationship was achieved, although edentulous areas remained in both the maxilla and the mandible. Correct overbite and overjet values were obtained at the end of the orthodontic treatment. After removal of the brackets, maxillary and mandibular removable retainers combined with artificial teeth were placed for retention.
Keywords: general diastema, macroglossia, space closure, tooth agenesis
Procedia PDF Downloads 177
9201 Public Wi-Fi Security Threat Evil Twin Attack Detection Based on Signal Variant and Hop Count
Authors: Said Abdul Ahad Ahadi, Elyas Baray, Nitin Rakesh, Sudeep Varshney
Abstract:
Wi-Fi is a widely used technology for providing internet access in many areas, such as stores, cafes, university campuses, and restaurants. This technology has brought great convenience in communication and networking. On the other hand, because data is transmitted over the air, the network is vulnerable and prone to various threats, such as the evil twin attack. An evil twin is an adversary that impersonates a legitimate access point (LAP) by spoofing its network name (SSID) and MAC address (BSSID). This attack can enable further threats such as man-in-the-middle (MITM) attacks, service interruption, and access point service blocking. Various evil twin attack detection techniques have been proposed, but they require additional hardware or protocol modification. In this paper, we propose a new technique based on two access point fingerprints that are hard for an adversary to copy: the received signal strength indicator (RSSI) and the hop count. We implemented the technique in a system called “ETDetector,” which can detect and prevent the attack.
Keywords: evil twin, LAP, SSID, Wi-Fi security, signal variation, ETAD, Kali Linux, Scapy, Python
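The two-fingerprint check can be sketched as follows; the whitelist profile, thresholds, and function names are hypothetical illustrations of the idea, not the ETDetector implementation.

```python
import statistics

# hypothetical whitelist profile recorded for the legitimate AP (LAP)
LEGITIMATE = {"ssid": "CampusWiFi", "bssid": "aa:bb:cc:dd:ee:ff",
              "mean_rssi": -48.0, "hop_count": 3}

def looks_like_evil_twin(ssid, bssid, rssi_samples, hop_count,
                         rssi_margin=10.0, ref=LEGITIMATE):
    """Flag an AP whose SSID/BSSID match the whitelist entry but whose
    two hard-to-forge fingerprints disagree: the RSSI profile and the
    hop count to a reference server.  The 10 dB margin is an assumed
    threshold for illustration."""
    if ssid != ref["ssid"] or bssid != ref["bssid"]:
        return False                  # different network, not a twin
    mean_rssi = statistics.mean(rssi_samples)
    rssi_suspicious = abs(mean_rssi - ref["mean_rssi"]) > rssi_margin
    hop_suspicious = hop_count != ref["hop_count"]
    return rssi_suspicious or hop_suspicious

# an attacker sitting close to the victim: same SSID/BSSID, but a much
# stronger signal and one extra hop (victim -> rogue AP -> real network)
flag = looks_like_evil_twin("CampusWiFi", "aa:bb:cc:dd:ee:ff",
                            [-30, -31, -29, -32], hop_count=4)
```

The extra hop arises because a rogue AP typically relays traffic through the legitimate network, which is why hop count is a useful second fingerprint alongside signal variation.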
Procedia PDF Downloads 143
9200 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation
Authors: Sopheak Sorn, Kwok Yip Szeto
Abstract:
Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), computing the reliability of the system becomes a network design problem that is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction, with each cell containing exactly one point of the pattern, and compute the reliability of the Voronoi diagram's dual, i.e. the Delaunay graph. We further investigate the communicability of the Delaunay network. We find that the homogeneity of a Delaunay graph's degree distribution is positively correlated with its reliability and negatively correlated with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network toward the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi
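The reliability measure used here, the probability that the network stays connected when links fail independently, can be estimated by Monte Carlo sampling when exact enumeration is infeasible. A minimal stdlib sketch on a toy 4-node cycle, standing in for a Delaunay graph; the link survival probability and trial count are illustrative assumptions.

```python
import random

def is_connected(nodes, edges):
    """Breadth-first check that the surviving edges connect all nodes."""
    if not nodes:
        return True
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, frontier = {nodes[0]}, [nodes[0]]
    while frontier:
        n = frontier.pop()
        for m in adj[n]:
            if m not in seen:
                seen.add(m)
                frontier.append(m)
    return seen == set(nodes)

def reliability(nodes, edges, p_link=0.9, trials=2000, seed=1):
    """All-terminal reliability: each link survives independently with
    probability p_link; estimate the probability that the survivors
    keep the whole network connected."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        alive = [e for e in edges if rng.random() < p_link]
        ok += is_connected(nodes, alive)
    return ok / trials

# a 4-node cycle as a toy stand-in for a Delaunay graph
nodes = [0, 1, 2, 3]
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
r = reliability(nodes, cycle)
```

For this cycle the exact value is p⁴ + 4p³(1 − p) ≈ 0.948 at p = 0.9 (connected iff at most one edge fails), so the Monte Carlo estimate can be checked directly; on a full Delaunay graph the same estimator would be evaluated before and after each candidate edge flip.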
Procedia PDF Downloads 419
9199 Optimal Location of the I/O Point in the Parking System
Authors: Jing Zhang, Jie Chen
Abstract:
In this paper, we deal with the optimal I/O point location in an automated parking system. In this system, the S/R machine (storage and retrieval machine) travels independently in the vertical and horizontal directions. Based on the characteristics of the parking system and the basic principles of AS/RS (automated storage and retrieval systems), we obtain a continuous model in units of time. For the single-command cycle under the randomized storage policy, we calculate the probability density function of the system travel time and thus develop the travel time model, and we confirm that the travel time model performs well by comparison with the discrete case. Finally, we establish the optimal model by minimizing the expected travel time, and it is shown that the optimal location of the I/O point is at the middle of the upper left-hand corner.
Keywords: parking system, optimal location, response time, S/R machine
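Because the S/R machine travels in both axes simultaneously, the single-command travel time is the maximum of the two axis times, and the expected travel time can be estimated numerically for candidate I/O locations. The rack dimensions, speeds, and candidate set below are illustrative assumptions, and the paper derives the expectation analytically from the probability density function rather than by sampling, so the optimum found here need not match the paper's.

```python
import random

def expected_travel_time(io_x, io_y, width, height, vx, vy,
                         samples=4000, seed=7):
    """Monte Carlo estimate of the expected single-command travel time
    from the I/O point to a uniformly random storage slot (randomized
    storage policy).  The machine moves on both axes at once, so the
    travel time is the maximum of the two axis times."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        x, y = rng.uniform(0, width), rng.uniform(0, height)
        total += max(abs(x - io_x) / vx, abs(y - io_y) / vy)
    return total / samples

# sweep candidate I/O locations along the left edge of an assumed
# 20 m x 10 m rack with assumed axis speeds
width, height, vx, vy = 20.0, 10.0, 2.0, 1.0
candidates = [(0.0, y) for y in (0.0, 2.5, 5.0, 7.5, 10.0)]
best = min(candidates,
           key=lambda c: expected_travel_time(*c, width, height, vx, vy))
```

With these assumed numbers the sweep picks the midpoint of the left edge, consistent with the intuition that a centred I/O point minimises the dominant axis distance.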
Procedia PDF Downloads 409
9198 Neuromeningeal Cryptococcosis Revealing IgA-λ Multiple Myeloma
Authors: L. Mtibaa, N. Baccouchi, S. Hannechi, R. Abid, R. Battikh, B. Jemli
Abstract:
Cryptococcosis is an opportunistic fungal infection that is commonly associated with an immunocompromised state, especially HIV infection. Rare cases of cryptococcosis have been reported in patients with multiple myeloma (MM), all at a late stage of the disease. However, to the best of our knowledge, cryptococcosis as the inaugural manifestation revealing MM at an early stage has never been reported. We present here a case of neuromeningeal cryptococcosis in a patient without any apparent underlying conditions, in whom it revealed IgA-λ MM. Early detection and treatment of cryptococcosis are essential to reduce morbidity and achieve a better outcome.
Keywords: cryptococcosis, Cryptococcus, hematologic malignancy
Procedia PDF Downloads 163
9197 A Source Point Distribution Scheme for Wave-Body Interaction Problem
Authors: Aichun Feng, Zhi-Min Chen, Jing Tang Xing
Abstract:
A two-dimensional linear wave-body interaction problem can be solved using a desingularized integral method by placing free-surface Rankine sources over the calm water surface and satisfying boundary conditions at prescribed collocation points on the calm water surface. A new free-surface Rankine source distribution scheme, determined by the intersection points of the free surface and the body surface, is developed to reduce the numerical computation cost. Associated with this, a new treatment is given to the intersection point. The results of the present scheme are in good agreement with traditional numerical results and measurements.
Keywords: source point distribution, panel method, Rankine source, desingularized algorithm
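The desingularized idea, sources raised off the boundary with boundary conditions enforced at collocation points on it, can be sketched for a toy Dirichlet problem on a flat calm-water line. The geometry, the prescribed boundary values, and the offset distance below are assumptions for illustration; the paper's actual free-surface and body boundary conditions are not reproduced here.

```python
import math

def rankine(p, q):
    """2D Rankine source potential ln(r)/(2*pi) at field point p due
    to a unit-strength source at q."""
    r = math.hypot(p[0] - q[0], p[1] - q[1])
    return math.log(r) / (2.0 * math.pi)

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting (stdlib only)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

# collocation points on the calm water surface y = 0; desingularized
# sources raised a distance d above them, so no singularity lies on
# the boundary itself
n, d = 8, 0.5
colloc = [(i * 1.0, 0.0) for i in range(n)]
sources = [(x, d) for x, _ in colloc]
target = [math.sin(0.3 * x) for x, _ in colloc]   # assumed boundary values

A = [[rankine(p, q) for q in sources] for p in colloc]
sigma = solve(A, target)

# the reconstructed potential honours the boundary condition at each
# collocation point
phi = [sum(s * rankine(p, q) for s, q in zip(sigma, sources))
       for p in colloc]
```

Because the sources never coincide with the collocation points, the influence matrix has no singular entries and needs no special self-influence treatment, which is the practical appeal of the desingularized method.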
Procedia PDF Downloads 365