7533 A Metallography Study of Secondary A226 Aluminium Alloy Used in Automotive Industries
Authors: Lenka Hurtalová, Eva Tillová, Mária Chalupová, Juraj Belan, Milan Uhríčik
Abstract:
The secondary alloy A226 is used for many automotive castings produced by mould casting and high-pressure die-casting. This alloy has excellent castability, good mechanical properties and cost-effectiveness. The production of primary aluminium alloys is a heavy source of environmental pollution; the European Union therefore calls for emission reduction and lower energy consumption, which increases the production of recycled (secondary) aluminium cast alloys. This contribution deals with the influence of recycling on the quality of castings made from A226 in the automotive industry. The properties of castings made from the secondary aluminium alloy were compared with the required properties of primary aluminium alloys. The effect of recycling on microstructure was observed using a combination of different analytical techniques (light microscopy after black-white etching, scanning electron microscopy (SEM) after deep etching, and energy-dispersive X-ray analysis (EDX)). These techniques were used to identify the various structural parameters, which were used to compare the secondary alloy microstructure with the primary alloy microstructure.
Keywords: A226 secondary aluminium alloy, deep etching, mechanical properties, recycling foundry aluminium alloy
Procedia PDF Downloads 546
7532 Reducing Crash Risk at Intersections with Safety Improvements
Authors: Upal Barua
Abstract:
Crash risk at intersections is a critical safety issue. This paper examines the effectiveness of removing an existing off-set at an intersection by realignment in reducing crashes. The Empirical Bayes method was applied in a before-and-after study to assess the effect of this safety improvement. The Transportation Safety Improvement Program in the Austin Transportation Department completed several safety improvement projects at high-crash intersections with a view to reducing crashes. One of the common safety improvement techniques applied was the realignment of intersection approaches to remove an existing off-set. This paper illustrates how this safety improvement technique is applied at a high-crash intersection from inception to completion. It also highlights the significant crash reductions achieved by this technique, quantified with the Empirical Bayes method in a before-and-after study. The results showed that realigning intersection approaches to remove an existing off-set can reduce crashes by 53%. The paper also features the state-of-the-art techniques applied in the planning, engineering, design and construction of this safety improvement, the key factors driving its success, and the lessons learned in the process.
Keywords: crash risk, intersection, off-set, safety improvement technique, before-and-after study, empirical Bayes method
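The Empirical Bayes before-and-after logic described in this abstract can be sketched in a few lines. The crash counts, safety performance function (SPF) predictions, and overdispersion parameter below are illustrative assumptions, not values from the study:

```python
def eb_crash_reduction(obs_before, obs_after, spf_before, spf_after, k):
    """Empirical Bayes before-and-after estimate of crash reduction.

    obs_before / obs_after : observed crash counts in each period
    spf_before / spf_after : crashes predicted by a safety performance
                             function (SPF) for each period
    k                      : overdispersion parameter of the SPF
    """
    # EB weight: how much to trust the SPF prediction vs. the site's own record
    w = 1.0 / (1.0 + k * spf_before)
    expected_before = w * spf_before + (1.0 - w) * obs_before
    # Scale the EB estimate to the after period via the SPF ratio
    expected_after = expected_before * (spf_after / spf_before)
    cmf = obs_after / expected_after          # crash modification factor
    return cmf, 1.0 - cmf                     # (CMF, fractional reduction)

# Illustrative numbers: 60 observed vs. 40 SPF-predicted crashes before
# realignment, 23 observed after, k = 0.02 per predicted crash.
cmf, reduction = eb_crash_reduction(60, 23, 40.0, 40.0, 0.02)
print(f"CMF = {cmf:.2f}, crash reduction = {reduction:.0%}")  # → crash reduction = 53%
```

The EB weighting corrects for regression-to-the-mean, which is why a naive before/after ratio of observed counts would overstate the benefit at a high-crash site.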
Procedia PDF Downloads 246
7531 Electrochemical and Theoretical Quantum Approaches on the Inhibition of C1018 Carbon Steel Corrosion in Acidic Medium Containing Chloride Using Newly Synthesized Phenolic Schiff Bases Compounds
Authors: Hany M. Abd El-Lateef
Abstract:
Two novel Schiff bases, 5-bromo-2-[(E)-(pyridin-3-ylimino) methyl] phenol (HBSAP) and 5-bromo-2-[(E)-(quinolin-8-ylimino) methyl] phenol (HBSAQ), have been synthesized. They have been characterized by elemental analysis and spectroscopic techniques (UV-Vis, IR and NMR). Moreover, the molecular structures of the HBSAP and HBSAQ compounds were determined by the single-crystal X-ray diffraction technique. The inhibition activity of HBSAP and HBSAQ for carbon steel in 3.5% NaCl + 0.1 M HCl, for both short and long immersion times and at different temperatures (20-50 ºC), was investigated using electrochemistry and surface characterization. The potentiodynamic polarization shows that the inhibitor molecules are adsorbed mainly on the cathodic sites. Their efficiency increases with increasing inhibitor concentration (92.8% at the optimal concentration of 10⁻³ M for HBSAQ). Adsorption of the inhibitors on the carbon steel surface was found to obey Langmuir's adsorption isotherm with a physical/chemical nature of the adsorption, as also shown by scanning electron microscopy. Further, electronic structure calculations using quantum chemical methods were found to be in good agreement with the results of the experimental studies.
Keywords: carbon steel, Schiff bases, corrosion inhibition, SEM, electrochemical techniques
Procedia PDF Downloads 395
7530 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals
Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor
Abstract:
This scientific article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses. The choice of approach depends on the specific research question and available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results, consequently advancing our understanding of the brain and its functions. The first approach focuses on utilizing machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been utilized in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the influential electrodes in classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier or utilizing meta-classifiers to enhance the final results in ensemble learning. In a different section of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting their advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals. 
This advancement in ensemble learning methodologies contributes to a better understanding of the brain and its functions, ultimately leading to improved accuracy and reliability in brain signal analysis and classification.
Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers
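As a minimal illustration of the ensemble idea discussed above, the sketch below combines several weak classifiers by majority vote; the threshold classifiers and the synthetic "feature" values are hypothetical stand-ins for real brain-signal features, not part of the study:

```python
def make_threshold_clf(feature_index, threshold):
    """A weak base classifier: predicts class 1 if one feature exceeds a threshold."""
    return lambda x: 1 if x[feature_index] > threshold else 0

def majority_vote(classifiers, x):
    """Meta-level combination: each base classifier votes, the majority wins."""
    votes = [clf(x) for clf in classifiers]
    return max(set(votes), key=votes.count)

# Three weak learners, each looking at a different (hypothetical) feature
ensemble = [
    make_threshold_clf(0, 0.5),
    make_threshold_clf(1, 0.3),
    make_threshold_clf(2, 0.7),
]

sample = [0.9, 0.1, 0.8]                # two of three classifiers vote for class 1
print(majority_vote(ensemble, sample))  # → 1
```

A stacking meta-classifier, as mentioned in the abstract, would replace the fixed voting rule with a learned model trained on the base classifiers' outputs.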
Procedia PDF Downloads 81
7529 Measuring the Cavitation Cloud by Electrical Impedance Tomography
Authors: Michal Malik, Jiri Primas, Darina Jasikova, Michal Kotek, Vaclav Kopecky
Abstract:
This paper is a case study dealing with the viability of using electrical impedance tomography for measuring cavitation clouds in a pipe setup. The authors used a simple passive cavitation generator to produce a cavitation cloud, which was then recorded for multiple flow rates using electrodes in two measuring planes. The paper presents the results of the experiment, showing that the industrial-grade tomography system ITS p2+ used is able to measure the cavitation cloud and may be particularly useful for identifying the inception of cavitation in setups where other measuring tools may not be viable.
Keywords: cavitation cloud, conductivity measurement, electrical impedance tomography, mechanically induced cavitation
Procedia PDF Downloads 251
7528 Laboratory Model Tests on Encased Group Columns
Authors: Kausar Ali
Abstract:
There are several ground treatment techniques which may meet the twin objectives of increasing the bearing capacity while simultaneously reducing settlements, but the use of stone columns is one of the techniques best suited to flexible structures, such as embankments and oil storage tanks, that can tolerate some settlement, and it is used worldwide. However, when stone columns in very soft soils are loaded, they undergo excessive settlement due to the low lateral confinement provided by the soft soil, leading to failure of the structure. The poor performance of stone columns under these conditions can be improved by encasing the columns with a suitable geosynthetic. In this study, the effect of reinforcement on the bearing capacity of composite soil has been investigated by conducting laboratory model tests on floating and end-bearing long stone columns with an l/d ratio of 12. The columns were reinforced by providing geosynthetic encasement over varying column lengths (upper 25%, 50%, 75%, and 100% of the column length). In this study, a group of columns has been used instead of a single column, because in the field the columns always remain in groups. The tests indicate that encasement over the full column length gives higher failure stress than encasement over a partial column length, for both floating and end-bearing long columns. The performance of end-bearing columns was found to be much better than that of the floating columns.
Keywords: geosynthetic, ground improvement, soft clay, stone column
Procedia PDF Downloads 438
7527 Ilorin Traditional Architecture as a Good Example of a Green Building Design
Authors: Olutola Funmilayo Adekeye
Abstract:
Traditional African practice of architecture can be said to be deeply rooted in green architecture in concept, design and execution. A study of the ancient building techniques in the Ilorin Emirate reveals prominent, eco-centric green architecture principles. In the pre-colonial era, before the introduction of modern architecture and Western building materials, Nigerian traditional communities built their houses to meet their cultural, religious and social needs using mainly indigenous building materials such as mud (Amo), cow dung (Boto), straw (Koriko) and palm fronds (Imo-Ope), to mention a few. This research attempts to identify the various techniques of applying the traditional African principles of green architecture to Ilorin traditional buildings. It will examine and assess some case studies to understand the extent to which green architecture principles have been applied to traditional building designs that are still preserved today in Ilorin, Nigeria. Furthermore, this study intends to answer many questions, which can be summarized into two basic ones: (1) What aspects of what are today recognized as important green architecture principles have been applied to Ilorin traditional buildings? (2) To what extent have these principles been ways of demonstrating a cultural attachment to the earth, as an expression of the African sense of the human being as one with nature?
Keywords: green architecture, Ilorin, traditional buildings, design principles, ecocentric, application
Procedia PDF Downloads 557
7526 Designing Form, Meanings, and Relationships for Future Industrial Products. Case Study Observation of PAD
Authors: Elisabetta Cianfanelli, Margherita Tufarelli, Paolo Pupparo
Abstract:
The dialectical mediation between desires and objects, or between mass production and consumption, continues to evolve over time. This relationship is influenced both by variable geometries of contexts that are distant from the mere design of product form and by aspects rooted in the very definition of industrial design. In particular, the overcoming of macro-areas of innovation in the technological, social, cultural, formal, and morphological spheres, supported by recent theories in critical and speculative design, seems to be moving further and further away from the design of the formal dimension of advanced products. The articulated fabric of theories and practices that feeds the definition of "hyperobjects", no longer objects, describes a common tension in all areas of design and production of industrial products. The latter are increasingly detached from the design of their form and meaning in mass production, thus losing the quality of products capable of social transformation. For years we have been living through a transformative moment regarding the design process in the definition of the industrial product. We are faced with a dichotomy in which there is, on the one hand, a reactionary aversion to the new techniques of industrial production and, on the other, a sterile adoption of the techniques of mass production that we can now consider traditional. This ambiguity becomes even more evident when we talk about industrial products and realize that we are moving further and further away from the concept of "form" as the synthesis of a design thought aimed at the aesthetic-emotional component as well as the functional one. The design of forms and their contents, as statutes of social acts, allows us to investigate the tension on mass production that crosses seasons, trends, technicalities, and sterile determinisms.
Design culture has always determined the formal qualities of objects as a sum of aesthetic characteristics and functional and structural relationships that define a product as a coherent unit. The contribution proposes a reflection and a series of practical research experiences on the form of advanced products. This form is understood as a kaleidoscope of relationships: the search for an identity, the desire for democratization, and, between these two, the exploration of the aesthetic factor. The study of form also corresponds to the study of production processes, technological innovations, the definition of standards, distribution, advertising, and the vicissitudes of taste and lifestyles. Specifically, we will investigate how the genesis of new forms for new meanings introduces a change in the related innovative production techniques. It therefore becomes fundamental to investigate, through the reflections and case studies presented in the contribution, the new techniques of production and elaboration of product forms as a new immanent and determining element within the design process.
Keywords: industrial design, product advanced design, mass productions, new meanings
Procedia PDF Downloads 127
7525 Single-Molecule Analysis of Structure and Dynamics in Polymer Materials by Super-Resolution Technique
Authors: Hiroyuki Aoki
Abstract:
The physical properties of polymer materials depend on the conformation and molecular motion of the polymer chain. Therefore, the structure and dynamic behavior of the single polymer chain have been the most important concerns in the field of polymer physics. However, it has been impossible to directly observe the conformation of a single polymer chain in a bulk medium. In the current work, novel techniques to study the conformation and dynamics of a single polymer chain are proposed. Since fluorescence methods are extremely sensitive, fluorescence microscopy enables the direct detection of a single molecule. However, the structure of a polymer chain as large as 100 nm cannot be resolved by conventional fluorescence methods because of the diffraction limit of light. In order to observe single chains, we developed a method of labeling polymer materials with a photo-switchable dye, together with super-resolution microscopy. Real-space conformational analysis of single polymer chains with a spatial resolution of 15-20 nm was achieved. Super-resolution microscopy enables us to obtain three-dimensional coordinates; therefore, we succeeded in conformational analysis in three dimensions. Direct observation by nanometric optical microscopy should reveal detailed information on the molecular processes in various polymer systems.
Keywords: polymer materials, single molecule, super-resolution techniques, conformation
Procedia PDF Downloads 309
7524 Real Estate Price Classification Using Machine Learning Techniques
Authors: Hadeel Sulaiman Alamri, Mohamed Maher Ben Ismail, Ouiem Bchir
Abstract:
The continued advances in Artificial Intelligence (AI) and Machine Learning (ML) have boosted the interest of tax authorities in developing smart solutions as efficient alternatives to their current fraud detection mechanisms. In particular, the real estate data collected by the administrations has promoted efforts to develop advanced analytics models aimed at detecting fraudulent real estate transactions. Specifically, supervised and unsupervised machine learning techniques have been applied to the available large datasets to improve overall taxpayer compliance. This research introduces a machine learning approach intended to classify land and building prices in Saudi Arabia. Specifically, it groups reported real estate transactions into homogeneous groups based on relevant features, and classifies land and building prices by Saudi city, neighborhood, and schema. The outcomes of the clustering task are fed into a supervised machine learning process to categorize future real estate transactions into "Fair", "Under-valued" or "Over-valued" classes. In particular, the experimental findings indicate that associating clustering algorithms with a Random Forest (RF) model yields an accuracy of 99%.
Keywords: classification, clustering, machine learning, real estate price
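A minimal sketch of the clustering-then-labelling idea described above, using a simple one-dimensional k-means on price per square metre; the data, cluster count, and direct label mapping are illustrative assumptions, not the paper's actual pipeline (which feeds the clusters into a Random Forest classifier):

```python
def kmeans_1d(values, centroids, iters=20):
    """Lloyd's algorithm on scalar values, with fixed initial centroids."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for v in values:
            i = min(range(len(centroids)), key=lambda i: abs(v - centroids[i]))
            clusters[i].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

def label_price(price_per_sqm, centroids):
    """Map the nearest cluster to a price-fairness class (hypothetical mapping)."""
    order = sorted(range(len(centroids)), key=lambda i: centroids[i])
    labels = {order[0]: "Under-valued", order[1]: "Fair", order[2]: "Over-valued"}
    i = min(range(len(centroids)), key=lambda i: abs(price_per_sqm - centroids[i]))
    return labels[i]

# Hypothetical prices per square metre forming three separated groups
prices = [900, 950, 1000, 2000, 2100, 1950, 3900, 4100, 4000]
cents = kmeans_1d(prices, [800.0, 2000.0, 4200.0])
print(label_price(980, cents))   # → Under-valued
```

In the paper's setting, the cluster assignments would serve as training labels for a supervised model rather than being used directly as here.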
Procedia PDF Downloads 10
7523 Evaluating the Small-Strain Mechanical Properties of Cement-Treated Clayey Soils Based on the Confining Pressure
Authors: Muhammad Akmal Putera, Noriyuki Yasufuku, Adel Alowaisy, Ahmad Rifai
Abstract:
Indonesia's government has planned a high-speed railway project connecting the capital city Jakarta with Surabaya, about 700 km apart. The route is planned to run across a lowland soil region. This region comprises cohesive soil with high water content and a high compressibility index, which leads to settlement problems. Among the variety of railway track structures, the ballastless track has been adopted effectively to reduce settlement; it provides a lightweight structure and minimizes workspace. However, deploying this thin-layer structure above a lowland area brings several problems, such as a lack of bearing capacity and deflection under traffic loading, so it must be combined with ground improvement to ensure acceptable settlement behavior on clayey soil. Considering the required strength increment and the working period, methods such as cement-treated soil have been adopted for the substructure of the railway track. Mechanical properties in the field are commonly evaluated using the plate load test and the cone penetration test. However, observing the increment of mechanical properties involves uncertainty, especially when evaluating cement-treated soil in the substructure. The current quality control of cement-treated soils is established by laboratory tests, and small-strain measurement devices in the laboratory can yield results that are more reliable and closer to field measurements. The aims of this research are to show the intercorrelation of confining pressure with the initial Young's modulus (E_o), Poisson's ratio (υ_o) and shear modulus (G_o) within small strain ranges, and to investigate the discrepancies between these parameters. The experimental results confirmed an intercorrelation between cement content and confining pressure following a power function.
In addition, higher cement ratios show discrepancies, conversely with low mixing ratios.
Keywords: amount of cement, elastic zone, high-speed railway, lightweight structure
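For isotropic elasticity the three small-strain parameters named above are linked, and the confining-pressure dependence reported as a power function can be sketched as below; the coefficients and moduli are illustrative assumptions, not the paper's fitted values:

```python
def shear_modulus(E0, nu0):
    """Isotropic elasticity relation: G_o = E_o / (2 (1 + v_o))."""
    return E0 / (2.0 * (1.0 + nu0))

def modulus_power_law(p, A, n):
    """Power-function dependence of a small-strain modulus on confining pressure p."""
    return A * p ** n

E0 = 600.0             # MPa, hypothetical initial Young's modulus
nu0 = 0.2              # hypothetical initial Poisson's ratio
G0 = shear_modulus(E0, nu0)
print(G0)              # → 250.0 (MPa)

# Hypothetical power-law fit: modulus grows with confining pressure as A * p^n
print(modulus_power_law(100.0, 25.0, 0.5))  # → 250.0
```

Comparing the G_o computed from E_o and υ_o against the directly measured G_o is one way to quantify the discrepancies the abstract refers to.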
Procedia PDF Downloads 146
7522 Imp_hist-Si: Improved Hybrid Image Segmentation Technique for Satellite Imagery to Decrease the Segmentation Error Rate
Authors: Neetu Manocha
Abstract:
Image segmentation is a technique in which a picture is partitioned into distinct parts having similar features that belong to the same objects. Various segmentation strategies have been proposed recently by prominent researchers, but after thorough research the authors found that, in general, the old methods do not decrease the segmentation error rate. The authors first developed the technique HIST-SI to decrease segmentation error rates, in which cluster-based and threshold-based segmentation techniques are merged. Then, to improve the results of HIST-SI, the authors added filtering and linking to this technique, naming it Imp_HIST-SI, to further decrease segmentation error rates. The goal of this research is to find a new technique that decreases segmentation error rates and produces much better results than the HIST-SI technique. For testing the proposed technique, a dataset from Bhuvan, a national geoportal developed and hosted by ISRO (Indian Space Research Organisation), is used. Experiments are conducted using the Scikit-image and OpenCV tools of Python, and performance is evaluated and compared against various existing image segmentation techniques using several metrics, i.e., Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR).
Keywords: satellite image, image segmentation, edge detection, error rate, MSE, PSNR, HIST-SI, linking, filtering, imp_HIST-SI
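The two evaluation metrics named above can be computed directly; a minimal pure-Python sketch, where the 2×2 "images" are illustrative, not from the Bhuvan dataset:

```python
import math

def mse(img_a, img_b):
    """Mean squared error between two equally sized grayscale images (nested lists)."""
    n = sum(len(row) for row in img_a)
    total = sum((a - b) ** 2
                for row_a, row_b in zip(img_a, img_b)
                for a, b in zip(row_a, row_b))
    return total / n

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    err = mse(img_a, img_b)
    if err == 0:
        return float("inf")
    return 10.0 * math.log10(max_val ** 2 / err)

reference = [[100, 110], [120, 130]]
segmented = [[110, 120], [130, 140]]   # every pixel off by 10 → MSE = 100
print(psnr(reference, segmented))      # ≈ 28.13 dB
```

A lower MSE (and hence higher PSNR) against a ground-truth segmentation map indicates a lower segmentation error rate.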
Procedia PDF Downloads 143
7521 Identifying the Structural Components of Old Buildings from Floor Plans
Authors: Shi-Yu Xu
Abstract:
The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes.
Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence
Procedia PDF Downloads 94
7520 Thermal Analysis of a Graphite Calorimeter for the Measurement of Absorbed Dose for Therapeutic X-Ray Beam
Authors: I.J. Kim, B.C. Kim, J.H. Kim, C.-Y. Yi
Abstract:
Heat transfer in a graphite calorimeter is analyzed using the finite element method. The calorimeter is modeled in 3D geometry. Quasi-adiabatic mode operation is realized in the simulation, and the temperature rises caused by different sources of ionizing radiation and by electric heaters are compared directly. The temperature distribution caused by the electric power was quite different from that caused by the ionizing radiation because of its point-like, localized heating. However, the temperature rises finally read by the sensing thermistors agreed with each other to within 0.02%.
Keywords: graphite calorimeter, finite element analysis, heat transfer, quasi-adiabatic mode
Procedia PDF Downloads 431
7519 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of an enterprise is rapidly changing, driven mainly by global competition, cost reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies this task is difficult because it should efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate realistic production schedules and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining emerged only recently as a sub-discipline of both data mining and business process management. Process mining techniques enable useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further uses such as security investigation, auditing and error debugging. An application of the process mining approach is proposed for validating the goodness of production schedules generated by scheduling software systems.
By using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the work load balance of each machine over time are measured, and finally the goodness of the production schedule is evaluated. By using the proposed process mining approach for evaluating the performance of generated production schedules, the quality of production schedules in manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
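One of the evaluation criteria named above, workstation utilization, can be derived from an event log in a few lines; the log format (one record per operation with start and end times) is an illustrative assumption, not the schema of any particular scheduling system:

```python
from collections import defaultdict

def workstation_utilization(event_log, horizon):
    """Fraction of the scheduling horizon each workstation spends busy.

    event_log: iterable of (workstation, start_time, end_time) records
    horizon:   total length of the observed scheduling period
    """
    busy = defaultdict(float)
    for station, start, end in event_log:
        busy[station] += end - start
    return {station: t / horizon for station, t in busy.items()}

# Hypothetical log: two machines over an 8-hour horizon
log = [("M1", 0, 3), ("M1", 4, 8), ("M2", 1, 3)]
print(workstation_utilization(log, 8.0))
# → {'M1': 0.875, 'M2': 0.25}
```

A workstation whose utilization approaches 1.0 while others stay low would be a candidate bottleneck under the criteria listed in the abstract.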
Procedia PDF Downloads 281
7518 Laboratory Measurement of Relative Permeability of Immiscible Fluids in Sand
Authors: Khwaja Naweed Seddiqi, Shigeo Honma
Abstract:
Relative permeability is the important parameter controlling the immiscible displacement of multiphase fluid flow in a porous medium. In this paper, the relative permeability for the immiscible displacement of two-phase fluid flow (oil and water) in a porous medium has been measured. As a result of the experiment, the irreducible water saturation, Swi, the residual oil saturation, Sor, and the relative permeability curves for kerosene, heavy oil and lubricant oil were determined successfully.
Keywords: relative permeability, two-phase flow, immiscible displacement, porous medium
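Relative permeability curves of the kind determined above are often summarized with a Corey-type model between the two end-point saturations Swi and Sor. The sketch below uses that common parameterization with illustrative exponents; it is not the paper's measurement method or data:

```python
def corey_relperm(sw, swi, sor, nw=4.0, no=2.0):
    """Corey-type relative permeabilities at water saturation sw.

    swi: irreducible water saturation, sor: residual oil saturation,
    nw / no: Corey exponents for water and oil (illustrative defaults).
    """
    se = (sw - swi) / (1.0 - swi - sor)   # effective (normalized) saturation
    se = min(max(se, 0.0), 1.0)           # clamp outside the mobile range
    krw = se ** nw                        # water relative permeability
    kro = (1.0 - se) ** no                # oil relative permeability
    return krw, kro

krw, kro = corey_relperm(sw=0.5, swi=0.2, sor=0.3)
print(krw, kro)   # se = 0.6 → krw ≈ 0.1296, kro ≈ 0.16
```

Fitting nw and no to measured data points between Swi and Sor is a common way to report curves like those obtained in the experiment.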
Procedia PDF Downloads 311
7517 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks
Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam
Abstract:
In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but often only structured data are available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. Applying a single neural network to analyze multimodal data, i.e., both structured and unstructured information, can achieve significant advantages in terms of time complexity and energy efficiency. Converting structured data into images and merging them with existing visual material offers a promising solution for applying CNNs to multimodal datasets, as they often occur in a medical context. By employing suitable preprocessing techniques, structured data are transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The final image is then analyzed using a CNN.
Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion
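A minimal sketch of the tabular-to-image step described above: a normalized feature vector is padded and reshaped into a square grayscale grid that a CNN could consume. The feature values and grid layout are illustrative assumptions, not the paper's specific transformation:

```python
import math

def tabular_to_image(features, max_val=255):
    """Map a normalized feature vector (values in [0, 1]) onto a square
    grayscale grid, zero-padding the unused cells."""
    side = math.ceil(math.sqrt(len(features)))
    pixels = [int(round(f * max_val)) for f in features]
    pixels += [0] * (side * side - len(pixels))   # pad to a full square
    return [pixels[r * side:(r + 1) * side] for r in range(side)]

row = [0.1, 0.5, 1.0, 0.0, 0.25]   # hypothetical normalized tabular record
image = tabular_to_image(row)
print(image)   # → [[26, 128, 255], [0, 64, 0], [0, 0, 0]]
```

The resulting grid could then be stacked with, or placed alongside, the existing image material before being fed to the CNN, as the abstract describes for the fusion step.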
Procedia PDF Downloads 126
7516 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine
Authors: Hira Lal Gope, Hidekazu Fukai
Abstract:
The aim of this study is to develop a system which can identify and sort peaberries automatically, at low cost, for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. The peaberry is not a defective bean, but neither is it a normal bean: it is a single, relatively round seed born inside a coffee cherry instead of the usual flat-sided pair of beans, and it has a different value and flavor. To make the taste of the coffee better, it is necessary to separate the peaberries from the normal beans before roasting the green coffee beans; otherwise, the taste of the beans will be mixed and degraded. During roasting, all the beans should have uniform shape, size, and weight; otherwise, a larger bean will take more time to roast through. The peaberry has a different size and shape even though it has the same weight as a normal bean, and it roasts more slowly than normal beans, so neither size nor weight provides a good criterion for selecting peaberries. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually by hand. On the other hand, picking out peaberries is very difficult even for trained specialists because the shape and color of the peaberry are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate normal and peaberry beans as part of the sorting system. As a first step, we applied deep Convolutional Neural Networks (CNN) and Support Vector Machines (SVM) to discriminate the peaberry and normal beans. As a result, better performance was obtained with CNN than with SVM for the discrimination of the peaberry. The trained artificial neural network, developed with a high-performance CPU and GPU in this work, will be installed into an inexpensive Raspberry Pi system with low computational capacity.
We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine
Procedia PDF Downloads 147
7515 Developing a Performance Measurement System for Arts-Based Initiatives: Action Research on Italian Corporate Museums
Authors: Eleonora Carloni, Michela Arnaboldi
Abstract:
In academia, the investigation of the relationship between cultural heritage and corporations is ubiquitous across several fields of study. In practice, corporations increasingly integrate arts and cultural heritage into their strategies for disparate benefits: to foster customers' purchase intention through authentic and aesthetic experiences, to improve their reputation with local communities, and to motivate employees through creative thinking. These artistic interventions take diverse forms, from sponsorships to arts-based training centers for employees, but scholars agree that the fullest expression of this cultural trend is the corporate museum, growing in number and relevance. Corporate museums are museum-like settings hosting artworks related to the corporation's history and interests. In academia they have been described as strategic assets and associated with diverse uses for the corporation's benefit, from places for the preservation of cultural heritage to tools for public relations and cultural flagship stores. Previous studies have thus extensively but fragmentarily studied the diverse benefits that opening a corporate museum brings to corporations, without a comprehensive approach and without addressing how to evaluate and report corporate museums' performances. Stepping forward, the present study aims to investigate: 1) what key performance measures corporate museums need to report to the associated corporations; 2) how the key performance measures are reported to the concerned corporations. This direction of study is not only suggested as a future direction in academia, but also has a solid basis in practice, aiming to answer corporate museum directors' need to account for the museum's activities to the concerned corporation. Coherently, at the empirical level the study relies on the action research method, whose distinctive feature is to develop practical knowledge through a participatory process.
This paper relies on the experience of a collaborative project between the researchers and a set of corporate museums in Italy, aimed at co-developing a performance measurement system. The project involved two steps: first, the researchers derived potential performance measures from the literature together with exploratory interviews; second, the researchers supported the pool of corporate museum directors in co-developing a set of key performance indicators for reporting. Preliminary empirical findings show that while scholars insist on corporate museums' capability to develop networking relations, directors insist on the museums' role as internal suppliers of knowledge for innovation goals. Moreover, directors stress the museums' cultural mission and outcomes as potential benefits for the corporation, urging that both cultural and business measures be included in the final tool. In addition, they pay close attention to wording in humanistic terms while struggling to express all measures in economic terms. The paper contributes to the literature on corporate museums and, more broadly, on arts-based initiatives in two directions. First, it elaborates key performance measures, with related indicators, for reporting on cultural initiatives to corporations. Second, it provides evidence of the challenges and practices involved in reporting on these initiatives, given the tensions arising from the co-existence of diverse perspectives, namely the arts and business worlds.
Keywords: arts-based initiative, corporate museum, hybrid organization, performance measurement
Procedia PDF Downloads 179
7514 Study of Two MPPTs for Photovoltaic Systems Using Controllers Based in Fuzzy Logic and Sliding Mode
Authors: N. Ould cherchali, M. S. Boucherit, L. Barazane, A. Morsli
Abstract:
Photovoltaic power is widely used to supply isolated or unpopulated areas (lighting, pumping, etc.). Its great advantages are that the source is inexhaustible, safe to use, and clean. However, the dynamic models used to describe a photovoltaic system are complicated and nonlinear, and because of the nonlinear I-V and P-V characteristics of photovoltaic generators, a maximum power point tracking (MPPT) technique is required to maximize the output power. In this paper, two online maximum power point tracking techniques using robust controllers are proposed for photovoltaic systems: the first uses a fuzzy logic controller (FLC) and the second a sliding mode controller (SMC). Both MPPT controllers receive the partial derivative of power as input, and their output is the duty cycle corresponding to maximum power. A photovoltaic generator with a boost converter is developed in MATLAB/Simulink to verify the performance of the proposed techniques. The SMC technique provides good tracking speed under rapidly changing irradiation, while under slowly changing or constant irradiation the FLC technique yields a much smoother panel power signal with fewer fluctuations.
Keywords: fuzzy logic controller, maximum power point, photovoltaic system, tracker, sliding mode controller
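The core idea shared by both controllers in this abstract, taking a dP/dV-like quantity as input and emitting a duty cycle, can be sketched with a simplified sign-based hill-climbing rule. This is an illustrative stand-in under stated assumptions, not the authors' FLC or SMC design; the function name and step size are hypothetical.

```python
def mppt_step(p, v, p_prev, v_prev, duty, step=0.005):
    """One MPPT update: move the duty cycle in the direction that
    increases power, based on the sign of dP/dV (hill climbing).
    Simplified stand-in for the FLC/SMC controllers, which likewise
    take the partial derivative of power as input."""
    dp = p - p_prev
    dv = v - v_prev
    if dv == 0:
        return duty            # no voltage change: keep the duty cycle
    slope = dp / dv            # sign of dP/dV tells which side of the MPP we are on
    if slope > 0:              # left of the MPP: raise the panel voltage
        duty -= step
    elif slope < 0:            # right of the MPP: lower the panel voltage
        duty += step
    return min(max(duty, 0.0), 1.0)  # clamp to a valid duty cycle
```

The fuzzy and sliding mode controllers of the paper replace this fixed step with, respectively, rule-based and switching-surface-based adjustments, which is what yields their differing smoothness and tracking speed.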
Procedia PDF Downloads 549
7513 Bioreactor for Cell-Based Impedance Measuring with Diamond Coated Gold Interdigitated Electrodes
Authors: Roman Matejka, Vaclav Prochazka, Tibor Izak, Jana Stepanovska, Martina Travnickova, Alexander Kromka
Abstract:
Cell-based impedance spectroscopy is a suitable method for the electrical monitoring of cell activity, especially on substrates that cannot easily be inspected by optical microscopy (without fluorescent markers), such as decellularized tissues and nano-fibrous scaffolds. A special sensor was developed for this measurement. It consists of a Corning glass substrate with gold interdigitated electrodes covered with a diamond layer, which provides a biocompatible, non-conductive surface for the cells. A special PPFC flow cultivation chamber was also developed; it fixes the sensor in place, with spring contacts connecting the sensor pads to an external measuring device. The construction allows real-time live-cell imaging, and combining it with a perfusion system allows medium circulation and shear stress stimulation. The experimental evaluation comprised several setups, including a bare sensor without any coating as well as collagen and fibrin coatings. Adipose-derived stem cells (ASC) and human umbilical vein endothelial cells (HUVEC) were seeded onto the sensor in the cultivation chamber, which was then installed into a microscope system for live-cell imaging. The impedance was measured with a vector impedance analyzer over the range from 10 Hz to 40 kHz. These impedance measurements were correlated with live-cell microscopic imaging and immunofluorescent staining. Analysis of the measured signals showed responses to cell adhesion on the substrates, cell proliferation, and changes after shear stress stimulation, all of which are important parameters during cultivation. Further experiments plan to use decellularized tissue as a scaffold fixed on the sensor. This kind of impedance sensor can provide feedback about cell culture conditions on opaque surfaces and scaffolds used in tissue engineering for the development of artificial prostheses. This work was supported by the Ministry of Health, grants No. 15-29153A and 15-33018A.
Keywords: bio-impedance measuring, bioreactor, cell cultivation, diamond layer, gold interdigitated electrodes, tissue engineering
Procedia PDF Downloads 303
7512 Comparing Image Processing and AI Techniques for Disease Detection in Plants
Authors: Luiz Daniel Garay Trindade, Antonio De Freitas Valle Neto, Fabio Paulo Basso, Elder De Macedo Rodrigues, Maicon Bernardino, Daniel Welfer, Daniel Muller
Abstract:
Agriculture plays an important role in society, since it is one of the main sources of food in the world. To support crop production and yield, precision agriculture makes use of technologies aimed at improving the productivity and quality of agricultural commodities. One of the problems hampering the quality of agricultural production is disease affecting crops. Failure to detect diseases quickly can result in minor or major damage to production, causing financial losses to farmers. In order to provide a map of the contributions devoted to the early detection of plant diseases and to compare the accuracy of the selected studies, a systematic literature review was performed, covering techniques based on digital image processing and neural networks. We found 35 relevant tool-support alternatives for detecting disease in 19 plants. Our comparison of these studies yielded an overall average accuracy of 87.45%, with two studies coming very close to 100%.
Keywords: pattern recognition, image processing, deep learning, precision agriculture, smart farming, agricultural automation
Procedia PDF Downloads 383
7511 Process for Separating and Recovering Materials from Kerf Slurry Waste
Authors: Tarik Ouslimane, Abdenour Lami, Salaheddine Aoudj, Mouna Hecini, Ouahiba Bouchelaghem, Nadjib Drouiche
Abstract:
Slurry waste is a byproduct generated by the slicing process of multi-crystalline silicon ingots. This waste can be used as a secondary resource to recover high-purity silicon, which has great economic value. From a management perspective, the ever-increasing generation of kerf slurry waste poses significant challenges for the photovoltaic industry, given the currently low use of slurry waste for silicon recovery. Slurry waste in most cases contains silicon, silicon carbide, metal fragments, and a mineral-oil-based or glycol-based slurry vehicle. As a result of the global scarcity of the high-purity silicon supply, the high-purity silicon content in slurry has increasingly attracted research interest. This paper presents a critical overview of the techniques currently employed for high-purity silicon recovery from kerf slurry waste. Hydrometallurgy remains a continual matter of study and research; in addition, this review introduces several new techniques for recovering high-purity silicon from slurry waste. The purpose of the information presented is to support the development of a clean and effective recovery process for high-purity silicon from slurry waste.
Keywords: kerf loss, slurry waste, silicon carbide, silicon recovery, photovoltaic, high-purity silicon, polyethylene glycol
Procedia PDF Downloads 314
7510 A Cloud Computing System Using Virtual Hyperbolic Coordinates for Services Distribution
Authors: Telesphore Tiendrebeogo, Oumarou Sié
Abstract:
Cloud computing technologies have attracted considerable interest in recent years and have become important for many existing database applications. Cloud computing provides a new mode of using and offering IT resources in general: such resources can be used on demand by anybody with access to the internet. In particular, a Cloud platform provides an easy-to-use interface between providers and users, allowing providers to develop and offer software and databases to users across locations. Currently, many Cloud platform providers support large-scale database services. However, most of them support only simple keyword-based queries and cannot answer complex queries efficiently, owing to the lack of efficient multi-attribute indexing techniques. Existing Cloud platform providers therefore seek to improve the performance of indexing techniques for complex queries. In this paper, we define a new cloud computing architecture based on a Distributed Hash Table (DHT) and design a prototype system. We then build and evaluate a cloud indexing structure based on a hyperbolic tree, using virtual coordinates taken in the hyperbolic plane. Our experimental results, compared against other cloud systems, show that our solution ensures consistency and scalability for the Cloud platform.
Keywords: virtual coordinates, cloud, hyperbolic plane, storage, scalability, consistency
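Addressing nodes by virtual coordinates in the hyperbolic plane typically means routing greedily on hyperbolic distances. The abstract does not specify which model of the hyperbolic plane is used, so as an assumption this sketch uses the Poincaré disk model, where the distance between two points of the open unit disk has a closed form:

```python
import math

def poincare_distance(u, v):
    """Hyperbolic distance between two points of the open unit disk
    (Poincare disk model): d(u, v) = acosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2))).
    Greedy routing over such virtual coordinates forwards a request to the
    neighbor hyperbolically closest to the target; the model choice here
    is an assumption, not taken from the paper."""
    du2 = (u[0] - v[0]) ** 2 + (u[1] - v[1]) ** 2
    nu = 1.0 - (u[0] ** 2 + u[1] ** 2)   # 1 - |u|^2
    nv = 1.0 - (v[0] ** 2 + v[1] ** 2)   # 1 - |v|^2
    return math.acosh(1.0 + 2.0 * du2 / (nu * nv))
```

For example, the distance from the origin to a point at Euclidean radius r is ln((1+r)/(1-r)), so distances blow up near the disk boundary, which is what lets a tree embedding spread out without collisions.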
Procedia PDF Downloads 429
7509 Control Flow around NACA 4415 Airfoil Using Slot and Injection
Authors: Imine Zakaria, Meftah Sidi Mohamed El Amine
Abstract:
One of the most vital aerodynamic organs of a flying machine is the wing, which allows it to fly efficiently in the air. The flow around the wing is very sensitive to changes in the angle of attack. Beyond a critical value, the boundary layer separates on the upper surface, causing instability and a total degradation of aerodynamic performance known as stall. Controlling the flow around an airfoil has therefore become a research concern in the aeronautics field. There are two families of techniques for controlling the flow around a wing to improve its aerodynamic performance: passive and active control. Blowing and suction are among the active techniques that control boundary layer separation around an airfoil. Their objective is to energize the air particles in the separation zones and to create vortex structures that homogenize the velocity near the wall and enable control. Blowing and suction have long been used as flow control actuators around obstacles; in 1904, Prandtl applied continuous blowing to a cylinder to delay boundary layer separation. In the present study, several numerical investigations were carried out to predict the turbulent flow around an aerodynamic profile. A CFD code was used at several angles of attack in order to validate the present work against the literature for the clean profile. The variation of the lift coefficient CL with the momentum coefficient
Keywords: CFD, control flow, lift, slot
Procedia PDF Downloads 205
7508 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic
Authors: Fei Gao, Rodolfo C. Raga Jr.
Abstract:
This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project seeks to improve the early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. Using the Diabetes Health Indicators Dataset from Kaggle as the research data, the phase relation values of each attribute were used to analyze and select the attributes that might influence the outcome probability for each subject. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as the foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and performance is assessed using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle
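Four of the five evaluation metrics named in the abstract can be computed directly from a binary confusion matrix; AUC-ROC needs predicted scores rather than hard labels, so it is omitted here. This is a minimal illustrative sketch (the function name is hypothetical), not the study's actual evaluation pipeline:

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall and F1 for binary labels,
    computed from the confusion-matrix counts TP, TN, FP, FN."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1}
```

In a cross-validation loop, these scores would be computed per fold and averaged across folds for each of the eight algorithms being compared.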
Procedia PDF Downloads 81
7507 Characterization of an Almond Shell Composite Based on PHBH
Authors: J. Ivorra-Martinez, L. Quiles-Carrillo, J. Gomez-Caturla, T. Boronat, R. Balart
Abstract:
Almond crop by-products were used to obtain PHBH-based composites through an extrusion process followed by injection molding to produce test samples. To improve the properties of the resulting composite, the incorporation of OLA 8 as a coupling agent and plasticizer was additionally considered. Characterization covered mechanical properties, thermal properties, surface morphology, and water absorption ability. Using the almond residue yields PHBH-based composites of greater environmental interest and lower cost.
Keywords: almond shell, PHBH, composites, compatibilization
Procedia PDF Downloads 110
7506 A Nucleic Acid Extraction Method for High-Viscosity Floricultural Samples
Authors: Harunori Kawabe, Hideyuki Aoshima, Koji Murakami, Minoru Kawakami, Yuka Nakano, David D. Ordinario, C. W. Crawford, Iri Sato-Baran
Abstract:
With recent advances in gene editing technologies allowing the rewriting of genetic sequences, growth in the global floriculture market beyond previous trends is anticipated through increasingly sophisticated plant breeding techniques. As a prerequisite for gene editing, the gene sequence of the target plant must first be identified. This necessitates the genetic analysis of plants with unknown gene sequences, the extraction of RNA, and comprehensive expression analysis. Consequently, a technology capable of consistently and effectively extracting high-purity DNA and RNA from plants is of paramount importance. Although model plants such as Arabidopsis and tobacco have established methods for DNA and RNA extraction, floricultural species such as roses present unique challenges. Different techniques to extract DNA and RNA from various floricultural species were therefore investigated. Upon sampling and grinding the petals of several floricultural species, it was observed that nucleic acid extraction from ground petal solutions of low viscosity was straightforward, whereas solutions of high viscosity presented a significant challenge. It is postulated that substantial quantities of polysaccharides and polyphenols in the plant tissue inhibit nucleic acid extraction. Consequently, high-purity DNA and RNA were extracted by improving the CTAB method and combining it with commercially available nucleic acid extraction kits. The quality of the total extracted DNA and RNA was evaluated using standard methods. Finally, the effectiveness of the extraction method was assessed by determining whether the extracts could be used to create a library suitable as a template for a next-generation sequencer. In conclusion, a method was developed for consistent and accurate nucleic acid extraction from high-viscosity floricultural samples. 
These results demonstrate improved techniques for DNA and RNA extraction from flowers, helping to facilitate gene editing of floricultural species and expanding the boundaries of research and commercial opportunities.
Keywords: floriculture, gene editing, next-generation sequencing, nucleic acid extraction
Procedia PDF Downloads 33
7505 A Geo DataBase to Investigate the Maximum Distance Error in Quality of Life Studies
Authors: Paolino Di Felice
Abstract:
The background and significance of this study come from papers already in the literature that measured the impact of public services (e.g., hospitals, schools) on citizens' needs satisfaction (one of the dimensions of QOL studies) by calculating the distance between the place where citizens live and the locations of the services on the territory. Those studies assume that a citizen's dwelling coincides with the centroid of the polygon bounding the administrative district, within the city, to which the citizen belongs. Such an assumption "introduces a maximum measurement error equal to the greatest distance between the centroid and the border of the administrative district." The case study reported in this abstract investigates the implications of adopting such an approach at geographical scales greater than the urban one, namely at the three nesting levels of the Italian administrative units: the (20) regions, the (110) provinces, and the 8,094 municipalities. To carry out this study, it must be decided: a) how to store the huge amount of (spatial and descriptive) input data, and b) how to process the data. The latter aspect involves: b.1) designing algorithms to investigate the geometry of the boundaries of the Italian administrative units; b.2) coding them in a programming language; b.3) executing them; and, eventually, b.4) archiving the results on permanent storage. The IT solution we implemented is centered around a (PostgreSQL/PostGIS) Geo DataBase structured in terms of three tables that fit the nesting hierarchy of the Italian administrative units:
municipality(id, name, provinceId, istatCode, regionId, geometry)
province(id, name, regionId, geometry)
region(id, name, geometry)
The adoption of DBMS technology allows us to implement steps "a)" and "b)" easily. 
In particular, step "b)" is simplified dramatically by calling spatial operators and built-in spatial User Defined Functions within SQL queries against the Geo DB. The major findings from our experiments can be summarized as follows. The approximation that, on average, follows from assimilating the residence of the citizens to the centroid of the administrative unit of reference is a few kilometers (4.9 km) at the municipality level, while it becomes conspicuous at the other two levels (28.9 km and 36.1 km, respectively). Therefore, studies such as those mentioned above can be extended up to the municipal level without affecting the correctness of the interpretation of the results, but not further. The IT framework implemented to carry out the experiments can be replicated for studies referring to the territory of other countries all over the world.
Keywords: quality of life, distance measurement error, Italian administrative units, spatial database
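The quantity the paper bounds, the greatest distance between a polygon's centroid and its border, can be computed per administrative unit. A pure-Python sketch of that computation follows (in the actual system this would be done with PostGIS spatial functions over the `geometry` columns; the function names here are hypothetical):

```python
import math

def polygon_centroid(vertices):
    """Area-weighted centroid of a simple polygon (shoelace formula).
    `vertices` is a ring of (x, y) tuples, open or closed."""
    if vertices[0] != vertices[-1]:
        vertices = list(vertices) + [vertices[0]]  # close the ring
    a = cx = cy = 0.0
    for (x0, y0), (x1, y1) in zip(vertices, vertices[1:]):
        cross = x0 * y1 - x1 * y0
        a += cross
        cx += (x0 + x1) * cross
        cy += (y0 + y1) * cross
    a *= 0.5
    return (cx / (6 * a), cy / (6 * a))

def max_centroid_error(vertices):
    """Greatest distance from the centroid to the polygon border.
    Distance to a fixed point is convex along each edge, so the
    maximum over the whole border is attained at a vertex."""
    gx, gy = polygon_centroid(vertices)
    return max(math.hypot(x - gx, y - gy) for x, y in vertices)
```

Running this over every municipality, province, and region geometry and averaging per level is what yields the 4.9 km / 28.9 km / 36.1 km figures reported above (after projecting to a metric coordinate system).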
Procedia PDF Downloads 374
7504 A Nanoindentation Study of Thin Film Prepared by Physical Vapor Deposition
Authors: Dhiflaoui Hafedh, Khlifi Kaouther, Ben Cheikh Larbi Ahmed
Abstract:
Monolayer and multilayer coatings of CrN and AlCrN were deposited on a 100Cr6 (AISI 52100) substrate by a PVD magnetron sputtering system. The microstructures of the coatings were characterized using atomic force microscopy (AFM). The AFM analysis revealed the presence of domes and craters uniformly distributed over the surfaces of the various layers. Nanoindentation measurements of the CrN coating showed a maximum hardness (H) and modulus (E) of 14 GPa and 240 GPa, respectively, while the measured H and E values of the AlCrN coatings were 30 GPa and 382 GPa. The improved hardness in both coatings was attributed mainly to a reduction in crystallite size and a decrease in surface roughness. The incorporation of Al into the CrN coatings improved both the hardness and the Young's modulus.
Keywords: CrN, AlCrN coatings, hardness, nanoindentation
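Hardness and modulus values like those quoted above are conventionally reduced from a nanoindentation load-displacement curve via the Oliver-Pharr relations: H = P_max / A_c and E_r = (sqrt(pi) / (2*beta)) * S / sqrt(A_c). The sketch below shows only this standard reduction under the assumption beta = 1, not the authors' exact procedure:

```python
import math

def hardness_and_reduced_modulus(p_max, stiffness, contact_area, beta=1.0):
    """Oliver-Pharr data reduction for a nanoindentation test.
    p_max        : peak load P_max in N
    stiffness    : unloading contact stiffness S = dP/dh in N/m
    contact_area : projected contact area A_c in m^2
    Returns (H, E_r) in Pa: hardness H = P_max / A_c and reduced
    modulus E_r = (sqrt(pi) / (2*beta)) * S / sqrt(A_c)."""
    hardness = p_max / contact_area
    reduced_modulus = (math.sqrt(math.pi) / (2.0 * beta)) * stiffness / math.sqrt(contact_area)
    return hardness, reduced_modulus
```

The coating's Young's modulus E then follows from E_r by correcting for the indenter's own elasticity, 1/E_r = (1 - nu^2)/E + (1 - nu_i^2)/E_i, which requires the Poisson's ratios and the indenter modulus.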
Procedia PDF Downloads 561