Search results for: continuous speed profile data
24666 Urban Boundary Layer and Its Effects on Haze Episode in Thailand
Authors: S. Bualert, K. Duangmal
Abstract:
The atmospheric boundary layer reflects the effects of land cover on atmospheric characteristics in terms of temperature gradient and wind profile. These are key factors controlling atmospheric processes such as dilution and mixing via thermal and mechanical turbulence. Bangkok, Chiang Mai, and Hat Yai are major cities of central, northern, and southern Thailand, respectively. They differ in location, geography, and city size; Bangkok is the most urbanized and is classified as a megacity relative to Chiang Mai and Hat Yai. All three have suffered air pollution episodes such as transboundary haze. The worst period in northern Thailand occurs from the end of February through April each year, when concentrations of particulate matter smaller than 10 micrometers (PM10) exceed Thailand's ambient air quality standard (120 micrograms per cubic meter) by more than a factor of two. Radiosonde soundings and air pollutant (CO, PM10, TSP, O3, NOx) measurements were used to characterize the urban boundary layer and the air pollution problems in these cities. Furthermore, air pollutant profiles showed a good relationship with urban boundary layer characteristics; in particular, a daytime temperature inversion on 29 February 2009 produced CO and particulate matter concentrations twice as high as normal.
Keywords: haze episode, micrometeorology, temperature inversion, urban boundary layer
Procedia PDF Downloads 259
24665 Effect of Zirconium Addition to Aluminum Grain Refined by Ti on its Resistance to Wear: A Three-Dimensional Approach
Authors: S. M. A. Al-Qawabah, A. I. O. Zaid
Abstract:
Aluminum and its alloys are versatile materials that are widely used in industrial and engineering applications owing to their useful properties, e.g., high strength-to-weight ratio, high thermal and electrical conductivities, and good resistance to corrosion. Against these favorable properties, however, they have the disadvantage of solidifying in a large-grain columnar structure, which negatively affects their mechanical properties and surface quality. Aluminum alloys are normally grain refined by alloying elements such as Ti, Ti-B, or Zr. In this paper, the effect of zirconium addition on the wear resistance of Al grain refined by Ti after extrusion is investigated under different loads and sliding speeds, namely at 5, 10, and 20 N loads and a range of sliding speeds in m/min. The results are presented in a three-dimensional wear mode. To the best of the authors' knowledge, the wear of aluminum in three dimensions has never been tackled before. In this work, the wear results are presented and discussed on time, load, and speed plots.
Keywords: aluminum grain refined, addition of titanium, wear resistance, titanium
Procedia PDF Downloads 40124664 Objective Evaluation on Medical Image Compression Using Wavelet Transformation
Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah
Abstract:
The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern imaging techniques is vast. This data can be a problem from a storage point of view or when it is sent over a network. This paper uses the wavelet transform technique for medical image compression. A MATLAB program was designed to evaluate the storage and transmission time problems for medical images at Sebha Medical Center, Libya. Three different computed tomography images, of the abdomen, brain, and chest, were selected and compressed using the wavelet transform. Objective evaluation was performed to measure the quality of the compressed images. The results show that the Peak Signal-to-Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is acceptable.
Keywords: medical image, MATLAB, image compression, wavelets, objective evaluation
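The PSNR values quoted above follow the standard definition, 10·log10(MAX²/MSE). A minimal sketch of that computation in plain Python, using flat pixel lists as a stand-in for images (the study itself used MATLAB; the sample pixel values here are made up):

```python
import math

def psnr(original, compressed, max_value=255):
    """Peak Signal-to-Noise Ratio between two equally sized 8-bit images,
    given as flat lists of pixel intensities."""
    mse = sum((o - c) ** 2 for o, c in zip(original, compressed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_value ** 2 / mse)

# Toy example: a 4-pixel "image" and its lossy reconstruction.
orig = [120, 130, 140, 150]
comp = [118, 131, 139, 152]
print(round(psnr(orig, comp), 2))  # → 44.15
```

For a full image the same formula is applied over all pixels; a PSNR near 30 dB, as reported above, corresponds to a mean squared error of roughly 65 on an 8-bit scale.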
Procedia PDF Downloads 286
24663 Agile Manifesto Construct for the Film Industry
Authors: Kiri Trier, Theresa Treffers
Abstract:
In the course of continuous volatility, such as production stops due to the COVID-19 pandemic and video-on-demand players monopolizing the film industry, filmmakers are stuck in traditional, linear content development processes. The industry has to become more agile in order to react quickly and easily to changes. Since content development in agile project management is barely recorded scientifically or empirically, and agile methods lack study beyond software development, we examined whether the agile manifesto values and principles from software development can be adapted to the film industry to enable agility and digitalization of content development. We conducted an online questionnaire with 184 German filmmakers (producers, authors, directors, actors, film financiers) as a first cross-sectional assessment of the adaptability of the agile manifesto from software development to the film industry; factor analysis was used to validate the construct. Our results show that it is crucial to digitalize traditional content development into agile, end-to-end content development, with tools, lean processes, new collaboration structures, and holacracy, to prepare for any volatility. Overall, we propose a first construct for an agile manifesto for the film industry, with four values related to nine principles of its own. Our findings provide a better understanding of the agile manifesto beyond software development as a guideline for implementing agility in the film industry.
Keywords: agile manifesto, agile project management, agility, film industry
Procedia PDF Downloads 199
24662 Aerogel Fabrication Via Modified Rapid Supercritical Extraction (RSCE) Process - Needle Valve Pressure Release
Authors: Haibo Zhao, Thomas Andre, Katherine Avery, Alper Kiziltas, Deborah Mielewski
Abstract:
Silica aerogels were fabricated through a modified rapid supercritical extraction (RSCE) process. The silica aerogels were made using a tetramethyl orthosilicate precursor, placed in a hot press, and brought to the supercritical point of the solvent, ethanol. In order to control the pressure release without a pressure controller, a needle valve was used. The resulting aerogels were then characterized for their physical and chemical properties and compared to silica aerogels created using similar methods. The aerogels fabricated using this modified RSCE method were found to have properties similar to those reported in other papers using the unmodified RSCE method. A silica aerogel-infused glass blanket composite and a graphene-reinforced silica aerogel composite were also successfully fabricated by this new method. The modified RSCE process and system is a prototype for better gas outflow control at a lower equipment cost. Potentially, this process could evolve into a continuous, low-cost, high-volume production process that meets automotive requirements.
Keywords: aerogel, automotive, rapid supercritical extraction process, low cost production
Procedia PDF Downloads 184
24661 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction
Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun
Abstract:
Usability has become a basic requirement for a product from the consumer's perspective; a product that fails this requirement ends up unused. Identifying usability issues by analyzing quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design, yet the lack of studies on analysis methodologies for qualitative text data in the usability field limits the potential of these data for more useful applications. The possibility of analyzing qualitative text data has grown with the rapid development of data analysis fields such as natural language processing, which helps computers understand human language, and machine learning, which provides predictive models and clustering tools. Therefore, this research aims to study the applicability of text processing algorithms to the analysis of qualitative text data collected from usability activities. The research utilized datasets collected from an LG neckband headset usability experiment, consisting of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with a text processing algorithm, included training comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the comment vector clusters. The result identifies the 'volume and music control button' as the usability feature that matches best with the clusters of comment vectors: centroid comments of one cluster emphasized button positions, while centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, participants experienced less confusion, and thus the comments mentioned only the buttons' positions.
When the volume and music control buttons were designed as a single button, participants experienced interface issues regarding the buttons, such as the operating methods of functions and confusion between function buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms in analyzing qualitative text data from usability testing and evaluations.
Keywords: usability, qualitative data, text-processing algorithm, natural language processing
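The vector-space clustering step described above can be illustrated with a deliberately small sketch: bag-of-words vectors and a two-cluster k-means written in plain Python. The comments, the deterministic seeding rule, and the cluster count are hypothetical stand-ins for the study's actual pipeline:

```python
import math
from collections import Counter

def vectorize(comments):
    """Bag-of-words vectors over the shared vocabulary of all comments."""
    vocab = sorted({w for c in comments for w in c.lower().split()})
    return [[Counter(c.lower().split())[w] for w in vocab] for c in comments]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def kmeans2(vectors, iters=20):
    """Two-cluster k-means, deterministically seeded from the first and last
    vectors so this toy example is reproducible."""
    centroids = [list(vectors[0]), list(vectors[-1])]
    labels = [0] * len(vectors)
    for _ in range(iters):
        labels = [min((dist(v, c), j) for j, c in enumerate(centroids))[1]
                  for v in vectors]
        for j in range(2):
            members = [v for v, lab in zip(vectors, labels) if lab == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Hypothetical usability comments: two about button position, two about
# confusing button functions.
comments = [
    "button position is easy to reach",
    "volume button position feels natural",
    "confusing function mapping on the button",
    "button function mapping is confusing to use",
]
print(kmeans2(vectorize(comments)))  # → [0, 0, 1, 1]
```

The position-themed and confusion-themed comments separate into the two clusters, mirroring how the study's centroid comments split between button positions and interface issues.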
Procedia PDF Downloads 285
24660 Keeping Education Non-Confessional While Teaching Children about Religion
Authors: Tünde Puskás, Anita Andersson
Abstract:
This study is part of a research project on whether religion is considered part of Swedish cultural heritage in Swedish preschools. Our aim in this paper is to explore how teachers in a Swedish preschool with a religious profile balance between keeping education non-confessional and at the same time teaching children about a particular tradition with religious roots, Easter. The point of departure for the theoretical frame of our study is that practical considerations in pedagogical situations are inherently dilemmatic. The dilemmas of interest for our study evolve around formalized, intellectual ideologies, such as multiculturalism and secularism, that have an impact on everyday practice. Educational dilemmas may also arise in the intersections of the formalized ideology of non-confessionalism, prescribed in policy documents, and common-sense understandings of what is included in Swedish cultural heritage. In this paper, religion is treated as a human worldview that, similarly to secular ideologies, can be understood as a system of thought. We make use of Ninian Smart's theoretical framework, according to which religious and secular ideologies in the modern Western world, as human worldviews, can be studied within the same analytical framework. In order to study the distinctive character of human worldviews, Smart introduced a multi-dimensional model within which the different dimensions interact with each other in various ways and to different degrees. The data for this paper are drawn from fieldwork carried out in 2015-2016 in the form of video ethnography. The empirical material chosen consists of a video recording of a specific activity during which the preschool group took part in an Easter play performed in the local church.
The analysis shows that the policy of non-confessionalism, together with the idea that teaching covering religious issues must be purely informational, leads in everyday practice to dilemmas about what is considered religious. At the same time, what the adults actually do with religion fulfills six of the seven dimensions common to religious traditions as outlined by Smart. We can also conclude from the analysis that whether it is religion or a cultural tradition that is taught through the performance the children watched in the church depends on how the concept of religion is defined. The analysis shows that the characters of the performance themselves understood religion as the doctrine of Jesus' resurrection from the dead. This narrow understanding of religion enabled them indirectly to teach about the traditions and narratives surrounding Easter while avoiding teaching religion as a belief system.
Keywords: non-confessional education, preschool, religion, tradition
Procedia PDF Downloads 159
24659 Differentiation between Different Rangeland Sites Using Principal Component Analysis in Semi-Arid Areas of Sudan
Authors: Nancy Ibrahim Abdalla, Abdelaziz Karamalla Gaiballa
Abstract:
Rangelands in semi-arid areas provide a good source of feed for huge numbers of animals and serve environmental, economic, and social functions; therefore, these areas are considered economically very important for the pastoral sector in Sudan. This paper investigates means of differentiating between rangeland sites according to soil type using principal component analysis (PCA) to assist in monitoring and assessment. Three rangeland sites were identified in the study area: a flat sandy site, a sand dune site, and a hard clay site. PCA was used to reduce the number of factors needed to distinguish between rangeland sites and to produce a new data set containing the most useful spectral information for satellite image processing. It was performed on selected types of data: two vegetation indices, topographic data, and vegetation surface reflectance within three bands of MODIS data. The analysis indicated a relatively high correspondence between vegetation and soil in the total variance of the data set. The results showed that PCA with the selected variables revealed clear differences, reflected in the variances and eigenvalues, and can be used for differentiation between range sites.
Keywords: principal component analysis, PCA, rangeland sites, semi-arid areas, soil types
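For two variables, the PCA step above reduces to the eigenvalues of a 2×2 covariance matrix, which have a closed form. A sketch with made-up site measurements (a vegetation index against a reflectance band; the study's actual inputs were MODIS-derived layers, and these numbers are illustrative only):

```python
import math

def pca_2d(xs, ys):
    """Principal components of two variables via the 2x2 covariance matrix,
    using the closed-form eigenvalues of the characteristic equation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Eigenvalues of [[sxx, sxy], [sxy, syy]]: tr/2 +/- sqrt(tr^2/4 - det).
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    root = math.sqrt(tr ** 2 / 4 - det)
    l1, l2 = tr / 2 + root, tr / 2 - root
    return l1, l2, l1 / (l1 + l2)  # eigenvalues and PC1's share of variance

# Hypothetical NDVI and surface reflectance values for six sites.
ndvi = [0.12, 0.15, 0.30, 0.33, 0.50, 0.52]
refl = [0.40, 0.30, 0.35, 0.22, 0.25, 0.15]
l1, l2, explained = pca_2d(ndvi, refl)
print(round(explained, 3))
```

A dominant first eigenvalue, as here, is exactly the situation the abstract describes: most of the spectral variance is carried by one component, which can then be used to separate sites.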
Procedia PDF Downloads 186
24658 Characterizing the Spatially Distributed Differences in the Operational Performance of Solar Power Plants Considering Input Volatility: Evidence from China
Authors: Bai-Chen Xie, Xian-Peng Chen
Abstract:
China has become the world's largest energy producer and consumer, and its development of renewable energy is of great significance to global energy governance and the fight against climate change. The rapid growth of solar power in China could help achieve its ambitious carbon peak and carbon neutrality targets early. However, the non-technical costs of solar power in China are much higher than international levels, meaning that inefficiencies are rooted in poor management and improper policy design and that efficiency distortions have become a serious challenge to the sustainable development of the renewable energy industry. Unlike fossil generation technologies, the output of solar power is closely related to the volatile solar resource, and the spatial unevenness of solar resource distribution leads to potential spatial differences in efficiency. It is necessary to develop an efficiency evaluation method that considers the volatility of solar resources and to explore how natural geography and the social environment shape the spatial distribution of efficiency, in order to uncover the root causes of managerial inefficiencies. The study treats solar resources as stochastic inputs, introduces a chance-constrained data envelopment analysis model combined with the directional distance function, and measures the solar resource utilization efficiency of 222 solar power plants in representative photovoltaic bases in northwestern China. Through meta-frontier analysis, we measured the characteristics of different power plant clusters and compared the differences among groups, discussed the mechanisms by which environmental factors influence inefficiency, and performed statistical tests through the system generalized method of moments.
Rational siting of power plants is a systematic project that requires careful consideration of full utilization of solar resources, low transmission costs, and guaranteed power consumption. Suitable temperature, precipitation, and wind speed can improve the working performance of photovoltaic modules; a reasonable terrain inclination can reduce land costs; and proximity to cities strongly guarantees the consumption of electricity. The density of electricity demand and of high-tech industries matters more than resource abundance because it triggers the clustering of power plants, resulting in good demonstration and competitive effects. To ensure renewable energy consumption, increased support for rural grids and encouragement of direct trading between generators and neighboring users provide solutions. The study offers proposals for improving the full life-cycle operational activities of solar power plants in China to reduce high non-technical costs and improve competitiveness against fossil energy sources.
Keywords: solar power plants, environmental factors, data envelopment analysis, efficiency evaluation
Procedia PDF Downloads 91
24657 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones
Authors: Mohamed Abdelkareem
Abstract:
Remote sensing data contribute to predicting prospective areas for water resources. Microwave and multispectral data were integrated here with climatic, hydrologic, and geological data. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic, and structural features of Wadi Asyuti, a defunct tributary of the Nile basin in the eastern Sahara. Image transformation of the Sentinel-2 and Landsat-8 data allowed the different varieties of rock units to be characterized. Integration of microwave remotely sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones, which play a crucial role in mapping groundwater prospective zones. Fused Landsat-8 OLI and ALOS/PALSAR data enhanced structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps, offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), and revealed infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach.
The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive groups, depending on their probability for groundwater, namely very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
Keywords: GIS, remote sensing, groundwater, Egypt
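The knowledge-driven weighted combination of layers described above can be sketched per map cell as a weighted linear sum binned into the six classes. The weights, layer scores, and class thresholds below are hypothetical; the paper's actual values are not given here:

```python
# Hypothetical layer weights (must sum to 1); per-cell layer scores are
# assumed pre-normalized to the 0-1 range.
WEIGHTS = {
    "geology": 0.15, "slope": 0.10, "topography": 0.10, "drainage_density": 0.15,
    "lineament_density": 0.15, "soil": 0.10, "rainfall": 0.15, "morphometry": 0.10,
}

# Descending score thresholds mapped to the six classes named in the abstract.
CLASSES = [(0.85, "excellent"), (0.7, "very high"), (0.55, "high"),
           (0.4, "moderate"), (0.25, "low"), (0.0, "very low")]

def groundwater_potential(cell):
    """Weighted linear combination of layer scores, mapped to a class."""
    score = sum(WEIGHTS[layer] * cell[layer] for layer in WEIGHTS)
    for threshold, label in CLASSES:
        if score >= threshold:
            return score, label

cell = {"geology": 0.8, "slope": 0.9, "topography": 0.7, "drainage_density": 0.6,
        "lineament_density": 0.9, "soil": 0.5, "rainfall": 0.4, "morphometry": 0.6}
score, label = groundwater_potential(cell)
print(round(score, 3), label)  # → 0.675 high
```

Applied to every cell of the combined raster, this yields the six-class groundwater potential map the abstract describes.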
Procedia PDF Downloads 98
24656 Intelligent Production Machine
Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan
Abstract:
This study aims for production machines to automatically perceive cutting data and adjust cutting parameters. The two most important parameters to be checked in the machine control unit are the feed rate and spindle speed. These parameters are controlled using the sounds of the machine. The features of the optimum sound are introduced to a computer. During the process, real-time data is received and converted by Matlab software into numerical values. According to these values, the feed and speed are decreased or increased at a certain rate until the optimum sound is acquired, and the cutting process is then carried out at the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of material cut, the cutting parameters, and the machine used affect various parameters. Instead of measuring quantities such as temperature, vibration, and tool wear that emerge during the cutting process, detailed analysis of the sound emitted during cutting provides detection of various data involved in the cutting process in a much easier and more economical way. The relation between cutting parameters and sound is being identified.
Keywords: cutting process, sound processing, intelligent lathe, sound analysis
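The adjust-until-optimum loop described above resembles simple proportional feedback. A sketch with a made-up linear plant model standing in for the real machine acoustics; the gain, the dB-per-feed relationship, and the target level are all hypothetical:

```python
def adjust_feed(feed, sound_level, target, gain=0.005, step_limit=0.1):
    """Nudge the feed rate toward the value that produces the target sound
    level: proportional control with a clamped step (illustrative only)."""
    step = gain * (target - sound_level)
    step = max(-step_limit, min(step_limit, step))
    return feed + step

# Toy plant model: assume sound level rises linearly with feed rate.
def sound_of(feed):
    return 60 + 100 * feed  # dB, hypothetical

feed, target = 0.10, 75.0  # mm/rev, dB
for _ in range(50):
    feed = adjust_feed(feed, sound_of(feed), target)
print(round(feed, 3))  # converges to 0.15, the feed giving 75 dB
```

A real controller would extract the sound features from the microphone signal rather than from a formula, but the control structure, comparing the measured level against the optimum and stepping feed and speed accordingly, is the same.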
Procedia PDF Downloads 334
24655 The Modeling of City Bus Fuel Economy during the JE05 Emission Test Cycle
Authors: Miroslaw Wendeker, Piotr Kacejko, Marcin Szlachetka, Mariusz Duk
Abstract:
This paper discusses a model of fuel economy for a city bus driving in a dynamic urban environment. Rapid changes in speed result in constantly changing kinetic energy accumulated in the bus mass and in increased fuel consumption due to kinetic energy that is hardly recuperated. The model is based on bench test results from chassis dynamometer, airport, and city street research. The verified model was applied to simulate the behavior of a bus during the Japanese JE05 Emission Test Cycle. Fuel consumption was calculated for three separate research stages, i.e., urban, downtown, and motorway, and the simulations were performed for several values of vehicle mass and of electrical load applied to on-board devices. The research results show that fuel consumption is impacted by driving dynamics.
Keywords: city bus, heavy duty vehicle, Japanese JE05 test cycle, kinetic energy
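The kinetic-energy bookkeeping behind such a model can be sketched directly: sum the positive kinetic-energy changes over a second-by-second speed profile, treating braking losses as unrecoverable. The speed profile, bus mass, and energy constants below are made up for illustration and are not the paper's data:

```python
def tractive_energy(speeds_kmh, mass_kg, dt=1.0):
    """Sum positive kinetic-energy changes over a 1 Hz speed profile;
    braking (negative changes) is assumed lost, not recuperated."""
    energy = 0.0
    for v0, v1 in zip(speeds_kmh, speeds_kmh[1:]):
        d_e = 0.5 * mass_kg * ((v1 / 3.6) ** 2 - (v0 / 3.6) ** 2)
        if d_e > 0:
            energy += d_e
    return energy  # joules

# Toy urban profile: accelerate, cruise, brake, repeat (km/h each second).
profile = [0, 10, 20, 30, 30, 30, 15, 0, 0, 10, 25, 25, 10, 0]
E = tractive_energy(profile, mass_kg=15000)
fuel_litres = E / (36e6 * 0.35)  # hypothetical 36 MJ/l diesel, 35% efficiency
print(round(E / 1e6, 2), "MJ")  # → 0.88 MJ
```

Rolling and aerodynamic resistance terms would be added in a full model; the point of the sketch is that stop-and-go driving repeatedly pays the acceleration energy that braking then discards, which is exactly the effect the abstract attributes to dynamic urban driving.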
Procedia PDF Downloads 316
24654 Unsteady Numerical Analysis of Sediment Erosion Affected High Head Francis Turbine
Authors: Saroj Gautam, Ram Lama, Hari Prasad Neopane, Sailesh Chitrakar, Biraj Singh Thapa, Baoshan Zhu
Abstract:
Sediment flowing along with the water in the rivers of South Asia erodes turbine components. This erosion is influenced by the nature of the fluid flow around the components of typical turbine types. This paper numerically examines two cases of high-head Francis turbines with the same speed number. The numerical investigation involves both steady-state and transient analysis of the numerical model developed for each case. Furthermore, the influence of leakage flow from the clearance gap of the guide vanes is examined and compared with the no-leakage case. In both cases, the leakage flow adds pressure pulsation to the rotor-stator interaction in the turbine runner. Leakage flow was also found to be a major contributor to sediment erosion in these turbines.
Keywords: sediment erosion, Francis turbine, leakage flow, rotor stator interaction
Procedia PDF Downloads 185
24653 The Effectiveness and Accuracy of the Schulte Holt IOL Toric Calculator Processor in Comparison to Manually Input Data into the Barrett Toric IOL Calculator
Authors: Gabrielle Holt
Abstract:
This paper seeks to establish the efficacy of the Schulte Holt IOL Toric Calculator Processor (Schulte Holt ITCP). The study compared data manually input into the Barrett Toric Calculator, recording the number of minutes taken to complete the toric calculations, the number of errors identified during completion, and distractions during completion, against the number of minutes taken by the Schulte Holt ITCP to complete the same calculations, also using the Barrett method, as well as the number of errors identified in the Schulte Holt ITCP. The data clearly demonstrate a momentous advantage for the Schulte Holt ITCP, which notably reduces the time spent doing toric calculations as well as the number of errors. With the number of cataract surgeries taking place around the world ever growing and waitlists increasing, the Schulte Holt IOL Toric Calculator Processor may well demonstrate a way to increase the availability of ophthalmologists and ophthalmic staff while maintaining patient safety.
Keywords: Toric, toric lenses, ophthalmology, cataract surgery, toric calculations, Barrett
Procedia PDF Downloads 94
24652 Change Point Detection Using Random Matrix Theory with Application to Frailty in Elderly Individuals
Authors: Malika Kharouf, Aly Chkeir, Khac Tuan Huynh
Abstract:
Detecting change points in time series data is a challenging problem, especially in scenarios where there is limited prior knowledge regarding the data's distribution and the nature of the transitions. We present a method designed for detecting changes in the covariance structure of high-dimensional time series data, where the number of variables closely matches the data length. Our objective is to achieve unbiased test statistic estimation under the null hypothesis. We use Random Matrix Theory to analyze the behavior of our test statistic in a high-dimensional context. Specifically, we show that our test statistic converges pointwise to a normal distribution under the null hypothesis. To assess the effectiveness of the proposed approach, we conduct evaluations on a simulated dataset. Furthermore, we employ our method to detect changes associated with frailty in elderly individuals.
Keywords: change point detection, hypothesis tests, random matrix theory, frailty in elderly
Procedia PDF Downloads 53
24651 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use
Authors: Isaura Esther Solano Núñez, David Suarez
Abstract:
The main purpose of this research is to discover what causes death in children of the Wayuu community and to analyze those results in depth in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons producing early death in this specific population, since they are the most vulnerable to high-risk environmental conditions. In this way, the government, through competent authorities, may develop prevention policies and the right measures to avoid an increase in this tragic statistic. The methodology used in this investigation is data mining, which consists of obtaining and examining large amounts of data to produce new and valuable information. Through this technique, it has been possible to determine that the child population is dying mostly from malnutrition. In short, this technique has been very useful for this study; it has allowed us to transform large amounts of information into a conclusive and important statement, which has made it easier to take appropriate steps to resolve a particular situation.
Keywords: malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous
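At its simplest, the step from "large amounts of data" to a "conclusive statement" about the leading cause of death is a frequency analysis over the records. A sketch with a handful of hypothetical records (the study's actual dataset is not reproduced here):

```python
from collections import Counter

# Hypothetical mortality records standing in for the mined dataset.
records = [
    {"age": 2, "cause": "malnutrition"},
    {"age": 1, "cause": "malnutrition"},
    {"age": 4, "cause": "respiratory infection"},
    {"age": 3, "cause": "malnutrition"},
    {"age": 2, "cause": "diarrheal disease"},
]

counts = Counter(r["cause"] for r in records)
leading_cause, n = counts.most_common(1)[0]
print(leading_cause, n)  # → malnutrition 3
```

Real data mining on such records would add cleaning, deduplication, and cross-tabulation against age and location, but the aggregation that surfaces malnutrition as the dominant cause has this shape.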
Procedia PDF Downloads 159
24650 Application of the Mobile Phone for Occupational Self-Inspection Program in Small-Scale Industries
Authors: Jia-Sin Li, Ying-Fang Wang, Cheing-Tong Yan
Abstract:
In this study, an integrated approach of Google Spreadsheet and QR codes, which are free internet resources, was used to improve the inspection procedure. A mobile phone application (app) was also designed to combine with a web page to create an automatic checklist, providing a new integrated inspection management system. Following a client-server model, the client app was developed for the Android mobile OS, and the back end is a web server. The system can set up app accounts, including authorization data, and store checklist documents on the website. The URL of a checklist document is first converted into a QR code, which is then printed and pasted on the machine. The user scans the QR code with the app and fills in the checklist in the factory. Meanwhile, the checklist data is sent to the server, which not only saves the filled data but also executes the related functions and charts. The system also enables auditors and supervisors to facilitate the prevention of and response to hazards, as well as immediate checks of reported data. Finally, statistics and professional analyses are performed using inspection records and other relevant data to improve the reliability and integrity of inspection operations and equipment loss control, and to increase plant safety and personnel performance. It is therefore suggested that the traditional paper-based inspection method could be replaced by the app, which promotes industrial safety and reduces human error.
Keywords: checklist, Google spreadsheet, APP, self-inspection
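The server-side bookkeeping described above, collecting filled checklists and turning them into statistics, can be sketched with JSON records and a per-machine pass-rate summary. The field names and submissions are hypothetical; the actual system stored its data in a Google Spreadsheet:

```python
import json
from collections import defaultdict

# Hypothetical filled-checklist submissions as they might arrive from the app.
submissions = [
    '{"machine": "press-01", "inspector": "A", "items_ok": 9, "items_total": 10}',
    '{"machine": "press-01", "inspector": "B", "items_ok": 10, "items_total": 10}',
    '{"machine": "lathe-03", "inspector": "A", "items_ok": 7, "items_total": 10}',
]

pass_rates = defaultdict(list)
for raw in submissions:
    rec = json.loads(raw)
    pass_rates[rec["machine"]].append(rec["items_ok"] / rec["items_total"])

# Average pass rate per machine: the kind of chart-ready statistic the
# abstract says the server computes from inspection records.
summary = {m: round(sum(r) / len(r), 2) for m, r in pass_rates.items()}
print(summary)  # → {'press-01': 0.95, 'lathe-03': 0.7}
```

A machine whose average pass rate drifts down between inspections is exactly the signal that lets supervisors intervene before a hazard develops.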
Procedia PDF Downloads 118
24649 Evaluation of the Fire Propagation Characteristics of Thermoplastics
Authors: Ji-Hun Choi, Kyoung-Suk Cho, Seung-Un Chae
Abstract:
Consisting of organic compounds, plastic ignites easily and burns fast, and a large amount of toxic gas is produced while it burns. When plastic is heated, its volume decreases because its surface melts. The decomposition of its molecular bonds generates a combustible liquid of low viscosity, which accelerates plastic combustion and spreads the flames. Radiant heat produced in the process propagates the fire and increases the risk of human and property damage. Accordingly, the purpose of this study was to identify the chemical, thermal, and combustion characteristics of thermoplastics using the fire propagation apparatus based on the experimental criteria of ISO 12136 and ASTM E 2058. The experimental results showed that as the ignition time increased, the thermal response parameter (TRP) decreased, and as the TRP increased, the slope decreased. In other words, the larger the TRP, the longer the time taken for heating and ignition of the material, and the fire propagation speed was found to drop accordingly.
Keywords: fire propagation apparatus (FPA), ISO 12136, thermal response parameter (TRP), fire propagation index (FPI)
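The slope relationship mentioned above comes from the usual TRP treatment: 1/sqrt(t_ig) is taken as linear in the external heat flux, and TRP is the inverse of the fitted slope. A sketch with made-up ignition data (the study's measured values are not reproduced here):

```python
import math

def trp_from_ignition_data(heat_fluxes, ignition_times):
    """Estimate the thermal response parameter from ignition tests:
    1/sqrt(t_ig) is assumed linear in external heat flux, and TRP is the
    inverse of the least-squares slope (illustrative treatment)."""
    xs = heat_fluxes                      # kW/m^2
    ys = [1 / math.sqrt(t) for t in ignition_times]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return 1 / slope                      # kW*s^0.5/m^2

# Hypothetical ignition data: higher flux, faster ignition.
fluxes = [30, 40, 50, 60]      # external heat flux, kW/m^2
times = [400, 100, 44.4, 25]   # time to ignition, s
print(round(trp_from_ignition_data(fluxes, times), 1))  # → 200.0
```

A material with a larger TRP has a shallower 1/sqrt(t_ig) slope, i.e., it takes longer to heat to ignition at a given flux, which is the relationship the abstract reports.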
Procedia PDF Downloads 202
24648 High Performance Direct Torque Control for Induction Motor Drive Fed from Photovoltaic System
Authors: E. E. EL-Kholy, Ahamed Kalas, Mahmoud Fauzy, M. El-Shahat Dessouki, Abdou M. El-refay, Mohammed El-Zefery
Abstract:
Direct Torque Control (DTC) is an AC drive control method especially designed to provide fast and robust responses. This paper presents a progressive algorithm for direct torque control of a three-phase induction drive system supplied by photovoltaic arrays, using a voltage source inverter to control motor torque and flux with maximum power point tracking at different levels of insolation. Experimental results of the new DTC method, obtained on a rapid prototyping system for drives, are presented. Simulation and experimental results confirm that the proposed system gives quick, robust torque and speed responses at constant switching frequencies.
Keywords: photovoltaic (PV) array, direct torque control (DTC), constant switching frequency, induction motor, maximum power point tracking (MPPT)
Procedia PDF Downloads 48224647 Industry 4.0 and Supply Chain Integration: Case of Tunisian Industrial Companies
Authors: Rym Ghariani, Ghada Soltane, Younes Boujelbene
Abstract:
Industry 4.0, a set of emerging smart and digital technologies, has been the main focus of operations management researchers and practitioners in recent years. The objective of this research paper is to study the impact of Industry 4.0 on supply chain integration (SCI) in Tunisian industrial companies. A conceptual model to study the relationship between Industry 4.0 technologies and supply chain integration was designed. This model contains three explanatory variables (Big Data, Internet of Things, and Robotics) and one explained variable (supply chain integration). In order to answer our research questions and investigate the research hypotheses, principal component analysis and discriminant analysis were used in SPSS 26 software. The results reveal that there is a statistically significant positive impact of Industry 4.0 (Big Data, Internet of Things and Robotics) on the integration of the supply chain. Interestingly, big data has a greater positive impact on supply chain integration than the Internet of Things and robotics.Keywords: industry 4.0 (I4.0), big data, internet of things, robotics, supply chain integration
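The principal component analysis step that the study performed in SPSS 26 can be sketched in code. This is an illustrative reproduction with NumPy on made-up survey-style data (rows = companies, columns = questionnaire items), not the study's dataset:

```python
import numpy as np

# Hedged sketch of a PCA step: standardise the items, take the
# correlation (covariance of standardised data) matrix, and report the
# share of variance each principal component explains.

def pca_explained_variance(X):
    """Return the explained-variance ratio of each principal component."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardise items
    cov = np.cov(Z, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # descending
    return eigvals / eigvals.sum()

# Illustrative data: 40 "companies", 4 Likert-style items driven by one
# shared latent factor plus noise, so one component should dominate.
rng = np.random.default_rng(0)
latent = rng.normal(size=(40, 1))
X = latent @ np.ones((1, 4)) + 0.3 * rng.normal(size=(40, 4))
ratios = pca_explained_variance(X)
print(ratios[0] > 0.8)  # first component dominates -> True
```
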
Procedia PDF Downloads 5924646 Specific Earthquake Ground Motion Levels That Would Affect Medium-To-High Rise Buildings
Authors: Rhommel Grutas, Ishmael Narag, Harley Lacbawan
Abstract:
Construction of high-rise buildings is a means to address the increasing population in Metro Manila, Philippines. The existence of the Valley Fault System within the metropolis and other nearby active faults poses threats to a densely populated city. Distant, shallow, large-magnitude earthquakes have the potential to generate slow, long-period vibrations that would affect medium-to-high rise buildings. Heavy damage and building collapse are consequences of prolonged shaking of the structure. If the ground and the building have almost the same period, a resonance effect would cause prolonged shaking of the building. Microzoning the long-period ground response would aid in the seismic design of medium to high-rise structures. The shear-wave velocity structure of the subsurface is an important parameter for evaluating ground response. Borehole drilling is one of the conventional methods of determining shear-wave velocity structure; however, it is an expensive approach. As an alternative geophysical exploration method, microtremor array measurements can be used to infer the structure of the subsurface. A microtremor array measurement system was used to survey fifty sites around Metro Manila, including some municipalities of Rizal and Cavite. Measurements were carried out during the day under good weather conditions. The team was composed of six persons for the deployment and simultaneous recording of the microtremor array sensors. The instruments were laid on the ground away from sewage systems and levelled using the adjustment legs and bubble level. A total of four sensors were deployed for each site, three at the vertices of an equilateral triangle and one at the centre. The circular arrays were set up with a maximum side length of approximately four kilometers; the shortest side length, for the smallest array, was approximately 700 meters. Each recording lasted twenty to sixty minutes. 
From the recorded data, f-k analysis was applied to obtain phase velocity curves, and an inversion technique was applied to construct the shear-wave velocity structure. This project provided a microzonation map of the metropolis and a profile showing the long-period response of the deep sedimentary basin underlying Metro Manila, which would be useful to local administrators in their land use planning and in the earthquake-resistant design of medium to high-rise buildings.Keywords: earthquake, ground motion, microtremor, seismic microzonation
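The principle behind extracting phase velocities from an array can be sketched with a toy two-sensor case: for a plane wave crossing sensors a known distance apart, the phase shift at a given frequency gives a time delay, and the delay gives the apparent phase velocity. This is only an illustration of the idea, not the survey's full f-k processing chain:

```python
import math

# Hedged sketch: apparent phase velocity of a plane wave from the
# inter-sensor phase shift at a single frequency. Real f-k analysis
# stacks many sensors and frequencies; this shows the core conversion.

def delay_from_phase(phase_shift_rad, freq_hz):
    """Convert a measured phase shift at one frequency to a time delay."""
    return phase_shift_rad / (2 * math.pi * freq_hz)

def phase_velocity(distance_m, delay_s):
    """Apparent velocity of a plane wave from spacing and delay."""
    return distance_m / delay_s

# A 1 Hz microtremor arriving with a pi/2 phase lag over 700 m spacing
# (700 m being the smallest array side length used in the survey):
tau = delay_from_phase(math.pi / 2, 1.0)   # 0.25 s
print(phase_velocity(700.0, tau))          # -> 2800.0 (m/s)
```
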
Procedia PDF Downloads 46824645 Analysing Competitive Advantage of IoT and Data Analytics in Smart City Context
Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue
Abstract:
The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people’s behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives as part of the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked, IoT solutions can only create value if the data generated by the IoT devices is analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today’s marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise them, and which data to use to differentiate themselves from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company’s competitive advantage through smart city solutions. 
The results of the research provide insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges the factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of it, can create competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.Keywords: data analytics, smart cities, competitive advantage, internet of things
Procedia PDF Downloads 13324644 Best Season for Seismic Survey in Zaria Area, Nigeria: Data Quality and Implications
Authors: Ibe O. Stephen, Egwuonwu N. Gabriel
Abstract:
Variations in seismic P-wave velocity and depth resolution resulting from variations in subsurface water saturation were investigated in this study in order to determine the season of the year that gives the most reliable P-wave velocity and depth resolution of the subsurface in Zaria Area, Nigeria. A 2D seismic refraction tomography technique involving an ABEM Terraloc MK6 seismograph was used to collect data across a borehole with a standard log, with the centre of the spread situated at the borehole site. Using the same parameters, this procedure was repeated along the same spread at least once a month for at least eight months a year for four years. The timing of each survey depended on when there was significant variation in the rainfall data. The seismic data collected were tomographically inverted. The results suggested that the average P-wave velocity ranges of the subsurface in the area are generally higher when the ground is wet than when it is dry. The results also suggested that an overburden about 9.0 m thick, a weathered basement about 14.0 m thick and a fractured basement at a depth of about 23.0 m best fitted the borehole log. This best fit was consistently obtained in the months between March and May, when the average total rainfall in the area was about 44.8 mm. The results also showed that the velocity ranges in both dry and wet formations fall within the standard ranges provided in the literature. In terms of velocity, this study has not clearly distinguished the quality of the seismic data obtained when the subsurface was dry from that of the data collected when it was wet. It was concluded that for more detailed and reliable seismic studies in Zaria Area and its environs with similar climatic conditions, surveys are best conducted between March and May. 
The most reliable seismic data for depth resolution are most likely obtainable in the area between March and May.Keywords: best season, variations in depth resolution, variations in P-wave velocity, variations in subsurface water saturation, Zaria area
Procedia PDF Downloads 28924643 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices
Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues
Abstract:
This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. This algorithm is applied to high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients to a single value; this is accomplished based on three different keys. The decoding/decompression uses a search method called the QSS (Quick Sequential Search) Decoding Algorithm, presented in this research and based on sequential search, to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients; this is because another algorithm, such as a conventional sequential search, could retrieve encoded/compressed data independently of the proposed algorithm. The experimental results showed that our proposed decoding algorithm retrieves original data faster than conventional sequential search algorithms.Keywords: matrix minimization algorithm, decoding sequential search algorithm, image compression, DCT, DWT
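The encode/decode idea summarised above can be sketched as follows: three coefficients are folded into one value using three keys, and a sequential search over candidate triples recovers them exactly. The keys and coefficient range below are illustrative assumptions, not the paper's actual parameters:

```python
import itertools

# Hedged sketch of Matrix-Minimization-style encoding with a
# sequential-search decoder in the spirit of the QSS idea. The keys
# are chosen (as successive powers of 37) so that each triple of
# coefficients in the assumed range maps to a unique encoded value.

K1, K2, K3 = 1.0, 37.0, 1369.0      # illustrative keys
COEFF_RANGE = range(-18, 19)        # assumed quantised coefficient range

def encode(a, b, c):
    """Fold three coefficients into a single value with the three keys."""
    return K1 * a + K2 * b + K3 * c

def decode(value):
    """Sequentially search coefficient triples for an exact match."""
    for a, b, c in itertools.product(COEFF_RANGE, repeat=3):
        if encode(a, b, c) == value:
            return (a, b, c)
    raise ValueError("no matching triple")

encoded = encode(5, -3, 7)
print(decode(encoded))  # -> (5, -3, 7)
```

Because the coefficient magnitudes stay below half of the second key, the mapping is invertible; a real implementation would prune the search with an auxiliary array of precomputed candidates, as the abstract describes.
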
Procedia PDF Downloads 15024642 Challenge of Baseline Hydrology Estimation at Large-Scale Watersheds
Authors: Can Liu, Graham Markowitz, John Balay, Ben Pratt
Abstract:
Baseline or natural hydrology is commonly employed for hydrologic modeling and for quantifying hydrologic alteration due to manmade activities. It can inform planning and policy-related efforts by various state and federal water resource agencies to restore natural streamflow regimes. A common challenge faced by hydrologists is how to replicate unaltered streamflow conditions, particularly in large watershed settings prone to development and regulation. Three different methods were employed to estimate baseline streamflow conditions for 6 major subbasins of the Susquehanna River Basin: 1) incorporating consumptive water use and reservoir operations back into regulated gaged records; 2) using a map correlation method and flow duration (exceedance probability) regression equations; 3) extending the pre-regulation streamflow records based on the relationship between concurrent streamflows at unregulated and regulated gage locations. Parallel analyses were performed among the three methods, and the limitations associated with each are presented. Results from these analyses indicate that generating baseline streamflow records for large-scale watersheds remains challenging, even with long-term continuous stream gage records available.Keywords: baseline hydrology, streamflow gage, subbasin, regression
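The flow-duration idea behind method 2 can be sketched in code: flows at a reference gage are ranked to assign exceedance probabilities, and baseline flows at the target site are then estimated by a transfer relation. The Weibull plotting position is a standard choice, but the area-ratio transfer below is a stand-in for the study's actual regression equations:

```python
# Hedged sketch of the flow-duration (exceedance-probability) step:
# rank observed flows and assign each an exceedance probability, then
# transfer a flow to an ungaged/regulated site. The area-ratio scaling
# is an illustrative placeholder, not the study's fitted regressions.

def exceedance_probabilities(flows):
    """Weibull plotting positions: P = rank / (n + 1), rank 1 = largest.
    Returned in the same order as the input flows."""
    n = len(flows)
    order = sorted(range(n), key=lambda i: flows[i], reverse=True)
    probs = [0.0] * n
    for rank, i in enumerate(order, start=1):
        probs[i] = rank / (n + 1)
    return probs

def transfer_flow(ref_flow, ref_area_km2, target_area_km2):
    """Toy drainage-area-ratio transfer standing in for regression."""
    return ref_flow * target_area_km2 / ref_area_km2

flows = [120.0, 80.0, 200.0, 60.0]          # illustrative daily flows
print(exceedance_probabilities(flows))      # -> [0.4, 0.6, 0.2, 0.8]
print(transfer_flow(100.0, 250.0, 500.0))   # -> 200.0
```
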
Procedia PDF Downloads 32424641 Structuring and Visualizing Healthcare Claims Data Using Systems Architecture Methodology
Authors: Inas S. Khayal, Weiping Zhou, Jonathan Skinner
Abstract:
Healthcare delivery systems around the world are in crisis. The need to improve health outcomes while decreasing healthcare costs has led to an imminent call to action to transform the healthcare delivery system. While Bioinformatics and Biomedical Engineering have primarily focused on biological-level data and biomedical technology, there is clear evidence of the importance of the delivery of care on patient outcomes. Classic singular decomposition approaches from reductionist science are not capable of explaining complex systems. Approaches and methods from systems science and systems engineering are utilized to structure healthcare delivery system data. Specifically, systems architecture is used to develop a multi-scale and multi-dimensional characterization of the healthcare delivery system, defined here as the Healthcare Delivery System Knowledge Base. This paper is the first to contribute a new method of structuring and visualizing a multi-dimensional and multi-scale healthcare delivery system using systems architecture in order to better understand healthcare delivery.Keywords: health informatics, systems thinking, systems architecture, healthcare delivery system, data analytics
Procedia PDF Downloads 34824640 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering
Authors: Emiel Caron
Abstract:
Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g. to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e. they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e. the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large gold-standard set of highly cited papers, shows on average 99% precision and 95% recall. The method is therefore accurate but conservative, i.e. it favours precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g. 
in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics
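The scoring-then-single-linkage pipeline described above can be sketched as follows. The rules, weights, and threshold here are illustrative stand-ins for the paper's expert-tuned rules; the connected components are formed with a simple union-find:

```python
from difflib import SequenceMatcher

# Hedged sketch of rule-based scoring plus single-linkage clustering:
# candidate record pairs are scored by reinforcing metadata rules and a
# string-similarity measure; pairs above a threshold are merged into
# connected components via union-find.

def score(a, b):
    """Combine reinforcing rules into one score (toy weights)."""
    s = 0.0
    if a.get("year") and a.get("year") == b.get("year"):
        s += 0.3                                   # year rule
    if a.get("journal") and a.get("journal") == b.get("journal"):
        s += 0.3                                   # journal rule
    s += 0.4 * SequenceMatcher(None, a["title"], b["title"]).ratio()
    return s

def cluster(records, threshold=0.75):
    """Single-linkage: union all pairs scoring above the threshold."""
    parent = list(range(len(records)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]          # path compression
            i = parent[i]
        return i

    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if score(records[i], records[j]) >= threshold:
                parent[find(i)] = find(j)
    return [find(i) for i in range(len(records))]  # cluster labels

refs = [
    {"title": "deep learning", "year": 2015, "journal": "Nature"},
    {"title": "deep learnin",  "year": 2015, "journal": "Nature"},
    {"title": "quantum computing", "year": 2010, "journal": "Science"},
]
labels = cluster(refs)
print(labels[0] == labels[1], labels[0] == labels[2])  # -> True False
```

Note how a missing year or journal simply contributes no score, mirroring the abstract's observation that sparse metadata leaves references in separate high-precision clusters.
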
Procedia PDF Downloads 19424639 A Low-Power, Low-Noise and High-Gain 58~66 GHz CMOS Receiver Front-End for Short-Range High-Speed Wireless Communications
Authors: Yo-Sheng Lin, Jen-How Lee, Chien-Chin Wang
Abstract:
A 60-GHz receiver front-end using standard 90-nm CMOS technology is reported. The receiver front-end comprises a wideband low-noise amplifier (LNA) and a double-balanced Gilbert cell mixer with a current-reused RF single-to-differential (STD) converter, an LO Marchand balun and a baseband amplifier. The receiver front-end consumes 34.4 mW and achieves LO-RF isolation of 60.7 dB, LO-IF isolation of 45.3 dB and RF-IF isolation of 41.9 dB at an RF of 60 GHz and an LO of 59.9 GHz. At an IF of 0.1 GHz, the receiver front-end achieves a maximum conversion gain (CG) of 26.1 dB at an RF of 64 GHz and a CG of 25.2 dB at an RF of 60 GHz. The corresponding 3-dB RF bandwidth is 7.3 GHz (58.4 GHz to 65.7 GHz). The measured minimum noise figure was 5.6 dB at 64 GHz, one of the best results ever reported for a 60 GHz CMOS receiver front-end. In addition, the measured input 1-dB compression point and input third-order intercept point are -33.1 dBm and -23.3 dBm, respectively, at 60 GHz. These results demonstrate that the proposed receiver front-end architecture is very promising for 60 GHz direct-conversion transceiver applications.Keywords: CMOS, 60 GHz, direct-conversion transceiver, LNA, down-conversion mixer, marchand balun, current-reused
Procedia PDF Downloads 45224638 Eradication of Gram-Positive Bacteria by Photosensitizers Immobilized in Polymers
Authors: Marina Nisnevitch, Anton Valkov, Faina Nakonechny, Kate Adar Raik, Yamit Mualem
Abstract:
Photosensitizers are dye compounds belonging to various chemical groups that in all cases possess an extended system of conjugated double bonds. Under illumination with visible light, the photosensitizers are excited and transfer the absorbed energy to oxygen dissolved in the aqueous phase, leading to the production of reactive oxygen species, which cause irreversible damage to bacterial cells. When immobilized onto a solid phase, photosensitizers preserve their antibacterial properties. In the present study, photosensitizers were immobilized in polyethylene or polypropylene and tested for antimicrobial activity against Gram-positive S. aureus, S. epidermidis and Streptococcus sp. For this purpose, the water-soluble photosensitizers Rose Bengal sodium salt and methylene blue, as well as the water-insoluble hematoporphyrin and Rose Bengal lactone, were immobilized by dissolution in melted polymers to yield 3 mm diameter rods and 3-5 mm beads. All four photosensitizers were found to be effective in the eradication of Gram-positive bacteria under illumination by a white luminescent lamp or sunlight. The immobilized photosensitizers can be applied for continuous water disinfection; they can be easily removed at the end of the treatment and reused.Keywords: antimicrobial polymers, gram-positive bacteria, immobilization of photosensitizers, photodynamic antibacterial activity
Procedia PDF Downloads 24224637 Templating Copper on Polymer/DNA Hybrid Nanowires
Authors: Mahdi Almaky, Reda Hassanin, Benjamin Horrocks, Andrew Houlton
Abstract:
DNA-templated poly(N-substituted pyrrole)bipyridinium nanowires were synthesised at room temperature using the chemical oxidation method. The resulting CPs/DNA hybrids were characterised using electronic and vibrational spectroscopic methods, especially ultraviolet-visible (UV-Vis) and FTIR spectroscopy. The nanowire morphology was characterised using atomic force microscopy (AFM). The electrical properties of the prepared nanowires were characterised using electrostatic force microscopy (EFM) and measured using conductive AFM (c-AFM) and a two-terminal I/V technique, where the temperature dependence of the conductivity was probed. The conductivities of the prepared CPs/DNA nanowires are generally lower than those of PPy/DNA nanowires, showing the large effect of N-alkylation in decreasing the conductivity of the polymer, but they are higher than the conductivity of the corresponding bulk films. This enhancement in conductivity could be attributed to the ordering of the polymer chains on DNA during the templating process. The prepared CPs/DNA nanowires were used as templates for the growth of copper nanowires at room temperature, using an aqueous solution of Cu(NO3)2 as a source of Cu2+ and ascorbic acid as the reducing agent. AFM images showed that these nanowires were uniform and continuous compared to copper nanowires prepared by templating directly onto DNA. Electrical characterization of the nanowires by c-AFM revealed a slight improvement in the conductivity of these nanowires (Cu-CPs/DNA) compared to the CPs/DNA nanowires before metallisation.Keywords: templating, copper nanowires, polymer/DNA hybrid, chemical oxidation method
Procedia PDF Downloads 363