Search results for: graph representation of circuit networks
967 Effective Supply Chain Coordination with Hybrid Demand Forecasting Techniques
Authors: Gurmail Singh
Abstract:
An effective supply chain is a main priority of every organization and is the outcome of strategic corporate investment combined with deliberate management action. A value-driven supply chain is defined through development, procurement, and the configuration of appropriate resources, metrics, and processes. The responsiveness of the supply chain, however, can be improved through proper coordination. To this end, the Bullwhip effect (BWE) and Net stock amplification (NSAmp) values were predicted and used for inventory control in organizations by both a discrete wavelet transform-Artificial neural network (DWT-ANN) and an Adaptive network-based fuzzy inference system (ANFIS). This work presents a comparative forecasting methodology for customer demand, which is nonlinear in nature, in a multilevel supply chain structure, using hybrid techniques that combine artificial intelligence methods, namely Artificial neural networks (ANN) and the Adaptive network-based fuzzy inference system (ANFIS), with Discrete wavelet theory (DWT). The effectiveness of these forecasting models is demonstrated by computing the Bullwhip effect and Net stock amplification on data from real-world problems. The results showed that both parameters were comparatively lower for the DWT-ANN model and the ANFIS.
Keywords: bullwhip effect, hybrid techniques, net stock amplification, supply chain flexibility
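For illustration, the two coordination metrics named above can be computed directly from demand, order, and net-stock series. The sketch below is a minimal example; the variable names and synthetic data are assumptions, not from the paper.

```python
import numpy as np

def bullwhip_effect(orders, demand):
    """BWE: ratio of order variance to demand variance (>1 means amplification)."""
    return np.var(orders, ddof=1) / np.var(demand, ddof=1)

def net_stock_amplification(net_stock, demand):
    """NSAmp: ratio of net-stock variance to demand variance."""
    return np.var(net_stock, ddof=1) / np.var(demand, ddof=1)

# Synthetic example: a retailer that over-reacts to demand changes.
rng = np.random.default_rng(0)
demand = 100 + 10 * rng.standard_normal(200)
orders = demand + 1.5 * np.diff(demand, prepend=demand[0])  # over-ordering on trends
net_stock = np.cumsum(orders - demand)

print(f"BWE   = {bullwhip_effect(orders, demand):.2f}")
print(f"NSAmp = {net_stock_amplification(net_stock, demand):.2f}")
```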
Procedia PDF Downloads 127

966 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation
Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk
Abstract:
The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition, and security are among the possible fields of application. In all these fields, the amount of collected data is increasing quickly, and with this increase, computation speed becomes the critical factor. Data reduction is one solution to this problem. Removing redundancy in rough sets can be achieved with the reduct. Many algorithms for generating the reduct have been developed, but most of them are software-only implementations and therefore have many limitations. A microprocessor uses a fixed word length and consumes a lot of time for both fetching and processing instructions and data; consequently, software-based implementations are relatively slow. Hardware systems do not have these limitations and can process data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects. For a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of all condition attributes, and every reduct contains all the attributes of the core. In this paper, a hardware implementation of a two-stage greedy algorithm for finding one reduct is presented. The decision table is used as input. The output of the algorithm is a superreduct, i.e., a reduct with some additional removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., attributes that are most frequent in the decision table. The algorithm described above has two disadvantages: (i) it generates a superreduct instead of a reduct, and (ii) the first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not a key problem, and the core calculation can be achieved with a combinational logic block, adding relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. The core is calculated by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. The number of occurrences of each attribute is computed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and software were compared. The results show an increase in the speed of data processing.
Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set
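As a software reference for the two stages described (the paper's contribution is the FPGA realization, not this code), a minimal sketch of core extraction via singleton discernibility entries followed by greedy enrichment might look as follows; the toy decision table is an assumption for illustration.

```python
from itertools import combinations

def discernibility(objects, decision):
    """Discernibility entries: for each pair of objects with different
    decisions, the set of condition attributes on which the pair differs."""
    entries = []
    for i, j in combinations(range(len(objects)), 2):
        if decision[i] != decision[j]:
            diff = {a for a, (x, y) in enumerate(zip(objects[i], objects[j])) if x != y}
            if diff:
                entries.append(diff)
    return entries

def two_stage_superreduct(objects, decision):
    entries = discernibility(objects, decision)
    # Stage 1: the core = attributes appearing alone in an entry
    # (what the hardware 'singleton detector' identifies).
    core = {next(iter(e)) for e in entries if len(e) == 1}
    # Stage 2: greedily add the most frequent attribute until
    # every discernibility entry is covered.
    chosen = set(core)
    uncovered = [e for e in entries if not (e & chosen)]
    n_attrs = len(objects[0])
    while uncovered:
        counts = [sum(a in e for e in uncovered) for a in range(n_attrs)]
        best = max(range(n_attrs), key=lambda a: counts[a])
        chosen.add(best)
        uncovered = [e for e in uncovered if best not in e]
    return core, chosen

# Toy decision table: rows are objects, columns are condition attributes.
objects = [(1, 0, 1), (1, 1, 0), (0, 1, 1), (0, 0, 0)]
decision = [1, 0, 1, 0]
print(two_stage_superreduct(objects, decision))
```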
Procedia PDF Downloads 219

965 Flood Mapping Using Height above the Nearest Drainage Model: A Case Study in Fredericton, NB, Canada
Authors: Morteza Esfandiari, Shabnam Jabari, Heather MacGrath, David Coleman
Abstract:
Flooding is a severe issue in many places in the world, including the city of Fredericton, New Brunswick, Canada. The downtown area of Fredericton is close to the Saint John River, which is susceptible to flooding around May every year. Recently, the frequency of flooding seems to have increased, especially after the downtown area and surrounding urban/agricultural lands were flooded in two consecutive years, 2018 and 2019. In order to have an explicit vision of flood extent and of damage to affected areas, it is necessary to use either flood inundation modelling or satellite data. Due to the contingent availability and weather dependency of optical satellites, and the limited existing data and high cost of hydrodynamic models, it is not always feasible to rely on these sources of data to generate quality flood maps during or after a catastrophe. Height Above the Nearest Drainage (HAND), a state-of-the-art topo-hydrological index, normalizes the height of a basin relative to the elevation along the stream network and thereby specifies the gravitational, or relative, drainage potential of an area. HAND is the relative height difference between the stream network and each cell of a Digital Terrain Model (DTM). The stream layer is produced through a multi-step, time-consuming process that does not always result in an optimal representation of the river centerline, depending on the topographic complexity of the region. HAND has been used in numerous case studies with acceptable, and sometimes unexpected, results caused by natural and human-made features on the surface of the earth. Some of these features might disturb the generated model, so that it cannot predict the flow simulation accurately. We propose to include a pre-existing stream layer generated by the province of New Brunswick and to benefit from culvert maps to improve the water flow simulation and, accordingly, the accuracy of the HAND model. By considering these parameters in our processing, we were able to increase the accuracy of the model from nearly 74% to almost 92%. The improved model can be used for generating highly accurate flood maps, which are necessary for future urban planning and flood damage estimation, without any need for satellite imagery or hydrodynamic computations.
Keywords: HAND, DTM, rapid floodplain, simplified conceptual models
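As a rough illustration of the HAND index itself (not of the authors' full workflow), the sketch below computes a simplified HAND raster in which each DTM cell is referenced to its nearest stream cell by Euclidean distance. Operational HAND implementations trace hydrological flow paths instead, so this is an assumption-laden simplification.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def simplified_hand(dtm, stream_mask):
    """HAND ~ cell elevation minus elevation of its nearest stream cell.
    stream_mask: boolean raster, True where the stream network lies.
    NOTE: true HAND follows flow paths; Euclidean nearest neighbour
    is used here only to keep the sketch short."""
    # indices of the nearest stream cell for every cell
    _, (rows, cols) = distance_transform_edt(~stream_mask, return_indices=True)
    nearest_stream_elev = dtm[rows, cols]
    return dtm - nearest_stream_elev

# Tiny synthetic valley: elevation rises away from a stream in column 2.
dtm = np.array([[5., 3., 1., 3., 5.],
                [6., 4., 1., 4., 6.],
                [7., 5., 2., 5., 7.]])
stream = np.zeros_like(dtm, dtype=bool)
stream[:, 2] = True
hand = simplified_hand(dtm, stream)
print(hand)  # cells with low HAND flood first
```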
Procedia PDF Downloads 151

964 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks
Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone
Abstract:
Seizure is the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on the data of a single patient retrieved from a publicly available EEG dataset.
Keywords: artificial neural network, data mining, electroencephalogram, epilepsy, feature extraction, seizure detection, signal processing
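The sliding-window feature extraction plus multilayer perceptron pipeline described can be sketched as follows. The window features, synthetic "EEG" signals, and network size here are assumptions for illustration, not the Training Builder feature set.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def window_features(signal, fs, win_sec=2.0):
    """Slide a fixed window over one EEG channel and extract simple features
    (mean, std, line length, high-frequency power ratio) per window."""
    n = int(win_sec * fs)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        spec = np.abs(np.fft.rfft(w)) ** 2
        high = spec[len(spec) // 4:].sum() / (spec.sum() + 1e-12)
        feats.append([w.mean(), w.std(), np.abs(np.diff(w)).sum(), high])
    return np.array(feats)

# Synthetic stand-in for EEG: background noise vs. high-amplitude oscillation.
rng = np.random.default_rng(1)
fs = 256
normal = rng.standard_normal(fs * 60)
seizure = 3 * np.sin(2 * np.pi * 7 * np.arange(fs * 60) / fs) + rng.standard_normal(fs * 60)
X = np.vstack([window_features(normal, fs), window_features(seizure, fs)])
y = np.array([0] * (len(X) // 2) + [1] * (len(X) // 2))

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(Xtr, ytr)
print("test accuracy:", clf.score(Xte, yte))
```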
Procedia PDF Downloads 188

963 Architectural Identity in Manifestation of Tall-buildings' Design
Authors: Huda Arshadlamphon
Abstract:
The frontiers of technology and industry are advancing rapidly, influenced by economic and political phenomena. One vital phenomenon, which has consolidated the world into a single village, is globalization. In response, architecture and the built environment have undergone numerous changes, adjustments, and developments. Tall buildings, as a product of globalization, represent prestigious icons, symbols, and landmarks for economically advanced countries. Despite this, the trend has encountered several design challenges in incorporating the architectural identity, traditions, and characteristics that enhance the built environment's sociocultural values and traditions. Such values and traditions are necessary to form self-solitarily, leading to visual and spatial creativity, independence, and individuality; in other words, they maintain the inherited identity and avoid replication in all means and aspects. This paper, firstly, defines the phenomenon of globalization, architectural identity, and the concerns of sociocultural values in relation to the traditional characteristics of the built environment. Secondly, design strategies and methodologies for acclimating architectural identity and characteristics in tall buildings are discussed through three case studies of tall buildings located in Jeddah city, Saudi Arabia: the Queen's Building, the National Commercial Bank Building (NCB), and the Islamic Development Bank Building. The case studies highlight the buildings' sites and surroundings, concepts and inspirations, design elements, architectural forms and compositions, characteristics, issues, and the barriers and trammels facing design decisions, the representation of facades, and the selection of materials and colors. Furthermore, the research briefly elucidates the dominant factors that shape the architectural identity of Jeddah city. In conclusion, the study presents a guideline of four design standards for preserving and developing architectural identity in tall buildings in Jeddah city: the scale of the urban and natural environment, the scale of architectural design elements, the integration of visual images, and the creation of spatial scenes and scenarios. The proposed guideline is intended to encourage the development of an architectural identity aligned with zeitgeist demands and requirements, to support the contemporary architectural movement toward tall buildings, and to shore up self-solitarily in representing the sociocultural values and traditions of the built environment.
Keywords: architectural identity, built-environment, globalization, sociocultural values and traditions, tall-buildings
Procedia PDF Downloads 163

962 Intelligent Agent-Based Model for the 5G mmWave O2I Technology Adoption
Authors: Robert Joseph M. Licup
Abstract:
The deployment of the fifth-generation (5G) mobile system over mmWave frequencies is the new solution to the requirement of providing higher bandwidth readily available to all users. Usage patterns have moved towards work-from-home or online-class set-ups because of the pandemic, and previous mobile technologies can no longer meet the high speed and bandwidth requirements needed, given the drastic shift of transactions to the home. The underutilized millimeter-wave (mmWave) frequencies are exploited by fifth-generation (5G) cellular networks to support multi-gigabit-per-second (Gbps) transmission. However, because of the short wavelengths involved, high path loss, directivity, blockage sensitivity, and narrow beamwidth are some of the technical challenges that need to be addressed. Different tools, technologies, and scenarios are explored to effectively support network design, accurate channel modeling, implementation, and deployment. There remains, however, a big challenge in how consumers will adopt this solution and maximize the benefits offered by 5G technology. This research proposes to study the intricacies of technology diffusion, individual attitudes, and behaviors, and how technology adoption will be attained. An agent-based simulation model shaped by actual applications, the technology solution, and the related literature was used to arrive at a computational model. The research examines the different attributes, factors, and intricacies that can affect each identified agent's move towards technology adoption.
Keywords: agent-based model, AnyLogic, 5G O2I, 5G mmWave solutions, technology adoption
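The paper's model is built in AnyLogic. Purely as an illustration of the agent-based adoption mechanism described, the sketch below implements a minimal Bass-style diffusion loop in Python; the agent attributes, rates, and thresholds are invented for the example.

```python
import random

class Consumer:
    def __init__(self, innovativeness):
        self.innovativeness = innovativeness  # individual attitude toward new tech
        self.adopted = False

def step(agents, p=0.01, q=0.35):
    """One tick: adoption pressure from advertising (p) plus word of mouth (q),
    scaled by each agent's innovativeness."""
    share = sum(a.adopted for a in agents) / len(agents)
    for a in agents:
        if not a.adopted:
            prob = (p + q * share) * a.innovativeness
            if random.random() < prob:
                a.adopted = True

random.seed(0)
agents = [Consumer(random.uniform(0.5, 1.5)) for _ in range(1000)]
for t in range(40):
    step(agents)
    if t % 10 == 9:
        print(f"t={t+1:2d}  adopters={sum(a.adopted for a in agents)}")
```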
Procedia PDF Downloads 109

961 Using Crowd-Sourced Data to Assess Safety in Developing Countries: The Case Study of Eastern Cairo, Egypt
Authors: Mahmoud Ahmed Farrag, Ali Zain Elabdeen Heikal, Mohamed Shawky Ahmed, Ahmed Osama Amer
Abstract:
Crowd-sourced data refers to data that is collected and shared by a large number of individuals or organizations, often through the use of digital technologies such as mobile devices and social media. The shortage of crash data collection in developing countries makes it difficult to fully understand and address road safety issues in these regions. In developing countries, crowd-sourced data can be a valuable tool for improving road safety, particularly in urban areas, where the majority of road crashes occur. This study is, to the best of our knowledge, the first to develop safety performance functions using crowd-sourced data, adopting a negative binomial structure model and a Full Bayes model to investigate traffic safety for urban road networks and provide insights into the impact of roadway characteristics. Furthermore, as part of the safety management process, network screening was carried out by applying two different methods of ranking the most hazardous road segments: the PCR method (adopted in the Highway Capacity Manual, HCM) and a graphical method using GIS tools, in order to compare and validate the results. Lastly, recommendations are suggested for policymakers to ensure safer roads.
Keywords: crowdsourced data, road crashes, safety performance functions, Full Bayes models, network screening
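A safety performance function of the kind described is typically a negative binomial regression of crash counts on segment characteristics. The sketch below shows the general form with statsmodels on synthetic data; the covariates and coefficients are assumptions for illustration, not the study's fitted model.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
aadt = rng.uniform(2_000, 40_000, n)      # traffic volume per segment
length_km = rng.uniform(0.2, 3.0, n)      # segment length

# Synthetic "true" SPF: crashes ~ NB(mean = exp(b0 + b1*ln(AADT) + b2*ln(L)))
mu = np.exp(-7.0 + 0.8 * np.log(aadt) + 1.0 * np.log(length_km))
crashes = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))

X = sm.add_constant(np.column_stack([np.log(aadt), np.log(length_km)]))
spf = sm.NegativeBinomial(crashes, X).fit(disp=False)
print(spf.params)  # intercept, ln(AADT) and ln(length) coefficients, alpha
```

Segments whose observed counts exceed the SPF prediction the most are the screening candidates.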
Procedia PDF Downloads 52

960 Effectiveness of Opuntia ficus indica Cladodes Extract for Wound-Healing
Authors: Giuffrida Graziella, Pennisi Stefania, Coppa Federica, Iannello Giulia, Cartelli Simone, Lo Faro Riccardo, Ferruggia Greta, Brundo Maria Violetta
Abstract:
Cladode chemical composition may vary according to soil factors, cultivation season, and plant age. The primary metabolites of cladodes are water, carbohydrates, and proteins. The carbohydrates in cladodes are of two types: structural and storage. Polysaccharides from Opuntia ficus‐indica (L.) Mill plants build molecular networks with the capacity to retain water; thus, they act as mucoprotective agents. Mucilage is the main polysaccharide of cladodes; it contains polymers of β‐d‐galacturonic acid bound in positions (1–4) and traces of R‐linked l‐rhamnose (1–2). Mucilage regulates both the cell water content during prolonged drought and the calcium flux in the plant cells. The in vitro analysis of keratinocytes in monolayer, through the scratch-wound-healing assay, provided promising results. After 48 hours of exposure, the wound scratch was almost completely closed in cells treated with cladode extract. After 72 hours, the treated cells reached complete confluence, while in the untreated cells (negative control) confluence was reached after 96 hours. For a more comprehensive analysis, we also added a positive control group of cells treated with colchicine, which inhibited wound closure.
Keywords: cladodes, metabolites, polysaccharide, scratch-wound-healing assay
Procedia PDF Downloads 55

959 Reduce the Impact of Wildfires by Identifying Them Early from Space and Sending Location Directly to Closest First Responders
Authors: Gregory Sullivan
Abstract:
The evolution of global warming has escalated the number and complexity of forest fires around the world. As an example, the United States and Brazil combined generated more than 30,000 forest fires last year. The impact on our environment, structures, and individuals is incalculable. The world has learned to take this in stride, trying multiple ways to contain fires. Some countries are trying to use cameras in limited areas. There are discussions of using hundreds of low earth orbit satellites, linking them together and interfacing them through ground networks. These are all truly noble attempts to defeat the forest fire phenomenon, but there is a better, simpler answer. A bigger piece of the solution's puzzle is to see fires while they are small, soon after initiation, and report their location (latitude and longitude) to local first responders. This is done by placing a sensor in geostationary orbit (GEO: 26,000 miles above the earth). By placing this small satellite in GEO, we can 'stare' at the earth and sense temperature changes; we do not 'see' fires, but 'measure' temperature changes. This has already been demonstrated on an experimental scale: fires were seen close to initiation, the information was forwarded to first responders, and the system was the first to identify the fires 7 out of 8 times. The goal is to have a small independent satellite in GEO orbit focused only on forest fire initiation. Thus, with one small satellite focused only on forest fire initiation, we hope to greatly decrease the impact on persons, property, and the environment.
Keywords: space detection, wildfire early warning, demonstration wildfire detection and action from space, space detection to first responders
Procedia PDF Downloads 70

958 Resilience of Infrastructure Networks: Maintenance of Bridges in Mountainous Environments
Authors: Lorenza Abbracciavento, Valerio De Biagi
Abstract:
Infrastructures are key elements in ensuring the operational functionality of the transport system. The collapse of a single bridge or, equivalently, a tunnel can lead to an entire motorway being considered completely inaccessible. As a consequence, the paralysis of the communications network entails several important drawbacks for the community. Recent events have demonstrated that ensuring the functional continuity of strategic infrastructures during and after a catastrophic event makes a significant difference in terms of human and economic losses. Moreover, it has been observed that RC structures located in mountain environments show a worse state of conservation compared to structures of the same typology and age located in temperate climates. Because of its morphology, in fact, the mountain environment is particularly exposed to severe collapse and deterioration phenomena, generally natural hazards, e.g., rock falls, and meteorological hazards, e.g., freeze-thaw cycles or heavy snow. For these reasons, deep investigation of the characteristics of these processes becomes of fundamental importance in providing smart and sustainable solutions and making the infrastructure system more resilient. In this paper, the design of a monitoring system for mountainous environments is presented and analyzed in its parts. The method not only takes the peculiar climatic conditions into account, but is also integrated with, and interacts with, the surrounding environment.
Keywords: structural health monitoring, resilience of bridges, mountain infrastructures, infrastructural network, maintenance
Procedia PDF Downloads 77

957 The Lighthouse Project: Recent Initiatives to Navigate Australian Families Safely Through Parental Separation
Authors: Kathryn McMillan
Abstract:
A recent study of 8,500 adult Australians aged 16 and over revealed that 62% had experienced childhood maltreatment. In response to multiple recommendations by bodies such as the Australian Law Reform Commission, parliamentary reports, and stakeholder input, a number of key initiatives have been developed to grapple with the difficulties of a federal-state system and to screen and triage high-risk families navigating their way through the court system. The Lighthouse Project (LHP) is a world-first initiative of the Federal Circuit and Family Court of Australia (FCFCOA) to screen family law litigants for major risk factors, including family violence, child abuse, alcohol or substance abuse, and mental ill-health, at the point of filing in all applications that seek parenting orders. It commenced on 7 December 2020 on a pilot basis but has now been expanded to 15 registries across the country. A specialist risk screen, Family DOORS Triage, has been developed, focused on improving the safety and wellbeing of families involved in the family law system through safety planning and service referral, and on differentiated case management based on risk level, with the Evatt List specifically designed to manage the highest-risk cases. Early signs are that this approach is meeting the needs of families with multiple risks moving through the court system. Before the LHP, there was no data available about the prevalence of risk factors experienced by litigants entering the family courts, and it was often assumed that it was the litigation process that was fueling family violence and other risks such as suicidality. Data from the 2022 FCFCOA annual report indicated that in parenting proceedings, 70% of cases alleged that a child had been or was at risk of abuse, 80% alleged that a party had experienced family violence, 74% alleged that children had been exposed to family violence, 53% alleged that substance misuse by a party had caused, or was at risk of causing, harm to children, and 58% of matters alleged that the mental health issues of a party had caused or placed a child at risk of harm. These figures reveal the significant overlap between child protection and family violence, both of which are the responsibility of state and territory governments. Since 2020, a further key initiative has been the co-location of child protection and police officials in a number of registries of the FCFCOA. The ability to access, in a time-effective way, details of family violence or child protection orders, weapons licenses, and criminal convictions or proceedings is key to managing issues across the state and federal divide. It ensures a more cohesive and effective response across the family law, family violence, and child protection systems.
Keywords: child protection, family violence, parenting, risk screening, triage
Procedia PDF Downloads 77

956 Elaboration and Physico-Chemical Characterization of Edible Films Made from Chitosan and Spray Dried Ethanolic Extracts of Propolis
Authors: David Guillermo Piedrahita Marquez, Hector Suarez Mahecha, Jairo Humberto Lopez
Abstract:
It was necessary to establish which formulation is suitable for the preservation of aquaculture products, and that is why edible films were made. These were subjected to characterization in order to determine their morphology and their physicochemical, mechanical, and optical properties. Six formulations of chitosan and encapsulated propolis ethanolic extract were developed because of their activity against pathogens and because of their properties, which allow the creation of polymer networks impermeable to gases and vapor and resistant to physical damage. In the six formulations, the concentration of the chitosan matrix (1% w/v, 2% w/v) and the concentration of the bioactive (0.5% w/v, 1% w/v, 1.5% w/v) were varied, and the results obtained were compared with statistical and multivariate analysis methods. It was observed that the matrices showed greater impermeability and thickness than the control samples and the samples reported in the literature. These films also showed notable uniformity and greater resistance to physical damage compared with edible films made of other biopolymers. However, the action of some compounds had a negative effect on the mechanical properties and drastically changed the optical properties; the bioactive has an effect on the polymer matrix, and it was determined that the films with 2% w/v of chitosan and 1.5% w/v of encapsulated extract exhibited the best properties and suffered the negative impact of immiscible substances to a lesser extent.
Keywords: chitosan, edible films, ethanolic extract of propolis, mechanical properties, optical properties, physical characterization, scanning electron microscopy (SEM)
Procedia PDF Downloads 446

955 Multi-Agent Searching Adaptation Using Levy Flight and Inferential Reasoning
Authors: Sagir M. Yusuf, Chris Baber
Abstract:
In this paper, we describe how to achieve knowledge understanding and prediction (Situation Awareness, SA) for multiple agents conducting a searching activity, using Bayesian inferential reasoning and learning. A Bayesian Belief Network was used to monitor the agents' knowledge about their environment, and cases are recorded for network training using the expectation-maximisation or gradient descent algorithm. The well-trained network is then used for decision making and environmental situation prediction. Forest fire searching by multiple UAVs was the use case: UAVs are tasked to explore a forest and find a fire for urgent action by the fire wardens. The paper focuses on two problems: (i) an effective agent path planning strategy and (ii) knowledge understanding and prediction (SA). The path planning approach, inspired by animal modes of foraging using the Lévy distribution augmented with Bayesian reasoning, is fully described in this paper. Results prove that the Lévy flight strategy performs better than previous fixed-pattern approaches (e.g., parallel sweeps) in terms of energy and time utilisation. We also introduce a waypoint assessment strategy called k-previous waypoints assessment. It improves the performance of the ordinary Lévy flight by saving agents' resources and mission time through the avoidance of redundant search. The agents (UAVs) report their mission knowledge to a central server for interpretation and prediction purposes. Bayesian reasoning and learning were used for the SA, and the results prove their effectiveness, in different environment scenarios, in terms of prediction and effective knowledge representation. The prediction accuracy was measured using the learning error rate, logarithmic loss, and Brier score, and the results show that even limited agent mission data can be used for prediction within the same or a different environment. Finally, we describe a situation-based knowledge visualization and prediction technique for heterogeneous multi-UAV missions. While this paper proves the linkage of Bayesian reasoning and learning with SA and an effective searching strategy, future work will focus on simplifying the architecture.
Keywords: Levy flight, distributed constraint optimization problem, multi-agent system, multi-robot coordination, autonomous system, swarm intelligence
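The heavy-tailed step lengths that give Lévy-flight search its long relocations can be generated with Mantegna's algorithm. The sketch below produces a 2-D search path of this kind; the exponent and step count are assumed example values, and the paper's Bayesian augmentation is not reproduced here.

```python
import numpy as np
from math import gamma, sin, pi

def levy_steps(n, beta=1.5, rng=None):
    """Draw heavy-tailed step lengths via Mantegna's algorithm for a
    Levy-stable-like distribution with exponent beta (1 < beta <= 2)."""
    rng = rng or np.random.default_rng()
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma_u, n)
    v = rng.normal(0, 1, n)
    return np.abs(u) / np.abs(v) ** (1 / beta)

def levy_walk(n_steps, rng=None):
    """2-D search path: Levy step lengths with uniformly random headings."""
    rng = rng or np.random.default_rng(0)
    steps = levy_steps(n_steps, rng=rng)
    angles = rng.uniform(0, 2 * np.pi, n_steps)
    return np.cumsum(np.column_stack([steps * np.cos(angles),
                                      steps * np.sin(angles)]), axis=0)

path = levy_walk(500)
print("path bounding box:", path.min(axis=0), path.max(axis=0))
```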
Procedia PDF Downloads 144

954 An Efficient Subcarrier Scheduling Algorithm for Downlink OFDMA-Based Wireless Broadband Networks
Authors: Hassen Hamouda, Mohamed Ouwais Kabaou, Med Salim Bouhlel
Abstract:
The growth of wireless technology has made opportunistic scheduling a widespread theme in recent research. Providing high system throughput without reducing fairness of allocation is becoming a very challenging task, and a suitable policy for resource allocation among users is of crucial importance. This study focuses on scheduling multiple streaming flows on the downlink of a WiMAX system based on orthogonal frequency division multiple access (OFDMA). In this paper, we take the first step in formulating and analyzing this problem rigorously. As a result, we propose a new scheduling scheme based on the Round Robin (RR) algorithm. Because of its non-opportunistic process, RR does not take radio conditions into account and consequently affects both system throughput and multi-user diversity. Our contribution, called MORRA (Modified Round Robin Opportunistic Algorithm), proposes a solution to this issue. MORRA not only exploits the concept of an opportunistic scheduler but also takes other parameters into account in the allocation process: the first parameter is called the courtesy coefficient (CC), and the second is called buffer occupancy (BO). Performance evaluation shows that this well-balanced scheme outperforms both the RR and MaxSNR schedulers and demonstrates that choosing between system throughput and fairness is not required.
Keywords: OFDMA, opportunistic scheduling, fairness hierarchy, courtesy coefficient, buffer occupancy
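The abstract does not give MORRA's exact priority rule, so the sketch below is only one plausible reading: per subcarrier, users are ranked by a score that combines channel quality with the courtesy coefficient and buffer occupancy terms named above. The weighting and the courtesy-update rule are assumptions.

```python
import random

def morra_like_schedule(users, n_subcarriers, w_cc=0.5, w_bo=0.5):
    """Assign each subcarrier to the user maximizing an opportunistic score.
    users: list of dicts with per-subcarrier 'snr', plus 'cc' (courtesy
    coefficient, grows while a user waits) and 'bo' (buffer occupancy 0..1).
    The combination rule here is illustrative, not the paper's formula."""
    allocation = {}
    for sc in range(n_subcarriers):
        def score(u):
            return u["snr"][sc] * (1 + w_cc * u["cc"] + w_bo * u["bo"])
        winner = max(users, key=score)
        allocation[sc] = winner["id"]
        winner["cc"] = 0.0                      # served: courtesy resets
        for u in users:
            if u is not winner:
                u["cc"] += 0.1                  # waiting users gain courtesy
    return allocation

random.seed(0)
users = [{"id": i, "snr": [random.uniform(0, 30) for _ in range(8)],
          "cc": 0.0, "bo": random.random()} for i in range(4)]
print(morra_like_schedule(users, n_subcarriers=8))
```

The courtesy term is what pulls the scheme back toward fairness: a user starved by poor channel conditions accumulates priority until it is served.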
Procedia PDF Downloads 300

953 Wind Generator Control in Isolated Site
Authors: Glaoui Hachemi
Abstract:
Wind has been proven to be a cost-effective and reliable energy source. Technological advancements over the last years have placed wind energy in a firm position to compete with conventional power generation technologies. Algeria has a vast uninhabited land area, of which the south (desert) represents the greatest part, with a considerable wind regime. In this paper, an analysis of wind energy utilization as a viable energy substitute in six selected sites widely distributed over the south of Algeria is presented. Wind speed frequency distribution data obtained from the Algerian Meteorological Office are used to calculate the average wind speed and the available wind power. The annual energy produced by the Fuhrlander FL 30 wind machine is obtained using two methods. The analysis shows that in southern Algeria, at a height of 10 m, the available wind power varies between 160 and 280 W/m2, except for Tamanrasset. The highest potential wind power was found at Adrar, where the wind speed is above 3 m/s 88% of the time. Moreover, the annual wind energy generated by that machine lies between 33 and 61 MWh, except for Tamanrasset, with only 17 MWh. Since wind turbines are usually installed at heights greater than 10 m, an increased output of wind energy can be expected. The wind resource appears to be suitable for power production in the south, and it could provide a viable substitute for diesel oil for irrigation pumps and electricity generation. In this paper, a model of a wind turbine (WT) with a permanent magnet synchronous generator (PMSG) and its associated controllers is also presented. The increase in wind power penetration has meant that conventional power plants are gradually being replaced by wind farms, and today wind farms are required to participate actively in power system operation in the same way as conventional power plants: power system operators have revised the grid connection requirements for wind turbines and wind farms, and now demand that these installations be able to carry out more or less the same control tasks as conventional plants. For dynamic power system simulations, the PMSG wind turbine model includes an aerodynamic rotor model, a lumped-mass representation of the drive train system, and a generator model. We propose a model, with an implementation in the MATLAB/Simulink environment, of each of the system components of off-grid small wind turbines.
Keywords: wind generator systems, permanent magnet synchronous generator (PMSG), wind turbine (WT) modeling, MATLAB simulink environment
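For the resource-assessment part, the available wind power per unit rotor area follows directly from the speed distribution. The sketch below computes it both from a measured speed series and from a Weibull fit, a common approach; the Weibull parameters shown are assumed example values, not the Algerian station data.

```python
import numpy as np
from math import gamma

RHO = 1.225  # air density, kg/m^3 (sea-level standard)

def mean_wind_power_density(speeds):
    """Available wind power per unit area, W/m^2: 0.5 * rho * E[v^3]."""
    return 0.5 * RHO * np.mean(np.asarray(speeds) ** 3)

def weibull_power_density(c, k):
    """Same quantity from Weibull scale c (m/s) and shape k,
    using E[v^3] = c^3 * Gamma(1 + 3/k)."""
    return 0.5 * RHO * c ** 3 * gamma(1 + 3 / k)

# Example: a site with Weibull c = 6.5 m/s, k = 2.0 (assumed values)
print(f"{weibull_power_density(6.5, 2.0):.0f} W/m^2")

# Or directly from a measured speed series
rng = np.random.default_rng(0)
sample = 6.5 * rng.weibull(2.0, 10_000)
print(f"{mean_wind_power_density(sample):.0f} W/m^2")
```

With these assumed parameters the result lands near 220 W/m^2, consistent in order of magnitude with the 160-280 W/m^2 range reported for the southern sites.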
Procedia PDF Downloads 338

952 A Non-Destructive Estimation Method for Internal Time in Perilla Leaf Using Hyperspectral Data
Authors: Shogo Nagano, Yusuke Tanigaki, Hirokazu Fukuda
Abstract:
Vegetables harvested early in the morning or late in the afternoon are valued in plant production, so the time of harvest is important. The biological functions known as circadian clocks have a significant effect on this harvest timing. The purpose of this study was to estimate the circadian clock non-destructively and thus construct a method for determining a suitable harvest time. We took eight samples of green perilla (Perilla frutescens var. crispa) every 4 hours, six times over 1 day, and analyzed all samples at the same time. A hyperspectral camera was used to collect spectral intensities at 141 different wavelengths (350–1050 nm). Calculation of the correlations between the spectral intensity at each wavelength and the harvest time suggested the suitability of the hyperspectral camera for non-destructive estimation. However, even the most highly correlated wavelength showed only a weak correlation, so we used machine learning to raise the accuracy of the estimation and constructed a machine learning model to estimate the internal time of the circadian clock. Artificial neural networks (ANN) were used because they are an effective analysis method for large amounts of data. Using the estimation model resulted in an error between estimated and real times of 3 min, and the estimations were made in less than 2 hours. Thus, we successfully demonstrated this method of non-destructively estimating internal time.
Keywords: artificial neural network (ANN), circadian clock, green perilla, hyperspectral camera, non-destructive evaluation
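Because clock time is circular (23:59 sits next to 00:00), a common trick when regressing internal time on spectra is to predict the sine and cosine of the clock phase and decode with arctan2. The sketch below applies this with a small neural network on synthetic spectra; the circular encoding is our assumption about a reasonable setup, not necessarily the authors' architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_samples, n_bands = 48, 141

# Synthetic spectra whose band intensities oscillate with a 24 h rhythm.
hours = np.tile(np.arange(0, 24, 4), 8).astype(float)   # 8 samples x 6 times
phase = 2 * np.pi * hours / 24
base = rng.random(n_bands)
X = base + 0.3 * np.outer(np.sin(phase), rng.random(n_bands)) \
         + 0.05 * rng.standard_normal((n_samples, n_bands))

# Encode circular time as (sin, cos) targets, then decode with arctan2.
Y = np.column_stack([np.sin(phase), np.cos(phase)])
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000,
                     random_state=0).fit(X, Y)
pred = model.predict(X)
pred_hours = (np.arctan2(pred[:, 0], pred[:, 1]) % (2 * np.pi)) * 24 / (2 * np.pi)
err = np.minimum(np.abs(pred_hours - hours), 24 - np.abs(pred_hours - hours))
print(f"mean circular error: {err.mean() * 60:.1f} min (training fit)")
```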
Procedia PDF Downloads 299

951 The Participation of Graduates and Students of Social Work in the Erasmus Program: a Case Study in the Portuguese context – the Polytechnic of Leiria
Authors: Cezarina da Conceição Santinho Maurício, José Duque Vicente
Abstract:
Established in 1987, the Erasmus Programme is a programme for the exchange of higher education students, and its purposes are several. The mobility it develops has contributed to the promotion of multiple kinds of learning, to the internalization of a feeling of belonging to a community, and to the consolidation of cooperation between entities and universities. It also allows a European experience, considering multilingualism one of the bases of the European project and a vehicle for achieving union in diversity. The programme has progressed and introduced changes: Erasmus+ currently offers a wide range of opportunities for higher education, vocational education and training, school education, adult education, youth, and sport. These opportunities are open to students and other stakeholders, such as teachers. Portugal was one of the countries that readily adhered to this programme, treating it as an instrument of internationalization of polytechnic and university higher education. Social work students and teachers have been involved in this mobility of learning and multicultural interaction. The presence and activation of this programme were made possible by Portugal's joining the European Union, an event that had repercussions in the field of Portuguese social work and contributed to its approach to the reality of European social work. Historically, Portuguese social work has built a close connection with the Latin American world and, in particular, with Brazil. Several examples can be identified at different historical stages: the post-revolution period after 1974 and the presence of the reconceptualization movement, the struggle for enrollment in the higher education circuit, the process of winning a bachelor's degree, and postgraduate training (the first doctorates in social work were carried out at Brazilian universities). This influence is also found in the authors and theoretical references used. This study examines the participation of graduates and students of social work in the Erasmus programme. The following specific goals were outlined: to identify the host countries and universities; to investigate the dimension and type of mobility undertaken; to understand the learning and experiences acquired; to identify the difficulties felt; and to capture participants' perspectives on social work and the contribution of this experience to their training. In the methodological field, the option fell on a qualitative methodology, with semi-structured interviews applied to graduates and students of social work with Erasmus mobility experience. Once the participants agreed, the interviews were recorded, transcribed, and analyzed according to previously defined analysis categories. The findings emphasize the importance of this experience for students and graduates in informal and formal learning. The authors conclude with recommendations to reinforce this mobility, whether at the individual level or as a project built for the group or collective.
Keywords: erasmus programme, graduates and students of social work, participation, social work
Procedia PDF Downloads 149

950 Language in Court: Ideology, Power and Cognition
Authors: Mehdi Damaliamiri
Abstract:
Undoubtedly, the power of language is hardly a new topic; indeed, the persuasive power of language accompanied by ideology has long been recognized in different aspects of life. The two-and-a-half-thousand-year-old Bisitun inscriptions in Iran, proclaiming the victories of the Persian king Darius, are considered by some historians to be an early example of the use of propaganda. Added to this, the modern age is the true cradle of fully-fledged ideologies and of an ongoing process of centrifugal ideologization. The most visible work on ideology today within the field of linguistics is Critical Discourse Analysis (CDA). The focus of CDA is on uncovering injustice and inequality, taking sides with the powerless and suppressed, and making mechanisms of manipulation, discrimination, demagogy, and propaganda explicit and transparent. One possible way of relating language to ideology is to propose that ideology and language are inextricably intertwined. From this perspective, language is always ideological, and ideology depends on language. All language use involves ideology, so ideology is ubiquitous, in our everyday encounters as much as in the struggle for power within and between nation-states and social statuses. At the same time, ideology requires language: its key characteristics, its power and pervasiveness, its mechanisms of continuity and change, all come out of the inner organization of language. The two phenomena are homologous: they share the same evolutionary trajectory. To get a more robust portrait of power and ideology, we need to examine their potential place in linguistic structure and consider how such structures pattern in terms of the functional elements that organize meanings in the clause. This rests on the belief, which has become immensely popular, that all grammatical knowledge, including syntactic knowledge, is stored mentally as constructions. When the structure of the clause is taken into account, power and ideology show a preference for Complement over Subject and Adjunct. The Subject is a central interpersonal element in discourse: it is one of the two elements that form the central interactive nub of a proposition. Conceptually, there are countless ways of construing a given event, and linguistically there is a variety of grammatical devices usually available as alternate means of coding a given conception, such as political crime and corruption. In the theory of construal, then, which, like transitivity in Halliday, makes options available, Cognitive Linguistics can offer a cognitive account of ideology in language, where ideology is made possible by the choices a language allows for representing the same material situation in different ways. The possibility of promoting alternative construals of the same reality means that any particular choice of representation is always ideologically constrained or motivated and indicates the perspective and interests of the text producer.
Keywords: power, ideology, court, discourse
Procedia PDF Downloads 163

949 Localization of Buried People Using Received Signal Strength Indication Measurement of Wireless Sensor
Authors: Feng Tao, Han Ye, Shaoyi Liao
Abstract:
City buildings collapse after earthquakes, and people are buried under the ruins. Search and rescue should be conducted as soon as possible to save them. Therefore, given the complicated environment, irregular aftershocks, and the fact that rescue allows no delay, a target localization method based on RSSI (Received Signal Strength Indication) is proposed in this article. Target localization technology based on RSSI, with its features of low cost and low complexity, has been widely applied to node localization in WSNs (Wireless Sensor Networks). Based on the theory of RSSI transmission and the impact of the environment on RSSI, this article conducts experiments in five scenes, and multiple filtering algorithms are applied to the original RSSI values in order to establish, for each scene, the signal propagation model with minimum test error. The target location can then be calculated, through an improved centroid algorithm, from the distances estimated with the signal propagation model. Results show that localization technology based on RSSI is suitable for large-scale node localization. Among the filtering algorithms, a mixed filtering algorithm (the average of average, median, and Gaussian filtering) performs better than any single filtering algorithm, and by using the signal propagation model, the minimum error of the distance between the known nodes and the target node across the five scenes is about 3.06 m.
Keywords: signal propagation model, centroid algorithm, localization, mixed filtering, RSSI
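The distance-from-RSSI step rests on the standard log-distance path-loss model. A minimal sketch of that model plus a weighted centroid estimate is shown below; the path-loss exponent and reference power are assumed values, and the paper's 'improved centroid algorithm' may differ from this plain inverse-distance weighting.

```python
import numpy as np

def rssi_to_distance(rssi, rssi_d0=-40.0, n=2.5, d0=1.0):
    """Invert the log-distance path-loss model:
    RSSI(d) = RSSI(d0) - 10*n*log10(d/d0)."""
    return d0 * 10 ** ((rssi_d0 - rssi) / (10 * n))

def weighted_centroid(anchors, rssi_values):
    """Estimate target position as a centroid of anchor nodes,
    weighted by the inverse of the estimated distance."""
    d = np.array([rssi_to_distance(r) for r in rssi_values])
    w = 1.0 / np.maximum(d, 1e-6)
    return (anchors * w[:, None]).sum(axis=0) / w.sum()

# Anchors at known positions; simulate RSSI from a target at (4, 3).
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([4.0, 3.0])
dist = np.linalg.norm(anchors - target, axis=1)
rng = np.random.default_rng(0)
rssi = -40.0 - 10 * 2.5 * np.log10(dist) + rng.normal(0, 1.0, 4)  # noisy readings

print("estimate:", weighted_centroid(anchors, rssi))
```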
Procedia PDF Downloads 300

948 Species Distribution Modelling for Assessing the Effect of Land Use Changes on the Habitat of Endangered Proboscis Monkey (Nasalis larvatus) in Kalimantan, Indonesia
Authors: Wardatutthoyyibah, Satyawan Pudyatmoko, Sena Adi Subrata, Muhammad Ali Imron
Abstract:
The proboscis monkey is a species endemic to the island of Borneo, with an IUCN (International Union for Conservation of Nature) conservation status of endangered. The species has specific habitat requirements and is sensitive to habitat disturbance. As a consequence of the increasing rate of land-use change over the last four decades, its population was reported to have decreased significantly. We quantified the effect of land-use change on the proboscis monkey's habitat through a species distribution modeling (SDM) approach with the Maxent software. We collected presence data and environmental variables, i.e., land cover, topography, bioclimate, distance to the river, distance to the road, and distance to anthropogenic disturbance, to generate predictive distribution maps of the monkeys. We compared the predictive maps built from the 2000 and 2015 data to represent the monkey's current habitat, and we overlaid the predictive distribution map with the existing protected areas to investigate whether the monkey's habitat is protected under the protected-area networks. The results showed that almost 50% of the monkey's habitat has been lost as an effect of land-use change, and only 9% of the current proboscis monkey habitat lies within protected areas. These results are important for the master plan of conservation of the endangered proboscis monkey and provide scientific guidance for future development incorporating biodiversity issues.
Keywords: endemic species, land use change, maximum entropy, spatial distribution
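The study itself uses the Maxent software on GIS layers; purely to illustrate the presence/background logic common to SDM, the sketch below fits a logistic surrogate to synthetic covariates. The features, their distributions, and the stand-in classifier are all assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400

# Environmental covariates: distance to river (km), elevation (m), forest cover (0..1)
background = np.column_stack([rng.uniform(0, 20, n),
                              rng.uniform(0, 500, n),
                              rng.uniform(0, 1, n)])
# Presences cluster near rivers, at low elevation, in dense forest.
presence = np.column_stack([rng.exponential(2.0, n),
                            rng.uniform(0, 150, n),
                            rng.uniform(0.6, 1, n)])

X = np.vstack([presence, background])
y = np.array([1] * n + [0] * n)           # presence vs. background labels
sdm = LogisticRegression(max_iter=1000).fit(X, y)

cell = np.array([[1.0, 80.0, 0.9]])       # a candidate habitat cell
print("relative habitat suitability:", sdm.predict_proba(cell)[0, 1])
```

Scoring every raster cell this way for two dates, then differencing the maps, is the kind of comparison the abstract describes for 2000 versus 2015.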
Procedia PDF Downloads 158

947 Toxin-Producing Algae of Nigerian Coast, Gulf of Guinea
Authors: Medina O. Kadiri, Jeffrey U. Ogbebor
Abstract:
Toxin-producing algae are algal species that produce potent toxins, which accumulate in food chains and cause various gastrointestinal and neurological illnesses in humans and other animals. They result in shellfish toxicity and ecosystem alteration, cause fish kills and the mortality of other animals and humans, and lead to compromised product quality and decreased consumer confidence. Animals, including man, are directly exposed by absorbing toxins from the water while swimming, by drinking water containing toxins, or by ingesting contaminated seafood. These algal toxins undergo bioaccumulation, biotransformation, biotransference, and biomagnification through natural food chains and food webs, thereby endangering animals and humans. The Nigerian coast is situated on the Atlantic Ocean in the Gulf of Guinea, one of Africa's five large marine ecosystems (LME), and studies on toxic algae in this ecosystem are generally lacking. Algal samples were collected from eight coastal states and ten locations spanning the Bight of Bonny and the Bight of Benin. A total of 70 species of toxin-producing algae, of great variety, were found in the coastal waters of Nigeria. They included domoic acid-producing forms (ASP) and saxitoxin-, gonyautoxin-, and yessotoxin-producing forms (all PSP). Others were okadaic acid-, dinophysistoxin-, and palytoxin-producing species, representatives of DSP; CFP was represented by ciguatoxin-producing forms, and NSP by brevetoxin-producing species. Emerging or new toxins comprise gymnodimine-, spirolide-, palytoxin-, and prorocentrolide-producing algae. CyanoToxin Poisoning (CTP) was represented by anatoxin-, microcystin-, cylindrospermopsin-, lyngbyatoxin-, nodularin-, aplysiatoxin-, and debromoaplysiatoxin-producing species. The largest group was the saxitoxin-producing species, followed by the microcystin-producing and then the anatoxin-producing species. Gonyautoxin- (PSP), palytoxin- (DSP), emerging-toxin-, and cylindrospermopsin-producing species also had very substantial representation. Only ciguatoxin-, lyngbyatoxin-, nodularin-, aplysiatoxin-, and debromoaplysiatoxin-producing species were represented by one taxon each. The presence of such an overwhelming diversity of toxin-producing algae on the Nigerian coast is a source of concern for fisheries, aquaculture, human health, and ecosystem services. Routine monitoring of toxic and harmful algae is therefore strongly recommended.
Keywords: algal syndromes, Atlantic Ocean, harmful algae, Nigeria
Procedia PDF Downloads 207

946 Cosmetic Surgery on the Rise: The Impact of Remote Communication
Authors: Bruno Di Pace, Roxanne H. Padley
Abstract:
Aims: The recent increase in remote video interaction has increased the number of requests for teleconsultations with plastic surgeons in private practice (70% in the UK and 64% in the USA). This study investigated the motivations for such an increase and the underlying psychological impact on patients. Method: An anonymous web-based poll of 8 questions was designed and distributed through social networks to patients seeking cosmetic surgery in both Italy and the UK. The questions gathered responses regarding 1. reasons for pursuing cosmetic surgery; 2. the effects of delays caused by the SARS-CoV-2 pandemic; 3. the effects on mood; and 4. the influence of video conferencing on body-image perception. Results: 85 respondents completed the online poll. Overall, 68% of respondents stated that seeing themselves more frequently online had influenced their decision to seek cosmetic surgery. The types of surgeries indicated were predominantly to the upper body and face (82%). Delays in access to surgeons during the pandemic were perceived as negatively impacting patients' moods (95%). Body-image perception and self-esteem were lower than before the pandemic, particularly during lockdown (72%). Patients were more inclined to undergo cosmetic surgery during the pandemic, both because of the wish to improve their 'lockdown face' for video conferencing (77%) and because of the benefits of recovering at home while working remotely (58%). Conclusions: Overall, the findings suggest that video conferencing has led to a significant increase in requests for cosmetic surgery, the so-called 'Zoom Boom' effect.
Keywords: cosmetic surgery, remote communication, telehealth, zoom boom
Procedia PDF Downloads 179

945 A Bi-Objective Model to Optimize the Total Time and Idle Probability for Facility Location Problem Behaving as M/M/1/K Queues
Authors: Amirhossein Chambari
Abstract:
This article proposes a bi-objective model for the facility location problem subject to congestion (overcrowding), motivated by applications such as locating servers for internet mirror sites, communication networks, one-server systems, and so on. The model considers situations in which immobile (or fixed) service facilities are congested (queued) by stochastic demand and behave as M/M/1/K queues. We consider two simultaneous perspectives on this problem: (1) the customers, who desire to limit the time spent accessing and waiting for service, and (2) the service provider, who desires to limit the average facility idle time. A bi-objective model is set up for the facility location problem with two objective functions: (1) minimizing the sum of the expected total traveling and waiting time (customers) and (2) minimizing the average facility idle-time percentage (service provider). The proposed model belongs to the class of mixed-integer nonlinear programming models and to the class of NP-hard problems. To solve the model, controlled elitist non-dominated sorting genetic algorithms (controlled NSGA-II) and controlled elitist non-dominated ranking genetic algorithms (NRGA-I) are proposed. Furthermore, the two proposed metaheuristic algorithms are evaluated using standard multiobjective metrics. Finally, the results are analyzed and some conclusions are given.
Keywords: bi-objective, facility location, queueing, controlled NSGA-II, NRGA-I
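Both objectives rest on standard M/M/1/K quantities. As a worked illustration, the sketch below evaluates a facility's idle probability and expected waiting time from its arrival rate lambda, service rate mu, and capacity K; the example rates are assumptions.

```python
def mm1k_metrics(lam, mu, K):
    """Idle probability P0, blocking probability PK, mean number in
    system L, and mean waiting time W for an M/M/1/K queue."""
    rho = lam / mu
    if abs(rho - 1.0) < 1e-12:
        probs = [1.0 / (K + 1)] * (K + 1)
    else:
        norm = (1 - rho) / (1 - rho ** (K + 1))
        probs = [norm * rho ** n for n in range(K + 1)]
    P0, PK = probs[0], probs[K]
    L = sum(n * p for n, p in enumerate(probs))
    lam_eff = lam * (1 - PK)          # blocked customers never enter
    W = L / lam_eff                   # Little's law on admitted customers
    return P0, PK, L, W

# Facility with lambda = 8/h, mu = 10/h, room for K = 5 customers in total
P0, PK, L, W = mm1k_metrics(8.0, 10.0, 5)
print(f"idle prob = {P0:.3f}, blocking = {PK:.3f}, L = {L:.2f}, W = {W:.3f} h")
```

The tension the bi-objective model captures is visible here: lowering the idle probability P0 means loading the facility more heavily, which raises the customers' waiting time W.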
Procedia PDF Downloads 583

944 Redefining Identity of People with Disabilities Based on Content Analysis of Instagram Accounts
Authors: Grzegorz Kubinski
Abstract:
The proposed paper is focused on the forms of identity expression of people with disabilities (PWD) in social networks such as Instagram. Theoretical analyses widely propose using the new media as assistive tools for improving the wellbeing and labour activities of PWD. This kind of use is definitely important and plays a key role in all social inclusion processes. However, Instagram is not only a place where PWD express their problems; on the contrary, it allows them to construct a new definition of disability. The paper discusses how this approach to disability, different from the classical one, is created by PWD. The issue is scrutinized mainly on two points. Firstly, the question of how disability is changed by other everyday activities, like fashion or sport, is described. Secondly, and this could be seen as more important, the point of how PWD redefine their bodies, creating a different form of aesthetic, is presented. The paper is based on a content analysis of Instagram accounts. About 20 accounts created by PWD were analyzed over a 6-month period, taking into account elements such as photos, comments, and discussions. All this information was studied in relation to the categories of 'everyday life' and 'aesthetics'. Works by T. Siebers, L. J. Davis, and R. McRuer were used as the theoretical background. The conclusions and interpretations presented in the proposed paper show that the Internet can be used by PWD not only as a prosthetic and assistive tool: PWD willingly use it as a mode of expressing their independence, agency, and identity. The paper proposes that in further research this way of using Internet communication by PWD should be taken into account as an important part of the understanding of disability.
Keywords: body, disability, identity, new media
Procedia PDF Downloads 138

943 Understanding Cognitive Fatigue From FMRI Scans With Self-supervised Learning
Authors: Ashish Jaiswal, Ashwin Ramesh Babu, Mohammad Zaki Zadeh, Fillia Makedon, Glenn Wylie
Abstract:
Functional magnetic resonance imaging (fMRI) is a neuroimaging technique that records neural activations in the brain by capturing the blood oxygen level in different regions according to the task performed by a subject. Given fMRI data, the problem of predicting the state of cognitive fatigue in a person has not been investigated to its full extent. This paper proposes tackling the issue as a multi-class classification problem by dividing the state of cognitive fatigue into six different levels, ranging from no fatigue to extreme fatigue. We built a spatio-temporal model that uses convolutional neural networks (CNN) for spatial feature extraction and a long short-term memory (LSTM) network for the temporal modeling of 4D fMRI scans. We also applied a self-supervised method called MoCo (Momentum Contrast) to pre-train our model on the public dataset BOLD5000 and fine-tuned it on our labeled dataset to predict cognitive fatigue. Our novel dataset contains fMRI scans from Traumatic Brain Injury (TBI) patients and healthy controls (HCs) performing a series of N-back cognitive tasks. The method establishes a state-of-the-art technique for analyzing cognitive fatigue from fMRI data and outperforms previous approaches to this problem.
Keywords: fMRI, brain imaging, deep learning, self-supervised learning, contrastive learning, cognitive fatigue
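The CNN-for-space, LSTM-for-time pattern described can be sketched in PyTorch as below. The layer sizes, pooling choices, and input dimensions are illustrative assumptions, not the paper's architecture, and the MoCo pre-training stage is omitted.

```python
import torch
import torch.nn as nn

class CNNLSTMFatigue(nn.Module):
    """Sketch of a CNN+LSTM classifier for 4D fMRI (time, depth, H, W):
    a small 3D CNN encodes each volume, an LSTM models the sequence,
    and a linear head predicts one of six fatigue levels."""
    def __init__(self, n_classes=6, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(4),   # -> (16, 4, 4, 4) per volume
            nn.Flatten(),              # -> 16 * 64 = 1024 features
        )
        self.lstm = nn.LSTM(1024, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, time, depth, H, W)
        b, t = x.shape[:2]
        feats = self.encoder(x.reshape(b * t, 1, *x.shape[2:]))
        feats = feats.reshape(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])   # classify from the last time step

model = CNNLSTMFatigue()
scans = torch.randn(2, 10, 32, 64, 64)  # 2 subjects, 10 volumes each
print(model(scans).shape)                # -> torch.Size([2, 6])
```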
Procedia PDF Downloads 190

942 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform
Authors: Reza Mohammadzadeh
Abstract:
The major challenges of geotechnical engineering in underground spaces arise from uncertainties and differing probabilities. The collection and collation of existing data, and collaboration around them, so that they can be incorporated in analysis and design for a given prospect evaluation, constitute a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence, rooted in statistics, which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data in order to learn and improve from them automatically, without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema for geotechnical risks in underground coal mining, based on a cloud system architecture, has been designed. A new approach to risk assessment, using a three-dimensional risk matrix supported by the level of knowledge (LoK), is proposed in this model, and the stages of the model's workflow methodology are described. An ML platform was implemented to train the data and deploy the LoK models. IBM Watson Studio, a leading data science tool and data-driven cloud-integrated ML platform, is employed in this study. As a use case, a dataset of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.
Keywords: data model, geotechnical risks, machine learning, underground coal mining
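The paper does not spell out the scoring rule of its three-dimensional risk matrix, so the sketch below is only one plausible reading: classical likelihood times consequence, with the level of knowledge (LoK) as a third axis that escalates the rating when knowledge is poor. All scales, thresholds, and the inflation factors are invented for illustration.

```python
def risk_rating(likelihood, consequence, lok):
    """Hypothetical 3-D risk matrix: likelihood and consequence on 1..5,
    LoK on 1 (poor knowledge) .. 3 (well understood). Low LoK inflates
    the base score to reflect epistemic uncertainty."""
    base = likelihood * consequence          # classical 5x5 matrix score
    inflation = {1: 1.5, 2: 1.2, 3: 1.0}[lok]
    score = base * inflation
    if score >= 15:
        return score, "high"
    if score >= 7:
        return score, "medium"
    return score, "low"

# Roof-fall hazard: moderately likely, severe, but sparsely monitored.
print(risk_rating(likelihood=3, consequence=4, lok=1))  # -> (18.0, 'high')
```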
Procedia PDF Downloads 274

941 Assessment of Potential Chemical Exposure to Betamethasone Valerate and Clobetasol Propionate in Pharmaceutical Manufacturing Laboratories
Authors: Nadeen Felemban, Hamsa Banjer, Rabaah Jaafari
Abstract:
One of the most common hazards in the pharmaceutical industry is the chemical hazard, as chronic exposure to hazardous substances can cause harm or lead to occupational diseases and illnesses. Therefore, a chemical agent management system is required, including hazard identification, risk assessment, controls for specific hazards, and inspections, to keep the workplace healthy and safe. Routine monitoring is also required to verify the effectiveness of the control measures. Betamethasone Valerate and Clobetasol Propionate are APIs (Active Pharmaceutical Ingredients) with a highly hazardous classification, Occupational Hazard Category (OHC) 4, which requires full containment (ECA-D) during handling to avoid chemical exposure. According to the Safety Data Sheets, these chemicals are reproductive toxicants (reprotoxicant H360D), which may affect female workers' health, cause fatal damage to an unborn child, or impair fertility. In this study, a qualitative chemical risk assessment (qCRA) was conducted to assess chemical exposure during the handling of Betamethasone Valerate and Clobetasol Propionate in pharmaceutical laboratories. The outcomes of the qCRA identified a risk of potential chemical exposure (risk rating 8, Amber risk). Therefore, immediate actions were taken to ensure that interim controls (according to the hierarchy of controls) were in place and in use to minimize the risk of chemical exposure. No open handling should be done outside the Steroid Glove Box Isolator (SGB), and the required Personal Protective Equipment (PPE) must be worn. The PPE includes a coverall, nitrile hand gloves, safety shoes, and powered air-purifying respirators (PAPR). Furthermore, a quantitative assessment (personal air sampling) was conducted to verify the effectiveness of the engineering controls (the SGB isolator) and to confirm whether there was chemical exposure, as indicated earlier by the qCRA. Three personal air samples were collected using an air sampling pump and filters (IOM2 filters, 25 mm glass fiber media). The collected samples were analyzed by HPLC in the BV lab, and the measured concentrations were reported in ug/m3 with reference to the 8hr Occupational Exposure Limits (8hr OELs, as 8hr TWA) for each analyte. The analytical results, expressed as 8hr TWA (8hr Time-Weighted Average), were analyzed using Bayesian statistics (IHDataAnalyst). The results of the Bayesian likelihood graph indicate category 0, which means that exposures are "de minimis," trivial, or non-existent, and that employees have little to no exposure. These results also indicate that the 3 samples are representative, with very low variation (SD = 0.0014). In conclusion, the engineering controls were effective in protecting the operators from such exposure. However, routine chemical monitoring is required every 3 years unless there is a change in the process or type of chemicals. Also, frequent management monitoring (daily, weekly, and monthly) is required to ensure the control measures are in place and in use. Furthermore, a Similar Exposure Group (SEG) was identified in this activity and included in the annual health surveillance for health monitoring.
Keywords: occupational health and safety, risk assessment, chemical exposure, hierarchy of control, reproductive
Procedia PDF Downloads 173
940 An Optimal Control Method for Reconstruction of Topography in Dam-Break Flows
Authors: Alia Alghosoun, Nabil El Moçayd, Mohammed Seaid
Abstract:
Modeling dam-break flows over non-flat beds requires an accurate representation of the topography, which is the main source of uncertainty in the model. Developing robust and accurate techniques for reconstructing topography in this class of problems would therefore reduce the uncertainty in the flow system. In many hydraulic applications, experimental techniques have been widely used to measure bed topography; in practice, however, experimental work in hydraulics can be very demanding in both time and cost, so computational hydraulics has served as an alternative to laboratory and field experiments. Unlike the forward problem, the inverse problem identifies the bed parameters from given experimental data. In this case, the shallow water equations used to model the hydraulics must be rearranged so that the model parameters can be evaluated from measured data; this approach is not always possible, though, and it suffers from stability restrictions. In the present work, we propose an adaptive optimal control technique to numerically identify the underlying bed topography from a given set of free-surface observation data. In this approach, a minimization function is defined and used to iteratively determine the model parameters. The proposed technique can be interpreted as a fractional-stage scheme: in the first stage, the forward problem is solved to determine the measurable parameters from known data; in the second stage, an adaptive ensemble Kalman filter is applied to assimilate the observation data and obtain an accurate estimate of the topography. The main features of this method are, on the one hand, the ability to handle complex geometries without rearranging the original model into an explicit form and, on the other hand, strong stability for simulations of flows in different regimes containing shocks or discontinuities over any geometry. Numerical results are presented for a dam-break flow problem over a non-flat bed using different solvers for the shallow water equations. The robustness of the proposed method is investigated for different numbers of loops, sensitivity parameters, initial samples, and observation locations. The obtained results demonstrate the high reliability and accuracy of the proposed technique.
Keywords: erodible beds, finite element method, finite volume method, nonlinear elasticity, shallow water equations, stresses in soil
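A minimal sketch of the ensemble Kalman filter analysis step at the heart of the second stage, written for a generic bed-topography ensemble. The observation operator, ensemble size, and noise level are illustrative assumptions, and the paper's adaptive control layer and shallow-water forward solver are not reproduced here.

```python
import numpy as np

def enkf_update(ensemble, obs, H, obs_err_std, rng):
    """One EnKF analysis step for an ensemble of bed-topography samples.

    ensemble : (n_state, n_ens) array of topography realisations
    obs      : (n_obs,) vector of free-surface observations
    H        : (n_obs, n_state) linear observation operator (assumed)
    """
    n_obs, n_ens = obs.shape[0], ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)  # ensemble anomalies
    HA = H @ A
    # Sample covariances: observation-observation and state-observation.
    P_hh = HA @ HA.T / (n_ens - 1) + obs_err_std**2 * np.eye(n_obs)
    P_xh = A @ HA.T / (n_ens - 1)
    K = P_xh @ np.linalg.inv(P_hh)  # Kalman gain
    # Perturbed observations preserve the analysis-ensemble spread.
    obs_pert = obs[:, None] + obs_err_std * rng.standard_normal((n_obs, n_ens))
    return ensemble + K @ (obs_pert - H @ ensemble)

# Hypothetical demo: recover a synthetic bed profile from 10 noisy readings.
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0.0, np.pi, 50))
H = np.zeros((10, 50)); H[np.arange(10), np.arange(10) * 5] = 1.0
ens = 0.5 + 0.5 * rng.standard_normal((50, 100))  # crude prior ensemble
obs = H @ truth + 0.05 * rng.standard_normal(10)
ens = enkf_update(ens, obs, H, 0.05, rng)
print("posterior mean error:", np.abs(ens.mean(axis=1) - truth).mean())
```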
Procedia PDF Downloads 130
939 Enhancement of Underwater Haze Image with Edge Reveal Using Pixel Normalization
Authors: M. Dhana Lakshmi, S. Sakthivel Murugan
Abstract:
As light passes from source to observer in the water medium, it is scattered by suspended particulate matter. This scattering plagues the captured images with non-uniform illumination, blurred details, halo artefacts, weak edges, etc. To overcome this, pixel normalization with an Amended Unsharp Mask (AUM) filter is proposed to enhance the degraded image. To validate the robustness of the proposed technique irrespective of atmospheric light, the datasets considered were collected at two locations. For each image, the maximum and minimum pixel intensity values are computed and used to normalize the image; the AUM filter is then applied to strengthen the blurred edges. The result is an enhanced image with good illumination and contrast. Thus, the proposed technique removes the scattering effect (de-hazing) and restores the perceptual information with enhanced edge detail. Both qualitative and quantitative analyses are carried out using the standard no-reference metrics underwater image sharpness measure (UISM) and underwater image quality measure (UIQM), which assess color, sharpness, and contrast for the images from both locations. The proposed technique is observed to show overwhelming performance, in an adaptive manner, compared with deep-learning-based enhancement networks and traditional techniques.
Keywords: underwater drone imagery, pixel normalization, thresholding, masking, unsharp mask filter
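A minimal sketch of the two steps the abstract describes: min-max pixel normalization followed by unsharp masking. The Amended Unsharp Mask (AUM) itself is not specified in the abstract, so a standard Gaussian unsharp mask stands in, and the sigma and amount parameters are assumed.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalize_and_sharpen(img, sigma=2.0, amount=1.5):
    """Min-max normalisation, then a generic unsharp mask (stand-in for AUM).

    img: 2-D grayscale array of any numeric dtype."""
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo + 1e-12)         # stretch to [0, 1]
    blurred = gaussian_filter(norm, sigma=sigma)  # low-pass (hazy) estimate
    mask = norm - blurred                         # high-frequency edge detail
    sharp = norm + amount * mask                  # boost the weak edges
    return np.clip(sharp, 0.0, 1.0)

# Hypothetical usage on a synthetic low-contrast hazy frame.
rng = np.random.default_rng(1)
frame = 0.4 + 0.1 * rng.random((64, 64))
enhanced = normalize_and_sharpen(frame)
print(enhanced.min(), enhanced.max())
```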
Procedia PDF Downloads 195
938 Urban Heat Island Effects on Human Health in Birmingham and Its Mitigation
Authors: N. A. Parvin, E. B. Ferranti, L. A. Chapman, C. A. Pfrang
Abstract:
This study intends to investigate the effects of the urban heat island (UHI) on public health in Birmingham. Birmingham is located at the center of the West Midlands, and its weather is highly variable due to geographical factors. Residential developments, road networks, and infrastructure often replace open spaces and vegetation. This transformation raises the temperature of urban areas and creates an "island" of higher temperatures in the urban landscape. Extreme heat in urban areas is affecting public health in the UK, as it is worldwide. Birmingham is a densely built-up area with skyscrapers and congested buildings in the city center, which act as a barrier to air circulation. We will investigate heat- and cold-related human mortality and other impacts in the city. We are using primary and secondary datasets to examine the effect of population shift and land-use change on the UHI in Birmingham; we will also use freely available weather data from the Birmingham Urban Observatory and will incorporate satellite data to determine urban spatial expansion and its effect on the UHI. We have produced a temperature map from the summer 2020 datasets, covering 25 weather stations in Birmingham, to show the differences between diurnal and nocturnal summer and annual temperature trends. Some impacts of the UHI may be beneficial, such as the lengthening of the plant growing season, but most are strongly negative. We are examining the various effects of urban heat on human health and investigating mitigation options.
Keywords: urban heat, public health, climate change
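As a rough illustration of the diurnal/nocturnal split behind such a temperature map, the sketch below aggregates hourly station records into daytime and night-time means. The column names, the 07:00-19:00 daytime window, and the synthetic data are assumptions, not the Birmingham Urban Observatory schema.

```python
import numpy as np
import pandas as pd

def day_night_means(df):
    """df: 'station' and 'temp_c' columns on an hourly DatetimeIndex."""
    is_day = df.index.hour.isin(range(7, 19))        # assumed daytime window
    period = np.where(is_day, "diurnal", "nocturnal")
    return (df.assign(period=period)
              .groupby(["station", "period"])["temp_c"]
              .mean()
              .unstack())

# Synthetic summer-2020-style record for one hypothetical station.
idx = pd.date_range("2020-06-01", "2020-08-31 23:00", freq="h")
demo = pd.DataFrame(
    {"station": "BHAM_01",
     "temp_c": 16 + 8 * np.sin((idx.hour - 4) / 24 * 2 * np.pi)},
    index=idx)
print(day_night_means(demo))
```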
Procedia PDF Downloads 96