Search results for: imaging sensitivity measurement
4246 Indicator-Based Approach for Assessing Socio Economic Vulnerability of Dairy Farmers to Impacts of Climate Variability and Change in India
Authors: Aparna Radhakrishnan, Jancy Gupta, R. Dileepkumar
Abstract:
This paper assesses the Socio-Economic Vulnerability (SEV) of dairy farmers to Climate Variability and Change (CVC) in three states of the Western Ghats region of India. A composite SEV index was developed on the basis of functional relationships among sensitivity, exposure and adaptive capacity, using 30 indicators related to dairy farming and following the principles of the Intergovernmental Panel on Climate Change and the Füssel framework for describing vulnerable situations. Household-level data were collected through Participatory Rural Appraisal and personal interviews of 540 dairy farmers in nine taluks, three each from a district selected from Kerala, Karnataka and Maharashtra, complemented by thirty years of gridded weather data. The data were normalized and combined into three indices for sensitivity, exposure and adaptive capacity, which were then averaged with weights derived from principal component analysis to obtain the overall SEV index. Results indicated that the taluks of the Western Ghats are vulnerable to CVC. The dairy farmers of Pulpally taluk were the most vulnerable, with an SEV score of +1.24 and 42.66% of farmers in the high-vulnerability category. Even though the taluks are geographically close, there is wide variation in the SEV components. Policies that subsidize the climate-risk adaptation costs of small and marginal farmers, livelihood infrastructure for mitigating risks, and promotion of grassroots innovations are necessary to sustain dairy farming in the region.
Keywords: climate change, dairy, vulnerability, livelihoods, adaptation strategies
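The aggregation step described above (normalize indicators, combine into sub-indices, weight into one score) can be sketched as follows. This is an illustrative outline only: the indicator values are hypothetical and fixed weights stand in for the paper's PCA-derived weights.

```python
# Illustrative sketch (not the authors' code): building a composite
# vulnerability index from normalized indicators. The paper derives
# weights via principal component analysis; here the weights are
# hypothetical fixed values for demonstration.

def min_max_normalize(values):
    """Scale indicator values to the 0-1 range."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def composite_index(sensitivity, exposure, adaptive_capacity, weights):
    """Combine the three sub-indices into one SEV score: higher
    sensitivity and exposure raise vulnerability, higher adaptive
    capacity lowers it."""
    ws, we, wa = weights
    return ws * sensitivity + we * exposure - wa * adaptive_capacity

# Hypothetical indicator values for one taluk
sens = sum(min_max_normalize([0.2, 0.8, 0.5])) / 3
expo = sum(min_max_normalize([0.1, 0.9, 0.6])) / 3
adap = sum(min_max_normalize([0.3, 0.4, 0.7])) / 3
sev = composite_index(sens, expo, adap, (0.4, 0.35, 0.25))
```
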
Procedia PDF Downloads 417
4245 Analysis on the Building Energy Performance of a Retrofitted Residential Building with RETScreen Expert Software
Authors: Abdulhameed Babatunde Owolabi, Benyoh Emmanuel Kigha Nsafon, Jeung-Soo Huh
Abstract:
Energy efficiency in residential buildings is a national issue in South Korea because most of the apartments built in past decades were constructed without proper energy efficiency measures, leaving the energy performance of old buildings very poor compared with new buildings. While the adoption of advanced building technologies and regulatory building codes are effective energy efficiency strategies for new construction, existing buildings need to be retrofitted with energy conservation measure (ECM) equipment in order to conserve energy and reduce GHG emissions. To achieve this, the Institute for Global Climate Change and Energy (IGCCE), Kyungpook National University (KNU), Daegu, South Korea employed RETScreen Expert software to carry out measurement and verification (M&V) analysis on an existing building in Korea, using six years of gas consumption data collected from Daesung Energy Co., Ltd, in order to determine the building's energy performance after the introduction of ECMs. Through the M&V, energy efficiency was assessed and occupants' doubts were reduced. The analysis showed that a total of 657 gigajoules (GJ) of liquefied natural gas (LNG) was consumed, at a rate of 0.34 GJ/day with a peak in 2015, costing the occupant $10,821.
Keywords: energy efficiency, measurement and verification, performance analysis, RETScreen Expert
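As a rough consistency sketch of the reported figures (not RETScreen output), the total consumption, daily rate, and cost can be related with simple arithmetic; the helper names below are illustrative.

```python
# Back-of-envelope consistency sketch (not RETScreen itself) relating
# the abstract's reported totals: total consumption, average daily
# rate, and the implied monitoring period and unit cost.

def implied_period_days(total_gj, rate_gj_per_day):
    """Days of monitoring implied by a total and an average rate."""
    return total_gj / rate_gj_per_day

def cost_per_gj(total_cost, total_gj):
    """Average unit cost of the consumed gas."""
    return total_cost / total_gj

days = implied_period_days(657, 0.34)   # ~1932 days, i.e. ~5.3 years
unit_cost = cost_per_gj(10821, 657)     # ~16.5 $/GJ
```
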
Procedia PDF Downloads 136
4244 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images
Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir
Abstract:
The landing phase of a UAV is very critical, as there are many uncertainties in this phase that can easily lead to a hard landing or even a crash. In this paper, the estimation of relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Accurate measurement sensors are either very expensive, such as LIDAR, or limited in operational range, such as ultrasonic sensors. Additionally, absolute positioning systems like GPS or IMU cannot provide distance to the ground independently. The focus of this paper is to determine whether the relative distance and velocity between the UAV and the ground can be measured in the landing phase using only low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable features in a series of images taken during the UAV landing. Two different approaches based on Extended Kalman Filters (EKF) are proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process and the calculated optical flow as the measurement; the second uses the feature's projection on the camera plane (pixel position) as the measurement, while employing both the kinematics of the UAV and the dynamics of the projected point's variation as the process, to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed has been used to compare the performance of the proposed algorithms. The case studies show that the image quality introduces considerable noise, which reduces the performance of the first approach.
On the other hand, using the projected feature position is much less sensitive to noise and estimates the distance and velocity with relatively high accuracy. This approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications.
Keywords: altitude estimation, drone, image processing, trajectory planning
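The filtering idea behind the distance/velocity estimation can be sketched with a simplified, linear one-dimensional stand-in for the paper's EKF. The state, noise levels, and scalar distance measurement below are illustrative assumptions, not the authors' implementation.

```python
# Minimal 1-D constant-velocity Kalman filter sketch (a linear
# simplification of the paper's EKF): estimate relative distance and
# vertical velocity from noisy scalar distance measurements.
import random

def kf_track(measurements, dt, r, q=1e-5):
    d, v = measurements[0], 0.0          # state: distance, velocity
    p = [[1.0, 0.0], [0.0, 1.0]]         # state covariance
    for z in measurements[1:]:
        # predict with constant-velocity kinematics: d' = d + v*dt
        d += v * dt
        p00 = p[0][0] + dt * (p[0][1] + p[1][0]) + dt * dt * p[1][1] + q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + q
        # update with the new measurement (H = [1, 0])
        s = p00 + r
        k0, k1 = p00 / s, p10 / s
        y = z - d
        d += k0 * y
        v += k1 * y
        p = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
    return d, v

# Simulated descent: 10 m altitude, sinking at 0.5 m/s, noisy readings
random.seed(0)
true_d, true_v, dt = 10.0, -0.5, 0.1
zs = [true_d + true_v * i * dt + random.gauss(0, 0.3) for i in range(200)]
d_est, v_est = kf_track(zs, dt, r=0.09)
```

With a matching motion model, the velocity estimate settles close to the true sink rate even though individual measurements are noisy, which is the property the second (pixel-position) approach exploits.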
Procedia PDF Downloads 109
4243 Analysis and Modeling of Graphene-Based Percolative Strain Sensor
Authors: Heming Yao
Abstract:
Graphene-based percolative strain gauges could find applications in touch panels, artificial skins, or human motion detection because of their advantages over conventional strain gauges, such as flexibility and transparency. These strain gauges rely on a novel sensing mechanism that depends on strain-induced morphology changes: once a compressive or tensile strain is applied, the overlap area between neighboring flakes becomes smaller or larger, which is reflected in a considerable change of resistance. A tiny strain change thus acts as a lever that greatly increases the resistance of the sensor, giving graphene-based percolative strain gauges a high gauge factor. Despite ongoing research into the underlying sensing mechanism and the limits of sensitivity, no adequate understanding has been obtained of which intrinsic factors play the key role in adjusting the gauge factor, nor of how the strain gauge sensitivity can be enhanced; such understanding would be valuable and would provide guidelines for designing novel, easily produced strain sensors with high gauge factors. Here we simulate the straining process by modeling graphene flakes and their percolative networks. We construct a 3D resistance network by simulating the overlapping process of graphene flakes and interconnecting the large number of resistance elements obtained by discretizing each flake. As strain increases, the overlapping flakes are displaced on the stretched simulated film, forming a new resistance network with a smaller flake number density. By solving the resistance network, we obtain the resistance of the simulated film under different strains.
Furthermore, by varying parameters such as out-of-plane resistance, in-plane resistance, and flake size, we obtained the trend of the gauge factor with each of these parameters. Comparison with experimental data verified the feasibility of our model and analysis. Increasing the out-of-plane resistance of the graphene flakes and the initial resistance of the sensor, based on the flake network, both improved the gauge factor, while smaller graphene flake size gave a greater gauge factor. This work can serve as a guideline for improving the sensitivity and applicability of graphene-based strain sensors and also provides a method for finding the gauge-factor limits of strain sensors based on graphene flakes. Moreover, our method can easily be transferred to predict the gauge factor of strain sensors based on other nano-structured transparent conductors, such as nanowires and carbon nanotubes, or their hybrids with graphene flakes.
Keywords: graphene, gauge factor, percolative transport, strain sensor
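The figure of merit compared across these simulations, the gauge factor, can be sketched as follows; the resistance values are hypothetical, not the paper's data.

```python
# Sketch of the gauge-factor definition used to compare sensors
# (hypothetical resistance values).

def gauge_factor(r0, r_strained, strain):
    """GF = (delta R / R0) / strain."""
    return ((r_strained - r0) / r0) / strain

# A percolative film whose resistance rises from 100 to 150 ohms under
# 1% strain has a far higher GF than a typical metal-foil gauge (~2).
gf = gauge_factor(100.0, 150.0, 0.01)
```
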
Procedia PDF Downloads 416
4242 Novel Point of Care Test for Rapid Diagnosis of COVID-19 Using Recombinant Nanobodies against SARS-CoV-2 Spike1 (S1) Protein
Authors: Manal Kamel, Sara Maher, Hanan El Baz, Faten Salah, Omar Sayyouh, Zeinab Demerdash
Abstract:
In the recent COVID-19 pandemic, public health experts have emphasized testing, tracking infected people, and tracing their contacts as an effective strategy to reduce the spread of the virus. Development of rapid and sensitive diagnostic assays to replace reverse transcription polymerase chain reaction (RT-PCR) is mandatory. Our innovative test strip relies on nanoparticles conjugated to recombinant nanobodies against the SARS-CoV-2 spike protein (S1) and angiotensin-converting enzyme 2 (the receptor responsible for virus entry into host cells) for rapid detection of the SARS-CoV-2 spike protein (S1) in saliva or sputum specimens. Comparative tests with RT-PCR were held to estimate the significance of using COVID-19 nanobodies, for the first time, in the development of a lateral flow test strip. SARS-CoV-2 S1 (3 ng of recombinant protein) was detected by our developed LFIA in saliva specimens of COVID-19 patients; no cross-reaction was detected with Middle East respiratory syndrome coronavirus (MERS-CoV) or SARS-CoV antigens. Our developed system revealed 96% sensitivity and 100% specificity for saliva samples, compared to 89% sensitivity and 100% specificity for nasopharyngeal swabs, providing a reliable alternative to the painful and uncomfortable nasopharyngeal swab process and the complex, time-consuming PCR test. An increase in testing compliance is to be expected.
Keywords: COVID 19, diagnosis, LFIA, nanobodies, ACE2
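The reported diagnostic metrics follow from a standard confusion-matrix calculation; the counts below are hypothetical, chosen only to illustrate the arithmetic behind figures of this order.

```python
# Sketch of how sensitivity and specificity are computed from a
# confusion matrix (hypothetical counts, not the study's data).

def sensitivity(tp, fn):
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# e.g. 48 of 50 infected samples detected, no false positives
sens = sensitivity(tp=48, fn=2)
spec = specificity(tn=40, fp=0)
```
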
Procedia PDF Downloads 135
4241 Non-Linear Assessment of Chromatographic Lipophilicity of Selected Steroid Derivatives
Authors: Milica Karadžić, Lidija Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Kovačević, Anamarija Mandić, Aleksandar Oklješa, Andrea Nikolić, Marija Sakač, Katarina Penov Gaši
Abstract:
Using a chemometric approach, the relationships between chromatographic lipophilicity and in silico molecular descriptors for twenty-nine selected steroid derivatives were studied. The chromatographic lipophilicity was predicted using the artificial neural network (ANN) method. The most important in silico molecular descriptors were selected by applying stepwise selection (SS) paired with the partial least squares (PLS) method. Molecular descriptors with satisfactory variable importance in projection (VIP) values were selected for ANN modeling. The usefulness of the generated models was confirmed by detailed statistical validation. High agreement between experimental and predicted values indicated that the obtained models have good quality and high predictive ability. Global sensitivity analysis (GSA) confirmed the importance of each molecular descriptor used as an input variable. High-quality networks indicate a strong non-linear relationship between chromatographic lipophilicity and the in silico molecular descriptors used. By applying the selected molecular descriptors and the generated ANNs, good predictions of the chromatographic lipophilicity of the studied steroid derivatives can be obtained. This article is based upon work from COST Actions (CM1306 and CA15222), supported by COST (European Cooperation in Science and Technology).
Keywords: artificial neural networks, chemometrics, global sensitivity analysis, liquid chromatography, steroids
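The sensitivity-analysis idea can be sketched in its simplest, one-at-a-time form on a synthetic response function standing in for the trained ANN. This is an illustrative simplification, not the GSA method used in the study.

```python
# Toy one-at-a-time sensitivity screen (a simplification of the global
# sensitivity analysis idea): perturb each input of a fixed response
# function and measure the output change. The function is synthetic,
# standing in for a trained model.

def response(x):
    # synthetic "lipophilicity" model: descriptor 0 dominates
    return 3.0 * x[0] + 0.5 * x[1] - 0.1 * x[2]

def oat_sensitivity(f, base, delta=0.01):
    """Absolute output change when each input is perturbed by delta."""
    y0 = f(base)
    out = []
    for i in range(len(base)):
        x = list(base)
        x[i] += delta
        out.append(abs(f(x) - y0))
    return out

effects = oat_sensitivity(response, [1.0, 1.0, 1.0])
```

The ranking of `effects` recovers which input dominates the output, mirroring (in miniature) what GSA establishes for each descriptor feeding the ANN.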
Procedia PDF Downloads 344
4240 Monitoring Soil Moisture Dynamic in Root Zone System of Argania spinosa Using Electrical Resistivity Imaging
Authors: F. Ainlhout, S. Boutaleb, M. C. Diaz-Barradas, M. Zunzunegui
Abstract:
Argania spinosa is an endemic tree of southwestern Morocco, occupying 828,000 ha distributed mainly between Mediterranean vegetation and the desert. It can grow in extremely arid regions of Morocco, with annual rainfall of 100-300 mm, where no other tree species can live, and its area has been designated a UNESCO Biosphere Reserve since 1998. The Argania tree is of great importance for human and animal feeding of the rural population as well as for oil production; it is considered a multi-use tree. Admine forest, located in the suburbs of Agadir city, 5 km inland, was selected for this work. The aim of the study was to investigate the temporal variation of root-zone moisture dynamics in response to variation in climatic conditions and vegetation water uptake, using a geophysical technique called electrical resistivity imaging (ERI). This technique discriminates resistive woody roots from dry and moist soil. Time-dependent measurements (from April to July) of resistivity sections were performed along a 94 m surface transect at a fixed 2 m electrode spacing. The transect included eight Argan trees. The interactions between the trees and soil moisture were estimated by following the variation of tree water status accompanying the soil moisture deficit; for that purpose, we measured midday leaf water potential and relative water content on each sampling day for the eight trees. The first results showed that ERI can accurately quantify the spatiotemporal distribution of root-zone moisture content and woody roots. The section obtained shows three different layers: a conductive middle layer (moist soil); above it, a moderately resistive layer corresponding to relatively dry soil (calcareous formations with intercalations of marly strata), interspersed with very resistive zones corresponding to woody roots; and below the conductive layer, another moderately resistive layer.
Throughout the experiment, there was a continuous decrease in soil moisture in the different layers. With ERI, we can clearly estimate the depth of the woody roots, which does not exceed 4 meters. In previous work on the same species, analyzing δ18O in xylem water and in the range of possible water sources, we argued that rain is the main water source in winter and spring but not in summer; contrary to the popular assumption, the trees are not exploiting deep water from the aquifer, but instead use soil water at a few meters' depth. The results of the present work confirm that the roots of Argania spinosa do not grow very deep.
Keywords: Argania spinosa, electrical resistivity imaging, root system, soil moisture
Procedia PDF Downloads 326
4239 Hydrogen Purity: Developing Low-Level Sulphur Speciation Measurement Capability
Authors: Sam Bartlett, Thomas Bacquart, Arul Murugan, Abigail Morris
Abstract:
Fuel cell electric vehicles provide the potential to decarbonise road transport, create new economic opportunities, diversify national energy supply, and significantly reduce the environmental impacts of road transport. A potential issue, however, is that the catalyst used at the fuel cell cathode is susceptible to degradation by impurities, especially sulphur-containing compounds. A recent European Directive (2014/94/EU) stipulates that, from November 2017, all hydrogen provided to fuel cell vehicles in Europe must comply with the hydrogen purity specifications listed in ISO 14687-2; this includes reactive and toxic chemicals such as ammonia and total sulphur-containing compounds. This requirement poses great analytical challenges due to the instability of some of these compounds in calibration gas standards at relatively low amount fractions and the difficulty of measuring groups of compounds rather than individual compounds. Without the reference materials and analytical infrastructure, hydrogen refuelling stations will not be able to demonstrate compliance with the ISO 14687 specifications. The hydrogen purity laboratory at NPL provides world-leading, accredited purity measurements that allow hydrogen refuelling stations to evidence compliance with ISO 14687. Utilising state-of-the-art methods developed by NPL's hydrogen purity laboratory, including a novel method for measuring total sulphur compounds at 4 nmol/mol and a hydrogen impurity enrichment device, we provide the capabilities necessary to achieve these goals. An overview of these capabilities is given in this paper.
As part of the EMPIR hydrogen co-normative project 'Metrology for sustainable hydrogen energy applications', NPL is developing a validated analytical methodology for the measurement of speciated sulphur-containing compounds in hydrogen at low amount fractions (pmol/mol to nmol/mol), to allow identification and measurement of individual sulphur-containing impurities in real samples of hydrogen (as opposed to a 'total sulphur' measurement). This is achieved by producing a suite of stable, gravimetrically prepared primary reference gas standards containing low amount fractions of sulphur-containing compounds (hydrogen sulphide, carbonyl sulphide, carbon disulphide, 2-methyl-2-propanethiol and tetrahydrothiophene have been selected for this study), used in conjunction with novel dynamic dilution facilities to generate pmol/mol to nmol/mol level gas mixtures (a dynamic method is required because compounds at these levels would be unstable in gas cylinder mixtures). Method development and optimisation are performed using gas chromatographic techniques assisted by cryo-trapping technologies and coupled with sulphur chemiluminescence detection, allowing improved qualitative and quantitative analyses of sulphur-containing impurities in hydrogen. The paper will review state-of-the-art gas standard preparation techniques, including the use and testing of dynamic dilution technologies for reactive chemical components in hydrogen. Method development will also be presented, highlighting the advances in the measurement of speciated sulphur compounds in hydrogen at low amount fractions.
Keywords: gas chromatography, hydrogen purity, ISO 14687, sulphur chemiluminescence detector
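The dynamic-dilution arithmetic can be sketched as follows; the flows and parent fraction are hypothetical illustrations, not the facility's settings.

```python
# Sketch of dynamic-dilution arithmetic: the output amount fraction of
# a reactive impurity is set by the ratio of standard flow to total
# flow (illustrative values only).

def diluted_fraction(parent_fraction, flow_standard, flow_diluent):
    """Amount fraction after blending a standard with pure diluent."""
    return parent_fraction * flow_standard / (flow_standard + flow_diluent)

# Diluting a 100 nmol/mol H2S-in-hydrogen standard 1000-fold
# (10 mL/min standard into 9990 mL/min hydrogen) gives 0.1 nmol/mol,
# i.e. 100 pmol/mol.
out = diluted_fraction(100e-9, 10.0, 9990.0)
```

This illustrates why a dynamic method reaches pmol/mol levels that would be unstable if stored in a cylinder: the low-fraction mixture exists only at the point of generation.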
Procedia PDF Downloads 224
4238 The Current Ways of Thinking Mild Traumatic Brain Injury and Clinical Practice in a Trauma Hospital: A Pilot Study
Authors: P. Donnelly, G. Mitchell
Abstract:
Traumatic Brain Injury (TBI) is a major contributor to the global burden of disease; despite its ubiquity, there is significant variation in diagnosis, prognosis, and treatment between clinicians. This study examines the spectrum of approaches that currently exist at a Level 1 Trauma Centre in Australasia by surveying Emergency Physicians and Neurosurgeons on aspects of mild TBI (mTBI). A pilot survey of 17 clinicians (Neurosurgeons, Emergency Physicians, and others who manage patients with mTBI) at a Level 1 Trauma Centre in Brisbane, Australia, was conducted. The objective was to examine the importance these clinicians place on various elements of their approach to the diagnosis, prognostication, and treatment of mTBI. The data were summarised and descriptive statistics reported. Loss of consciousness and post-traumatic amnesia were rated as the most important signs or symptoms in diagnosing mTBI (median importance of 8). MRI was the most important imaging modality in diagnosing mTBI (median importance of 7). 'Number of Previous TBIs' and 'Intracranial Injury on Imaging' were rated as the most important elements for prognostication (median importance of 9). Education and reassurance were rated as the most important modality for treating mTBI (median importance of 7). The variation between the specialties in the importance placed on each of these components was not statistically significant. In this Australian tertiary trauma centre, there appears to be variation in how clinicians approach mTBI, though this study is underpowered to state whether the variation lies between clinicians within a specialty or between specialties. This variation is worth investigating as a step toward a unified approach to diagnosing, prognosticating, and treating this common pathology.
Keywords: mild traumatic brain injury, adult, clinician, survey
Procedia PDF Downloads 129
4237 Optimizing Nature Protection and Tourism in Urban Parks
Authors: Milena Lakicevic
Abstract:
The paper deals with optimizing management options for urban parks under different scenarios of the importance of nature protection and tourism. The procedure is demonstrated on a case study of urban parks in Novi Sad (Serbia). Six management strategies for the selected area were processed by the decision support method PROMETHEE. The two evaluation criteria were nature protection and tourism, each divided into a set of indicators: for nature protection, biodiversity and preservation of the original landscape; for tourism, recreation potential, aesthetic values, accessibility, and cultural features. Each indicator in a set was assumed to be equally important to the corresponding criterion, so the research focused on a sensitivity analysis of the criteria weights. In other words, the weights of the indicators were fixed while the weights of the criteria were varied along the entire scale (from 0 to 1), and the assessment was performed in this two-dimensional setting. As a result, one could conclude which management strategy is the most appropriate as the importance of the criteria changes. The final ranking of management alternatives was followed by investigating the mean PROMETHEE Φ values for all options considered while altering the importance of nature protection versus tourism. This type of analysis enables detecting an alternative with solid performance along the entire scale, i.e., regardless of criteria importance. That management strategy can be seen as a compromise solution when the criteria weights are not defined. In conclusion, in some cases, instead of fixing criteria importance, it is important to test the outputs under different schemes of criteria weighting.
The research demonstrates the final decision both when the decision maker can estimate criteria importance and when the importance of the criteria is not established or known.
Keywords: criteria weights, PROMETHEE, sensitivity analysis, urban parks
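The weight-sweep idea can be sketched with a simple weighted-sum stand-in for PROMETHEE; the strategy scores are hypothetical, and real PROMETHEE ranks via pairwise preference flows rather than weighted sums.

```python
# Sketch of a criteria-weight sensitivity sweep (weighted-sum stand-in
# for PROMETHEE, hypothetical scores): rank alternatives while the
# nature-protection weight w runs from 0 to 1 (tourism weight = 1 - w).

scores = {                     # (nature protection, tourism), 0-1 scale
    "strategy A": (0.9, 0.3),  # strong on protection
    "strategy B": (0.7, 0.7),  # balanced "compromise" option
    "strategy C": (0.2, 0.9),  # strong on tourism
}

def best_strategy(w):
    """Winner under weight w for nature protection."""
    return max(scores, key=lambda s: w * scores[s][0] + (1 - w) * scores[s][1])

winners = [best_strategy(w / 10) for w in range(11)]
```

The sweep shows strategy C winning at low w, strategy A at high w, and the balanced strategy B over a middle band, which is exactly the kind of robust "compromise" alternative the analysis above looks for.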
Procedia PDF Downloads 186
4236 Comparison of Radiation Dosage and Image Quality: Digital Breast Tomosynthesis vs. Full-Field Digital Mammography
Authors: Okhee Woo
Abstract:
Purpose: With increasing concern over individual radiation exposure doses, studies analyzing radiation dosage in breast imaging modalities are required. The aim of this study is to compare radiation dosage and image quality between digital breast tomosynthesis (DBT) and full-field digital mammography (FFDM). Methods and Materials: 303 patients (mean age 52.1 years) who underwent both DBT and FFDM were retrospectively reviewed. Radiation dosage data were obtained from a radiation dose scoring and monitoring program, Radimetrics (Bayer HealthCare, Whippany, NJ). Entrance dose and mean glandular dose in each breast were obtained for both imaging modalities. To compare the image quality of DBT with two-dimensional synthesized mammogram (2DSM) and FFDM, lesion clarity was assessed on a 5-point scale and the better of the two modalities was selected. Interobserver performance was compared with kappa values, and diagnostic accuracy was compared using the McNemar test. Radiation dosages (entrance dose, mean glandular dose) and image quality were compared between the two modalities using the paired t-test and the Wilcoxon rank-sum test. Results: For entrance dose and mean glandular dose in each breast, DBT had lower values than FFDM (p-value < 0.0001). Diagnostic accuracy did not differ significantly, but the lesion clarity score was higher for DBT with 2DSM, and DBT was chosen as the better modality compared with FFDM. Conclusion: DBT showed a lower entrance dose and lower mean glandular doses to both breasts than FFDM. DBT with 2DSM also had better image quality than FFDM with similar diagnostic accuracy, suggesting that DBT has the potential to be performed as an alternative to FFDM.
Keywords: radiation dose, DBT, digital mammography, image quality
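The paired dose comparison can be sketched as follows; the per-patient doses are hypothetical, and the block computes only the paired t statistic, not the p-value.

```python
# Sketch of the paired t-test used to compare per-patient dose between
# two modalities (hypothetical doses, stdlib only).
import math

def paired_t(xs, ys):
    """t statistic for paired samples, based on differences x_i - y_i."""
    n = len(xs)
    diffs = [x - y for x, y in zip(xs, ys)]
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

ffdm = [1.60, 1.52, 1.71, 1.48, 1.66]   # mean glandular dose, mGy
dbt = [1.35, 1.30, 1.42, 1.28, 1.38]
t = paired_t(ffdm, dbt)                 # large t: consistent reduction
```

Pairing each patient with themselves removes between-patient variability, which is why a consistent dose reduction yields a large t statistic even in a small sample.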
Procedia PDF Downloads 348
4235 Development and Validation of High-Performance Liquid Chromatography Method for the Determination and Pharmacokinetic Study of Linagliptin in Rat Plasma
Authors: Hoda Mahgoub, Abeer Hanafy
Abstract:
Linagliptin (LNG) belongs to the dipeptidyl peptidase-4 (DPP-4) inhibitor class. DPP-4 inhibitors represent a new therapeutic approach to the treatment of type 2 diabetes in adults. The aim of this work was to develop and validate an accurate and reproducible HPLC method for the determination of LNG with high sensitivity in rat plasma. The method involved separation of LNG and pindolol (internal standard) at ambient temperature on a Zorbax Eclipse XDB C18 column with a mobile phase composed of 75% methanol : 25% 0.1% formic acid (pH 4.1) at a flow rate of 1.0 mL.min-1. UV detection was performed at 254 nm. The method was validated in compliance with ICH guidelines and found to be linear in the range of 5-1000 ng.mL-1. The limit of quantification (LOQ) was 5 ng.mL-1 based on 100 µL of plasma. The variations for intra- and inter-assay precision were less than 10%, and accuracy values ranged between 93.3% and 102.5%. The extraction recovery (R%) was more than 83%. The method involved a single extraction step from a very small plasma volume (100 µL). The assay was successfully applied to an in-vivo pharmacokinetic study of LNG in rats administered a single oral dose of 10 mg.kg-1 LNG. The maximum concentration (Cmax) was 927.5 ± 23.9 ng.mL-1, and the area under the plasma concentration-time curve (AUC0-72) was 18285.02 ± 605.76 h.ng.mL-1. In conclusion, the good accuracy and low LOQ of the bioanalytical HPLC method were suitable for monitoring the full pharmacokinetic profile of LNG in rats. The main advantages of the method are its sensitivity, small sample volume, single-step extraction procedure, and short analysis time.
Keywords: HPLC, linagliptin, pharmacokinetic study, rat plasma
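A value such as AUC0-72 is typically obtained from the concentration-time points by the trapezoidal rule; the sketch below uses hypothetical sampling times and concentrations, not the study's data.

```python
# Sketch of a trapezoidal-rule AUC computation for a concentration-time
# curve (hypothetical points).

def auc_trapezoid(times, concs):
    """Area under the concentration-time curve by linear trapezoids."""
    area = 0.0
    for i in range(1, len(times)):
        area += 0.5 * (concs[i] + concs[i - 1]) * (times[i] - times[i - 1])
    return area

t_h = [0, 1, 2, 4, 8, 24, 72]                # sampling times, hours
c_ng_ml = [0, 900, 750, 600, 400, 150, 20]   # plasma conc., ng/mL
auc = auc_trapezoid(t_h, c_ng_ml)            # h.ng/mL
```
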
Procedia PDF Downloads 240
4234 Understanding Chromosome Movement in Starfish Oocytes
Authors: Bryony Davies
Abstract:
Many cell and tissue culture practices ignore the effects of gravity on cell biology, and little is known about how cell components may move in response to gravitational forces. Starfish oocytes provide an excellent model for interrogating the movement of cell components due to their unusually large size, ease of handling, and high transparency. Chromosomes in starfish oocytes can be visualised by microinjection of a histone-H2B-mCherry plasmid into the oocytes, and their movement can then be tracked by live-cell fluorescence microscopy. The results from experiments using these methods suggest that there is a replicable downward movement of centrally located chromosomes at a median velocity of 0.39 μm/min, while chromosomes nearer the nuclear boundary showed more restricted movement. Chromosome density and shape could also be altered by microinjection of restriction enzymes, primarily AluI, before imaging. This was found to alter the speed of chromosome movement, with chromosomes from AluI-injected nuclei showing a median downward velocity of 0.60 μm/min. Overall, these results suggest that there is a non-negligible movement of chromosomes in response to gravitational forces and that this movement can be altered by enzyme activity. Future directions based on these results could interrogate whether this observed downward movement extends to other cell components and other cell types. Additionally, it may be important to understand whether the gravitational orientation and vertical positioning of cell components alter cell behaviour. The findings here may have implications for current cell culture practices, which do not replicate the cell orientations or external forces experienced in vivo. It is possible that a failure to account for gravitational forces in 2D cell culture alters experimental results and the accuracy of conclusions drawn from them.
Understanding possible behavioural changes in cells due to the effects of gravity would therefore be beneficial.
Keywords: starfish, oocytes, live-cell imaging, microinjection, chromosome dynamics
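Turning tracked positions into a median velocity, as in the analysis above, can be sketched as follows; the positions and frame interval are hypothetical.

```python
# Sketch of computing a median downward velocity from tracked
# chromosome positions (hypothetical track, stdlib only).
import statistics

def downward_velocities(y_positions_um, dt_min):
    """Frame-to-frame downward velocity in um/min (positive = moving
    down, assuming y increases downward in image coordinates)."""
    return [(y2 - y1) / dt_min
            for y1, y2 in zip(y_positions_um, y_positions_um[1:])]

track = [10.0, 10.8, 11.5, 12.4, 13.1]   # micrometres, one frame apart
v = downward_velocities(track, dt_min=2.0)
median_v = statistics.median(v)
```

Using the median rather than the mean makes the estimate robust to occasional tracking glitches, which is why per-track medians are a natural summary here.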
Procedia PDF Downloads 102
4233 Development of Pothole Management Method Using Automated Equipment with Multi-Beam Sensor
Authors: Sungho Kim, Jaechoul Shin, Yujin Baek, Nakseok Kim, Kyungnam Kim, Shinhaeng Jo
Abstract:
Climate change and the increase in heavy traffic have been accelerating damage such as potholes on asphalt pavement. Potholes cause traffic accidents, vehicle damage, road casualties, and traffic congestion. A quick and efficient maintenance method is needed because potholes are caused by stripping and accelerate pavement distress. In this study, we propose rapid and systematic pothole management through the development of automated pothole repair equipment that includes a pothole volume measurement system. Three kinds of cold-mix asphalt mixture were investigated to select repair materials; the materials were evaluated for compliance with quality standards and applicability to the automated equipment. The pothole volume measurement system is composed of multi-beam sensors combining laser and ultrasonic sensors, installed at the front and side of the automated repair equipment. An algorithm was proposed to calculate the amount of repair material from the measured pothole volume, and a system for releasing the correct amount of material was developed. Field test results showed that the loss of repair material could be reduced from approximately 20% to 6% per pothole. Rapid automated pothole repair equipment will contribute to improved quality and to efficient, economical maintenance, not only by reducing materials and resources but also by dispensing appropriate amounts of material. Through field application, it is possible to improve the accuracy of pothole volume measurement, to refine the calculation of material amount, and to manage pothole data for roads, enabling more efficient pavement maintenance management. Acknowledgment: The authors would like to thank the MOLIT (Ministry of Land, Infrastructure, and Transport). This work was carried out through a project funded by the MOLIT.
The project name is 'development of 20mm grade for road surface detecting roadway condition and rapid detection automation system for removal of pothole'.
Keywords: automated equipment, management, multi-beam sensor, pothole
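The volume-to-material calculation can be sketched as follows; the mix density and loss factors are hypothetical, with the loss figures mirroring the reported improvement from about 20% to about 6%.

```python
# Sketch of converting a measured pothole volume into a repair-material
# amount (hypothetical density and loss factors).

def repair_material_kg(volume_m3, density_kg_m3, loss_fraction):
    """Material to dispense: measured volume times mix density,
    inflated by the expected loss."""
    return volume_m3 * density_kg_m3 * (1.0 + loss_fraction)

vol = 0.012                                       # m^3, from multi-beam scan
manual = repair_material_kg(vol, 2350.0, 0.20)    # ~20% loss, conventional
auto = repair_material_kg(vol, 2350.0, 0.06)      # ~6% loss, automated
saving = manual - auto                            # kg saved per pothole
```
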
Procedia PDF Downloads 222
4232 Did Chilling Injury of Rice Decrease under Climate Warming? A Case Study in Northeast China
Authors: Fengmei Yao, Pengcheng Qin, Jiahua Zhang, Min Liu
Abstract:
Global warming is expected to reduce the risk of low-temperature stress in rice grown in temperate regions, but this impact has not been well verified by empirical studies directly addressing chilling injury in rice. In this study, a case study in Northeast China was presented to investigate whether the frequency of chilling injury declined as a result of climate change, taking into comprehensive consideration the potential effects of the autonomous adaptation of rice production to climate change, such as shifts in cultivation timing and rice cultivars. It was found that the frequency of total chilling injury (either the delayed-growth type or the sterile type in a year) decreased, but only to a limited extent, in the context of climate change, mainly owing to a pronounced decrease in the frequency of delayed-growth chilling injury; there was no overwhelming decreasing tendency in the frequency of sterile-type chilling injury, which even increased considerably in some regions. If changes in cultivars had not occurred, risks of chilling injury of both types would have been much lower, particularly for the sterile-type chilling injury, by avoiding the deterioration in chilling sensitivity of rice cultivars. In addition, earlier planting helped lower the risk of chilling injury but still could not outweigh the effect of the introduction of new cultivars. It was concluded that risks of chilling injury in rice will not necessarily decrease as a result of climate change, considering that the accompanying adaptation process may increase the chilling sensitivity of the rice production system under warmer climate conditions, and thus precautions should still be taken. Keywords: chilling injury, rice, CERES-rice model, climate warming, Northeast China
Procedia PDF Downloads 332
4231 The Effect of TQM Implementation on Bahrain Industrial Performance
Authors: Bader Al-Mannai, Saad Sulieman, Yaser Al-Alawi
Abstract:
Research studies worldwide have undoubtedly demonstrated that the implementation of a Total Quality Management (TQM) program can improve organizations' competitive abilities and provide strategic quality advances. However, limited empirical studies and research have been directed at measuring the effectiveness of TQM implementation on the performance of industrial and manufacturing organizations. Accordingly, this paper is aimed at discussing "the degree of TQM implementation in Bahrain industries and its effect on their performance". The paper presents the measurement indicators and success factors that were used to assess the degree of TQM implementation in Bahrain industry, and the main performance indicators that were affected by TQM implementation. The research methodology adopted in this study was a survey based on a self-completion questionnaire. The sample population represented the industrial and manufacturing organizations in Bahrain. The study led to the identification of the operational and strategic measurement indicators and success factors that assist organizations in realizing successful TQM implementation and performance improvement. Furthermore, the research analysis confirmed a positive and significant relationship between the examined performance indicators in Bahrain industry and TQM implementation. In conclusion, the investigation of this relationship revealed that the implementation of the TQM program resulted in remarkable improvements in workforce, sales performance, and quality performance indicators in Bahrain industry. Keywords: performance indicators, success factors, TQM implementation, Bahrain
Procedia PDF Downloads 550
4230 A Study on the Relationship between Nonverbal Communication, Perceived Expertise, Service Quality, and Trust: Focusing on Cosmetic Stores in Korea
Authors: Park Sung-Min, Chan Kwon Park, Kim Chae-Bogk
Abstract:
This study aims to analyze the relationship between nonverbal communication, perceived expertise, service quality, and trust. The study was conducted with clients of cosmetic stores in the Daegu area of Korea. Based on prior studies, the measurement questions were amended and organized appropriately for this study. Exploratory factor analysis was performed using SPSS 22 on the constructed measurement questions, and PLS 2.0 was used to perform confirmatory factor analysis and path analysis. As a result of the analysis, nonverbal communication was categorized into physical appearance, kinesics, vocal behavior, and proxemics. All of the factors of nonverbal communication were shown to have a significant positive (+) effect on perceived expertise. The degree of impact on perceived expertise was found to follow the order of physical appearance, vocal behavior, kinesics, and proxemics. Perceived expertise was analyzed to have a significant positive (+) effect on service quality, and service quality was shown to have a significant positive (+) effect on trust. Keywords: nonverbal communication, perceived expertise, service quality, trust
Procedia PDF Downloads 283
4229 Evaluation of the Diagnostic Potential of IL-2 after Specific Antigen Stimulation with PE35 (Rv3872) and PPE68 (Rv3873) for the Discrimination of Active and Latent Tuberculosis
Authors: Shima Mahmoudi, Babak Pourakbari, Setareh Mamishi, Mostafa Teymuri, Majid Marjani
Abstract:
Although cytokine analysis has greatly contributed to the understanding of tuberculosis (TB) pathogenesis, data on cytokine profiles that might distinguish progression from latency of TB infection are scarce. Since PE/PPE proteins are known to induce strong humoral and cellular immune responses, the aim of this study was to evaluate the diagnostic potential of interleukin-2 (IL-2) as a biomarker after specific antigen stimulation with PE35 and PPE68 for the discrimination of active TB and latent tuberculosis infection (LTBI). The production of IL-2 was measured in antigen-stimulated whole-blood supernatants following stimulation with recombinant PE35 and PPE68. All the patients with active TB and LTBI had a positive QuantiFERON-TB Gold in Tube test. The levels of IL-2 following stimulation with recombinant PE35 and PPE68 were significantly higher in the LTBI group than in patients with active TB infection or the control group. The discrimination performance (assessed by the area under the ROC curve) for IL-2 following stimulation with recombinant PE35 and PPE68 between LTBI and patients with active TB was 0.837 (95% CI: 0.72-0.97) and 0.75 (95% CI: 0.63-0.89), respectively. Applying the 12.4 pg/mL cutoff for IL-2 induced by PE35 in the present study population resulted in a sensitivity of 78%, specificity of 78%, PPV of 78%, and NPV of 100%. In addition, a sensitivity of 81%, specificity of 70%, PPV of 67%, and NPV of 87% were reported based on the 4.4 pg/mL cutoff for IL-2 induced by PPE68. In conclusion, peptides of the antigens PE35 and PPE68, absent from commonly used BCG strains, stimulated strong IL-2-positive T-cell responses in patients with LTBI. This study confirms IL-2 induced by PE35 and PPE68 as a sensitive and specific biomarker and highlights IL-2 as a new promising adjunct marker for discriminating LTBI and active TB infection. Keywords: IL-2, PE35, PPE68, tuberculosis
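Cutoff-based performance figures of the kind quoted above can be derived from raw measurements with a routine like the following sketch. The IL-2 values used in the example are synthetic placeholders for illustration only, not the study's data.

```python
def cutoff_performance(ltbi_values, active_tb_values, cutoff):
    """Classify a sample as LTBI when its IL-2 level meets the cutoff,
    then derive sensitivity, specificity, PPV, and NPV."""
    tp = sum(v >= cutoff for v in ltbi_values)      # LTBI correctly flagged
    fn = len(ltbi_values) - tp
    tn = sum(v < cutoff for v in active_tb_values)  # active TB correctly excluded
    fp = len(active_tb_values) - tn
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if (tp + fp) else None,
        "npv": tn / (tn + fn) if (tn + fn) else None,
    }

# Synthetic illustration with the 12.4 pg/mL cutoff mentioned in the abstract
ltbi = [25.0, 14.1, 13.0, 9.8]    # hypothetical IL-2 levels, LTBI group
active = [3.2, 6.5, 11.9, 15.0]   # hypothetical IL-2 levels, active TB group
result = cutoff_performance(ltbi, active, 12.4)
print(result)  # all four metrics are 0.75 on this toy data
```

Sweeping the cutoff over the observed range and plotting sensitivity against (1 - specificity) yields the ROC curve whose area is reported in the abstract.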
Procedia PDF Downloads 408
4228 Visibility Measurements Using a Novel Open-Path Optical Extinction Analyzer
Authors: Nabil Saad, David Morgan, Manish Gupta
Abstract:
Visibility has become a key component of air quality and is regulated in many areas by environmental laws such as the EPA Clean Air Act and Regional Haze Rule. Typically, visibility is calculated by estimating the optical absorption and scattering of both gases and aerosols. A major component of the aerosols' climatic effect is due to their scattering and absorption of solar radiation, which are governed by their optical and physical properties. However, the accurate assessment of this effect on global warming, climate change, and air quality is made difficult by uncertainties in the calculation of the single scattering albedo (SSA). Experimental complications arise in determining the single scattering albedo of an aerosol particle, since it requires the simultaneous measurement of both scattering and extinction. In fact, aerosol optical absorption in particular is a difficult measurement to perform, and it is often associated with large uncertainties when using filter methods or difference methods. In this presentation, we demonstrate the use of a new open-path Optical Extinction Analyzer (OEA) in conjunction with a nephelometer and two particle sizers, emphasizing the benefits that co-employment of the OEA offers for deriving the complex refractive index of aerosols and their single scattering albedo. Various use cases, data reproducibility, and instrument calibration will also be presented to highlight the value proposition of this novel open-path OEA. Keywords: aerosols, extinction, visibility, albedo
Procedia PDF Downloads 88
4227 Constructing Optimized Criteria of Objective Assessment Indicators among Elderly Frailty
Authors: Shu-Ching Chiu, Shu-Fang Chang
Abstract:
The World Health Organization (WHO) has been actively developing intervention programs to deal with geriatric frailty. In its White Paper on Healthcare Policy 2020, the Department of Health, Bureau of Health Promotion proposed that active aging and the prevention of disability are essential for elderly people to maintain good health. The paper recommended five main policies relevant to this objective, one of which is the prevention of frailty and disability. Scholars have proposed a number of different criteria to diagnose and assess frailty; no consistent or normative standard of measurement is currently available. In addition, many methods of assessment are retrospective, which can easily result in recall bias. Given the relationship between frailty, physical fitness, and co-morbidity, it is important that academics optimize the criteria used to assess frailty by objectively evaluating the physical fitness of senior citizens. This study used a review of the literature to identify fitness indicators suitable for measuring frailty in the elderly. This study recommends that measurement criteria be integrated to produce an optimized predictive frailty score. Healthcare professionals could use these data to detect frailty at an early stage and provide appropriate care to prevent further debilitation and increase longevity. Keywords: frailty, aging, physical fitness, optimized criteria, healthcare
Procedia PDF Downloads 354
4226 Multimodal Integration of EEG, fMRI and Positron Emission Tomography Data Using Principal Component Analysis for Prognosis in Coma Patients
Authors: Denis Jordan, Daniel Golkowski, Mathias Lukas, Katharina Merz, Caroline Mlynarcik, Max Maurer, Valentin Riedl, Stefan Foerster, Eberhard F. Kochs, Andreas Bender, Ruediger Ilg
Abstract:
Introduction: So far, clinical assessments that rely on behavioral responses to differentiate coma states or even predict outcome in coma patients are unreliable, e.g., because of some patients' motor disabilities. The present study aimed to provide prognosis in coma patients using markers from the electroencephalogram (EEG), blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI), and [18F]-fluorodeoxyglucose (FDG) positron emission tomography (PET). Unsupervised principal component analysis (PCA) was used for the multimodal integration of markers. Methods: With the approval of the local ethics committee of the Technical University of Munich (Germany), 20 patients (aged 18-89) with severe brain damage were recruited through intensive care units at the Klinikum rechts der Isar in Munich and at the Therapiezentrum Burgau (Germany). On the day of the EEG/fMRI/PET measurement (date I), patients (<3.5 months in coma) were grouped into the minimally conscious state (MCS) or vegetative state (VS) on the basis of their clinical presentation (coma recovery scale-revised, CRS-R). Follow-up assessment (date II) was also based on the CRS-R in a period of 8 to 24 months after date I. At date I, 63-channel EEG (Brain Products, Gilching, Germany) was recorded outside the scanner, and subsequently simultaneous FDG-PET/fMRI was acquired on an integrated Siemens Biograph mMR 3T scanner (Siemens Healthineers, Erlangen, Germany). Power spectral densities, permutation entropy (PE), and symbolic transfer entropy (STE) were calculated in and between frontal, temporal, parietal, and occipital EEG channels. PE and STE are based on symbolic time series analysis and have already been introduced as robust markers separating wakefulness from unconsciousness in EEG during general anesthesia.
While PE quantifies the regularity structure of the neighboring order of signal values (a surrogate of cortical information processing), STE reflects the information transfer between two signals (a surrogate of directed connectivity in cortical networks). fMRI analysis was carried out using SPM12 (Wellcome Trust Centre for Neuroimaging, University College London, UK). Functional images were realigned, segmented, normalized, and smoothed. PET was acquired for 45 minutes in list mode. For absolute quantification of the brain's glucose consumption rate in FDG-PET, kinetic modelling was performed with Patlak's plot method. BOLD signal intensity in fMRI and glucose uptake in PET were calculated in 8 distinct cortical areas. PCA was performed over all markers from EEG/fMRI/PET. Prognosis (persistent VS and deceased patients vs. recovery to MCS/awake from date I to date II) was evaluated using the area under the curve (AUC), including bootstrap confidence intervals (CI; *: p<0.05). Results: Prognosis was reliably indicated by the first component of the PCA (AUC=0.99*, CI=0.92-1.00), showing a higher AUC than the best single markers (EEG: AUC<0.96*, fMRI: AUC<0.86*, PET: AUC<0.60). The CRS-R did not show prediction (AUC=0.51, CI=0.29-0.78). Conclusion: In a multimodal analysis of EEG/fMRI/PET in coma patients, PCA led to a reliable prognosis. The impact of this result is evident, as clinical estimates of prognosis are at present inexact and could be supported by quantitative biomarkers from EEG, fMRI, and PET. Due to the small sample size, further investigations are required, in particular allowing supervised learning instead of the basic approach of unsupervised PCA. Keywords: coma states and prognosis, electroencephalogram, entropy, functional magnetic resonance imaging, machine learning, positron emission tomography, principal component analysis
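As a rough illustration of the ordinal-pattern idea behind PE, the following sketch computes a normalized permutation entropy in the Bandt-Pompe style; the embedding order and delay shown are illustrative defaults, not the study's settings.

```python
import math

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D signal: 0 for a perfectly
    regular signal, approaching 1 when ordinal patterns are uniform."""
    counts = {}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = tuple(x[i + j * delay] for j in range(order))
        # Rank the samples within the window to get its ordinal pattern
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    entropy = sum(p * math.log(1.0 / p) for p in probs)
    return entropy / math.log(math.factorial(order))

print(permutation_entropy(list(range(50))))  # monotone ramp -> 0.0
print(round(permutation_entropy([4, 7, 9, 10, 6, 11, 3], order=2), 3))
```

A low PE indicates a highly regular signal; EEG during unconsciousness typically yields lower values than during wakefulness, which is what makes PE useful as a marker here.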
Procedia PDF Downloads 337
4225 A Study on the Relationship between Transaction Fairness, Social Capital, Supply Chain Integration and Sustainability: Focusing on Manufacturing Companies of South Korea
Authors: Sung-Min Park, Chan Kwon Park, Chae-Bogk Kim
Abstract:
The purpose of this study is to analyze the relationship between transaction fairness, social capital, supply chain integration, and sustainability. Based on previous studies, measurement items were determined, and exploratory factor analysis was performed using SPSS 22; then, using AMOS 21, confirmatory factor analysis and path analysis were performed with study items satisfying the reliability, validity, and appropriateness of the measurement model. It was shown that transaction fairness has a significant positive (+) effect on social capital, social capital on supply chain integration, and supply chain integration on economic and social sustainability, while its effect on environmental sustainability is positive (+) but not significant. Supply chain integration was proven to play a mediating role between social capital and economic and social sustainability, but not between social capital and environmental sustainability. Through this study, it is suggested that clearly examining the relationship between transaction fairness, social capital, supply chain integration, and sustainability, and maintaining transaction fairness, fosters the formation of social capital, promotes further supply chain integration, and helps achieve the sustainability of the entire supply chain. Keywords: transaction fairness, social capital, supply chain integration, sustainability
Procedia PDF Downloads 439
4224 Evaluation of a Data Fusion Algorithm for Detecting and Locating a Radioactive Source through Monte Carlo N-Particle Code Simulation and Experimental Measurement
Authors: Hadi Ardiny, Amir Mohammad Beigzadeh
Abstract:
Through the utilization of a combination of various sensors and data fusion methods, the detection of potential nuclear threats can be significantly enhanced by extracting more information from different data sources. In this research, an experimental and modeling approach was employed to track a radioactive source by combining a surveillance camera and a radiation detector (NaI). To run this experiment, three mobile robots were utilized, one of which was equipped with a radioactive source. An algorithm was developed to identify the contaminated robot through the correlation between the camera images and the detector data. The computer vision method extracts the movements of all robots in the XY-plane coordinate system, and the detector system records the gamma-ray count. The positions of the robots and the corresponding counts of the moving source were modeled using the MCNPX simulation code while considering the experimental geometry. The results demonstrated a high level of accuracy in finding and locating the target in both the simulation model and the experimental measurement. Such modeling techniques prove to be valuable for designing different scenarios and intelligent systems before initiating any experiments. Keywords: nuclear threats, radiation detector, MCNPX simulation, modeling techniques, intelligent systems
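One simple way to realize the camera/detector correlation described above is to compare each robot's inverse-square distance profile (from its vision-derived track) against the recorded count-rate series. The sketch below is an illustrative assumption about how such a fusion step could work, not the authors' algorithm; the detector position, tracks, and counts are made-up values.

```python
def identify_source_robot(tracks, counts, detector_xy=(0.0, 0.0)):
    """tracks: {robot_id: [(x, y), ...]} positions per time step;
    counts: gamma counts per time step. Returns the robot whose
    1/d^2 profile correlates best (Pearson r) with the counts."""
    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        return cov / (va * vb) ** 0.5

    best_id, best_r = None, -2.0
    dx0, dy0 = detector_xy
    for rid, xy in tracks.items():
        # Predicted relative intensity from the inverse-square law
        pred = [1.0 / ((x - dx0) ** 2 + (y - dy0) ** 2) for x, y in xy]
        r = pearson(pred, counts)
        if r > best_r:
            best_id, best_r = rid, r
    return best_id

tracks = {"A": [(5, 0), (4, 0), (3, 0), (2, 0), (1, 0)],   # approaching
          "B": [(1, 0), (2, 0), (3, 0), (4, 0), (5, 0)]}   # receding
counts = [40, 62, 111, 250, 1000]  # roughly follows robot A's 1/d^2
print(identify_source_robot(tracks, counts))
```

In practice, detector dead time, background counts, and shielding would have to be folded into the predicted profile, which is where the MCNPX model of the experimental geometry comes in.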
Procedia PDF Downloads 122
4223 Moving toward Language Acquisition: A Case Study Adapting and Applying Laban Movement Analysis in the International English as an Additional Language Classroom
Authors: Andra Yount
Abstract:
The purpose of this research project is to understand how focusing on movement can help English language learners acquire better reading, writing, and speaking skills. More specifically, this case study tests how Laban movement analysis (LMA), a tool often used in dance and physical education classes, contributes to advanced-level high school students' English language acquisition at an international Swiss boarding school. This article shares the theoretical bases for, and findings from, a teaching experiment in which the LMA categories (body, effort, space, and shape) were adapted and introduced to students to encourage basic language acquisition as well as cultural awareness and sensitivity. As part of the participatory action research process, data collection included pseudonym-protected questionnaires and written/video-taped responses to LMA language and task prompts. Responses from 43 participants were evaluated to determine the efficacy of using this system. Participants (ages 16-19) were enrolled in advanced English as an Additional Language (EAL) courses at a private, co-educational Swiss international boarding school. Final data analysis revealed that drawing attention to movement using LMA language as a stimulus creates better self-awareness and understanding/retention of key literary concepts and vocabulary but does not necessarily contribute to greater cultural sensitivity or eliminate the use of problematic (sexist, racist, or classist) language. Possibilities for future exploration and development are also discussed. Keywords: dance, English, Laban, pedagogy
Procedia PDF Downloads 151
4222 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence
Authors: Gus Calderon, Richard McCreight, Tammy Schwartz
Abstract:
Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort has had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high-spatial-resolution (5" ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community are divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat-station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe.
To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners' understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch's defensible space maps were combined with Black Swan's patented approach, using 39 other risk characteristics, into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education. Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk
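The spectral vegetation index maps mentioned above are typically built from a ratio of near-infrared and red reflectance. The sketch below shows the standard NDVI computation with an illustrative coarse fuel classification; the band values and class thresholds are assumptions for illustration, not FireWatch's proprietary algorithm.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel pair."""
    total = nir + red
    return (nir - red) / total if total else 0.0

def classify_fuel(nir_band, red_band, green_cutoff=0.4, dry_cutoff=0.15):
    """Map paired reflectance values to coarse fuel classes.
    The cutoffs are illustrative, not calibrated thresholds."""
    classes = []
    for nir, red in zip(nir_band, red_band):
        v = ndvi(nir, red)
        if v >= green_cutoff:
            classes.append("healthy vegetation")
        elif v >= dry_cutoff:
            classes.append("dry/stressed vegetation")
        else:
            classes.append("non-vegetation")
    return classes

# Three example pixels: lush canopy, drying grass, bare ground
print(classify_fuel([0.60, 0.30, 0.20], [0.10, 0.22, 0.19]))
```

Dry or stressed vegetation near a structure is exactly the kind of pixel an object-based classifier would flag for defensible-space mitigation.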
Procedia PDF Downloads 106
4221 Median Versus Ulnar Medial Thenar Motor Recording in Diagnosis Of Carpal Tunnel Syndrome
Authors: Emmanuel Kamal Aziz Saba
Abstract:
Aim of the work: This study aimed to assess the role of the median versus ulnar medial thenar motor (MTM) recording in supporting the diagnosis of carpal tunnel syndrome (CTS). Patients and methods: The present study included 130 hands (70 CTS and 60 controls). Clinical examination was done for all patients. The following tests were done (using surface-electrode recording) for patients and controls: (1) sensory nerve conduction studies: median nerve, ulnar nerve, and median versus ulnar digit-four sensory study; (2) motor nerve conduction studies: median nerve, ulnar nerve, median (second lumbrical) versus ulnar (interosseous) (2-LINT) motor study, and median versus ulnar (MTM) study. Results: The tests with the highest sensitivity in diagnosing CTS were the median versus ulnar (2-LINT) motor latency difference (87.1%), the median versus ulnar (MTM) motor latency difference (80%), and the median versus ulnar digit-four sensory latency difference (91.4%). There was no statistically significant difference between the median versus ulnar (MTM) motor latency difference and either the median versus ulnar (2-LINT) motor latency difference or the median versus ulnar digit-four sensory latency difference (P > 0.05) as regards the confirmation of CTS. Conclusions: The median versus ulnar (MTM) motor latency difference has high sensitivity and specificity for the diagnosis of CTS, as do the median versus ulnar (2-LINT) motor latency difference and the median versus ulnar digit-four sensory latency difference. It can be considered a useful neurophysiological test to be used in combination with other median versus ulnar comparative tests for confirming the diagnosis of CTS, besides other well-known electrophysiological tests. Keywords: carpal tunnel syndrome, medial thenar motor, median nerve, ulnar nerve
Procedia PDF Downloads 439
4220 The Effect of Training Program by Using Especial Strength on the Performance Skills of Hockey Players
Authors: Wesam El Bana
Abstract:
The current research aimed at designing a training program for improving specific muscular strength through using especial strength exercises and identifying its effects on the skill performance level of hockey players. The researcher used the quasi-experimental approach (two-group design) with pre- and post-measurements. Sample: (n = 35) was purposefully chosen from Sharkia Sports Club. Five hockey players were excluded due to their non-punctuality. The rest were divided into two equal groups (experimental and control). The researcher concluded the following: the traditional training program had a positive effect on improving the physical variables under investigation, as it led to increasing the improvement percentages of the physical variables and the skill performance level of the control group between the pre- and post-measurements. The recommended training program had a positive effect on improving the physical variables under investigation, as it led to increasing the improvement percentages of the physical variables and the skill performance level of the experimental group between the pre- and post-measurements. Exercises using especial strength training had a positive effect on the post-measurement of the experimental group. Keywords: hockey, especial strength, performance skills
Procedia PDF Downloads 240
4219 Organizational Resilience in the Perspective of Supply Chain Risk Management: A Scholarly Network Analysis
Authors: William Ho, Agus Wicaksana
Abstract:
Anecdotal evidence from the last decade shows that the occurrence of disruptive events and uncertainties in the supply chain is increasing. The coupling of these events with the nature of an increasingly complex and interdependent business environment leads to devastating impacts that quickly propagate within and across organizations. For example, the recent COVID-19 pandemic increased the global supply chain disruption frequency by at least 20% in 2020 and is projected to have an accumulated cost of $13.8 trillion by 2024. This crisis draws attention to organizational resilience as a means to weather business uncertainty. However, the concept has been criticized for being vague and lacking a consistent definition, thus reducing its significance for practice and research. This study is intended to address that issue by providing a comprehensive review of the conceptualization, measurement, and antecedents of operational resilience discussed in the supply chain risk management (SCRM) literature. We performed a hybrid scholarly network analysis, combining citation-based and text-based approaches, on 252 articles published from 2000 to 2021 in top-tier journals selected on three parameters: AJG and ABS rankings, the UT Dallas and FT50 lists, and editorial board review. Specifically, we employed bibliographic coupling analysis in the research cluster formation stage and co-word analysis in the research cluster interpretation and analysis stage. Our analysis reveals three major research clusters of resilience research in the SCRM literature, namely (1) supply chain network design and optimization, (2) organizational capabilities, and (3) digital technologies.
We portray the research process of the last two decades in terms of the exemplar studies, problems studied, commonly used approaches and theories, and solutions provided in each cluster. We then provide a conceptual framework on the conceptualization and antecedents of resilience based on studies in these clusters and highlight potential areas that need to be studied further. Finally, we leverage the concept of abnormal operating performance to propose a new measurement strategy for resilience. This measurement overcomes the limitation of most current measurements, which are event-dependent and focus on the resistance or recovery stage without capturing the growth stage. In conclusion, this study provides a robust literature review through a scholarly network analysis that increases the completeness and accuracy of research cluster identification and analysis in understanding the conceptualization, antecedents, and measurement of resilience. It also enables a comprehensive review of resilience research in the SCRM literature by including research articles published during the pandemic and connecting this development with the plethora of articles published in the last two decades. From the managerial perspective, this study provides practitioners with clarity on the conceptualization and critical success factors of firm resilience from the SCRM perspective. Keywords: supply chain risk management, organizational resilience, scholarly network analysis, systematic literature review
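The bibliographic coupling step used in the cluster formation stage can be sketched in a few lines: two papers are coupled in proportion to the references they share. The paper identifiers and reference sets below are toy placeholders, not the study's corpus.

```python
from itertools import combinations

def bibliographic_coupling(refs):
    """refs: {paper_id: set of cited works}. Returns the coupling
    strength (number of shared references) for each pair of papers;
    clustering then groups papers with strong mutual coupling."""
    return {(a, b): len(refs[a] & refs[b])
            for a, b in combinations(sorted(refs), 2)}

corpus = {
    "p1": {"r1", "r2", "r3"},
    "p2": {"r2", "r3", "r4"},
    "p3": {"r5"},
}
print(bibliographic_coupling(corpus))
```

The co-word analysis of the interpretation stage works analogously, but over shared keywords or title terms instead of shared references.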
Procedia PDF Downloads 73
4218 Estimations of Spectral Dependence of Tropospheric Aerosol Single Scattering Albedo in Sukhothai, Thailand
Authors: Siriluk Ruangrungrote
Abstract:
Analyses of available data from MFR-7 measurements were performed and discussed in this study of tropospheric aerosol and its consequences in Thailand. The aerosol single scattering albedo (ω) is one of the most important parameters for determining the aerosol effect on radiative forcing. Here, ω was determined directly as the ratio of the aerosol scattering optical depth to the aerosol extinction optical depth (τscat/τext), without any use of aerosol computer-code models. This has the benefit of eliminating the uncertainty caused by modeling assumptions and the estimation of actual aerosol input data. Diurnal ω on five cloudless days in winter and early summer at five distinct wavelengths (415, 500, 615, 673, and 870 nm) was investigated, with Rayleigh scattering and the atmospheric column NO2 and ozone contents taken into account. In addition, the tendency of the spectral dependence of ω across the two seasons was observed. The spectral results reveal that during wintertime, the atmosphere of the inland rural vicinity over the measurement period was possibly dominated by a smaller soil dust aerosol loading than in early summer. Hence, the major aerosol loading, particularly in summer, was likely a mixture of both soil dust and biomass burning aerosols. Keywords: aerosol scattering optical depth, aerosol extinction optical depth, biomass burning aerosol, soil dust aerosol
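The ratio at the core of this estimation can be written down directly; in the sketch below, the per-wavelength scattering and absorption optical depths are illustrative placeholders, not the Sukhothai measurements.

```python
def single_scattering_albedo(tau_scat, tau_abs):
    """omega = tau_scat / tau_ext, where tau_ext = tau_scat + tau_abs.
    A purely scattering aerosol gives omega = 1; absorption lowers it."""
    return tau_scat / (tau_scat + tau_abs)

# Illustrative spectral values: (tau_scat, tau_abs) per wavelength in nm
spectral = {415: (0.30, 0.05), 500: (0.25, 0.04), 870: (0.10, 0.03)}
omega = {wl: round(single_scattering_albedo(s, a), 3)
         for wl, (s, a) in spectral.items()}
print(omega)
```

The spectral slope of ω is what distinguishes aerosol types: strongly absorbing biomass burning aerosol pulls ω down more steeply toward short wavelengths than weakly absorbing soil dust.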
Procedia PDF Downloads 405
4217 A Methodology Based on Image Processing and Deep Learning for Automatic Characterization of Graphene Oxide
Authors: Rafael do Amaral Teodoro, Leandro Augusto da Silva
Abstract:
Originating from graphite, graphene is a two-dimensional (2D) material that promises to revolutionize technology in many different areas, such as energy, telecommunications, civil construction, aviation, textiles, and medicine. This is possible because its structure, formed by carbon bonds, provides desirable optical, thermal, and mechanical characteristics that are of interest to multiple areas of the market. Thus, several research and development centers are studying different manufacturing methods and material applications of graphene, efforts that are often compromised by the scarcity of more agile and accurate methodologies to characterize the material, that is, to determine its composition, shape, size, and the number of layers and crystals. To contribute to this search, this study proposes a computational methodology that applies deep learning to identify graphene oxide crystals in order to characterize samples by crystal size. To achieve this, a fully convolutional neural network called U-net was trained to segment SEM images of graphene oxide. The segmentation generated by the U-net is refined with a per-class standard deviation technique, which allows crystals to be distinguished with different labels through an object delimitation algorithm. As a next step, the position, area, perimeter, and lateral measures of each detected crystal are extracted from the images. This information generates a database with the dimensions of the crystals that compose the samples. Finally, graphs are automatically created showing the frequency distributions of the crystals by area size and perimeter. This methodological process resulted in a high capacity for segmentation of graphene oxide crystals, presenting accuracy and F-score equal to 95% and 94%, respectively, over the test set.
Such performance demonstrates the high generalization capacity of the method in crystal segmentation, since it holds under significant variations in image acquisition quality. The measurement of non-overlapping crystals presented an average error of 6% across the different measurement metrics, suggesting that the model provides high-accuracy measurement for non-overlapping segmentations. For overlapping crystals, however, a limitation of the model was identified. To overcome this limitation, it is important to ensure that the samples to be analyzed are properly prepared. This will minimize crystal overlap during SEM image acquisition and guarantee a lower measurement error without greater data-handling effort. All in all, the method developed is a time saver with high measurement value, considering that it is capable of measuring hundreds of graphene oxide crystals in seconds, saving weeks of manual work. Keywords: characterization, graphene oxide, nanomaterials, U-net, deep learning
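The post-segmentation measurement step, extracting per-crystal area and perimeter from a labeled mask, can be sketched with a plain connected-component pass. This is an illustrative stand-in for the object delimitation stage, assuming a binarized U-net output; production pipelines would typically use a library region-properties routine and calibrate pixel units to nanometers.

```python
from collections import deque

def crystal_metrics(mask):
    """Label 4-connected foreground regions in a binary mask (list of lists
    of 0/1) and return (area, perimeter) per crystal, with perimeter counted
    as the number of exposed pixel edges."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    metrics = []
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                area = perim = 0
                queue = deque([(i, j)])
                seen[i][j] = True
                while queue:              # flood-fill one crystal
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx]:
                            if not seen[ny][nx]:
                                seen[ny][nx] = True
                                queue.append((ny, nx))
                        else:             # background or image border
                            perim += 1
                metrics.append((area, perim))
    return metrics

mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
    [0, 0, 0, 1],
]
print(crystal_metrics(mask))  # two crystals: a 2x2 block and a 2x1 bar
```

Aggregating these tuples over all images yields exactly the kind of area/perimeter frequency distributions the abstract describes.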
Procedia PDF Downloads 158