Search results for: tanh method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18541

13351 Microstructure Evolution and Modelling of Shear Forming

Authors: Karla D. Vazquez-Valdez, Bradley P. Wynne

Abstract:

Over the last decades, manufacturing needs have been changing, prompting the study of previously underdeveloped manufacturing methods such as incremental forming processes, including shear forming. These processes use rotating tools in constant local contact with the workpiece, which is often also rotating, to generate shape. This means much lower loads to forge large parts and no need for expensive special tooling. Their potential has already been established by demonstrating the manufacture of high-value products, e.g., turbine and satellite parts, with high dimensional accuracy from difficult-to-manufacture materials. Thus, huge opportunities exist for these processes to replace the current methods of manufacture for a range of high-value components, e.g., by eliminating lengthy machining, reducing material waste and process times, or manufacturing a complicated shape without developing expensive tooling. However, little is known about the exact deformation conditions during processing and why certain materials are better suited than others for shear forming, leading to a lot of trial and error before production. Three alloys were used for this study: Ti-54M, Jethete M154, and IN718. General microscopy and Electron Backscatter Diffraction (EBSD) were used to measure strains and orientation maps during shear forming. A Design of Experiments (DOE) analysis was also carried out in order to understand the impact of process parameters on the properties of the final workpieces. Such information was key to developing a reliable Finite Element Method (FEM) model that closely resembles the deformation paths of this process. Finally, the potential of these three materials to be shear spun was studied using the FEM model and their Forming Limit Diagrams (FLD), which led to the development of a rough methodology for testing the shear spinnability of various metals.

Keywords: shear forming, damage, principal strains, forming limit diagram

Procedia PDF Downloads 142
13350 Nanofluid-Based Emulsion Liquid Membrane for Selective Extraction and Separation of Dysprosium

Authors: Maliheh Raji, Hossein Abolghasemi, Jaber Safdari, Ali Kargari

Abstract:

Dysprosium is a rare earth element that is essential for many growing high-technology applications. Dysprosium, along with neodymium, plays a significant role in applications such as metal halide lamps, permanent magnets, and nuclear reactor control rods. The purification and separation of rare earth elements are challenging because of their similar chemical and physical properties. Among the various methods, membrane processes provide many advantages over conventional separation processes such as ion exchange and solvent extraction. In this work, the selective extraction and separation of dysprosium from aqueous solutions containing an equimolar mixture of dysprosium and neodymium by emulsion liquid membrane (ELM) was investigated. The organic membrane phase of the ELM was a nanofluid consisting of multiwalled carbon nanotubes (MWCNTs), Span 80 as surfactant, Cyanex 272 as carrier, and kerosene as base fluid, with a nitric acid solution as the internal aqueous phase. Factors affecting the separation of dysprosium, such as carrier concentration, MWCNT concentration, feed phase pH, and stripping phase concentration, were analyzed using the Taguchi method. The optimal experimental condition was obtained using analysis of variance (ANOVA) after 10 min of extraction. Based on the results, using the MWCNT nanofluid in the ELM process increases extraction owing to higher membrane stability and enhanced mass transfer, and a separation factor of 6 for dysprosium over neodymium can be achieved under the optimum conditions. Additionally, the demulsification process was performed successfully, and the membrane phase was reused effectively under the optimum condition.
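
A minimal sketch of the kind of larger-the-better signal-to-noise (S/N) calculation used in a Taguchi analysis such as the one described above is given below. The factor names, levels, and extraction values are hypothetical placeholders, not data from this study.

```python
import numpy as np

def sn_larger_is_better(responses):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(responses, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Hypothetical fraction of a Taguchi design (placeholder values, not study data):
# 'levels' = factor levels (carrier conc., MWCNT conc., feed pH, stripping conc.),
# 'y' = replicate extraction results (%).
runs = [
    {"levels": (1, 1, 1, 1), "y": (78.2, 79.0)},
    {"levels": (1, 2, 2, 2), "y": (85.4, 84.9)},
    {"levels": (2, 1, 2, 2), "y": (90.1, 91.3)},
    {"levels": (2, 2, 1, 1), "y": (82.7, 81.9)},
]

sn = np.array([sn_larger_is_better(r["y"]) for r in runs])
for r, s in zip(runs, sn):
    print(r["levels"], round(s, 2), "dB")

# Main effect of the first factor: average S/N at each of its levels.
lv = np.array([r["levels"][0] for r in runs])
for level in (1, 2):
    print("factor 1, level", level, "mean S/N =", round(sn[lv == level].mean(), 2))
```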

Keywords: emulsion liquid membrane, MWCNT nanofluid, separation, Taguchi method

Procedia PDF Downloads 267
13349 Development of an Appropriate Method for the Determination of Multiple Mycotoxins in Pork Processing Products by UHPLC-TCFLD

Authors: Jason Gica, Yi-Hsieng Samuel Wu, Deng-Jye Yang, Yi-Chen Chen

Abstract:

Mycotoxins, harmful secondary metabolites produced by certain fungal species, pose significant risks to animals and humans worldwide. Their stability leads to contamination during grain harvesting, transportation, and storage, as well as in processed food products. The prevalence of mycotoxin contamination has attracted significant attention due to its adverse impact on food safety and global trade. The secondary contamination pathway from animal products has been identified as an important route of exposure, posing health risks for livestock and for humans consuming contaminated products. Pork, one of the most highly consumed meat products in Taiwan according to the National Food Consumption Database, plays a critical role in the nation's diet and economy. Given its substantial consumption, pork processing products are a significant component of the food supply chain and a potential source of mycotoxin contamination. This study is important for formulating effective regulations and strategies to mitigate mycotoxin-related risks in the food supply chain. By establishing a reliable analytical method, this research contributes to safeguarding public health and enhancing the quality of pork processing products. The findings will serve as valuable guidance for policymakers, food industries, and consumers to ensure a safer food supply chain in the face of emerging mycotoxin challenges. An innovative and efficient analytical approach is proposed using Ultra-High Performance Liquid Chromatography coupled with Temperature Control Fluorescence Detector Light (UHPLC-TCFLD) to determine multiple mycotoxins in pork meat samples, owing to its exceptional capacity to detect multiple mycotoxins at very low concentration levels, making it highly sensitive and reliable for comprehensive mycotoxin analysis. Additionally, its ability to detect multiple mycotoxins simultaneously in a single run significantly reduces the time and resources required for analysis, making it a cost-effective solution for monitoring mycotoxin contamination in pork processing products. The research aims to optimize the efficient QuEChERS mycotoxin extraction method and rigorously validate its accuracy and precision. The results will provide crucial insights into mycotoxin levels in pork processing products.

Keywords: multiple-mycotoxin analysis, pork processing products, QuEChERS, UHPLC-TCFLD, validation

Procedia PDF Downloads 41
13348 Correlation Results Based on Magnetic Susceptibility Measurements by In-Situ and Ex-Situ Measurements as Indicators of Environmental Changes Due to the Fertilizer Industry

Authors: Nurin Amalina Widityani, Adinda Syifa Azhari, Twin Aji Kusumagiani, Eleonora Agustine

Abstract:

Fertilizer industry activities contribute to environmental changes, which have become one of the pressing problems of this era of globalization. Criteria for identifying changes in the environment can be drawn from the aspects of physics, chemistry, and biology. One aspect that can be assessed quickly and efficiently to describe environmental change is the physical aspect, in particular the value of magnetic susceptibility (χ). The rock magnetism method, based on magnetic susceptibility studies, can be used as a proxy indicator of environmental changes to measure and classify the degree of pollutant elements that cause changes in the environment. This research was conducted in the area around the fertilizer plant, with five coring points on each track, each coring point reaching a depth of 15 cm. Magnetic susceptibility measurements were performed both in situ and ex situ. In-situ measurements were carried out directly with the SM30 instrument by placing it on the soil surface at each measurement point and reading the value of the magnetic susceptibility. Ex-situ measurements were performed in the laboratory with the Bartington MS2B susceptibility meter on coring samples taken every 5 cm. The in-situ measurements show that the value of magnetic susceptibility at the surface varies, with the lowest value (-0.81) at the second and fifth points and the highest value (0.345) at the third point. The ex-situ measurements reveal the variation of magnetic susceptibility values at each depth of the coring points. At a depth of 0-5 cm, the highest XLF = 494.8 (×10⁻⁸ m³/kg) is at the third point, while the lowest XLF = 187.1 (×10⁻⁸ m³/kg) is at the first. At a depth of 6-10 cm, the highest XLF is at the second point, at 832.7 (×10⁻⁸ m³/kg), while the lowest is at the first point, at 211 (×10⁻⁸ m³/kg). At a depth of 11-15 cm, the highest XLF = 857.7 (×10⁻⁸ m³/kg) is at the second point, whereas the lowest XLF = 83.3 (×10⁻⁸ m³/kg) is at the fifth point. Based on the in-situ and ex-situ measurements, it can be seen that the highest magnetic susceptibility values from the surface samples are at the third point.

Keywords: magnetic susceptibility, fertilizer plant, Bartington MS2B, SM30

Procedia PDF Downloads 322
13347 Understanding the Common Antibiotic and Heavy Metal Resistant-Bacterial Load in the Textile Industrial Effluents

Authors: Afroza Parvin, Md. Mahmudul Hasan, Md. Rokunozzaman, Papon Debnath

Abstract:

The effluents of textile industries contain considerable amounts of heavy metals, causing potential microbial metal loads if discharged into the environment without treatment. Aim: In this study, both lactose- and non-lactose-fermenting bacterial isolates were obtained from textile industrial effluents of Savar, a region of Bangladesh, to compare and understand the heavy metal load in these microorganisms and to determine the effects of heavy metal resistance properties on antibiotic resistance. Methods: Five different textile industrial canals of Savar were selected, and effluent samples were collected between June and August 2016. Total bacterial colonies (TBC) were counted from day 1 to day 5 for sample dilutions of 10⁻⁶ to 10⁻¹⁰. All isolates were obtained and selected using four differential media and tested for the minimum inhibitory concentration (MIC) of heavy metals and for antibiotic susceptibility, using the plate assay method and the modified Kirby-Bauer disc diffusion method, respectively. To detect the combined effect of heavy metals and antibiotics, a binary exposure experiment was performed, and for plasmid profiling, plasmid DNA of selected isolates was extracted by the alkaline lysis method. Results: In most cases, the colony-forming units (CFU) per plate for a 50 µl diluted sample were uncountable at the 10⁻⁶ dilution but countable at the 10⁻¹⁰ dilution, and the counts did not vary much from canal to canal. A total of 50 Shigella-like, 50 Salmonella-like, and 100 Escherichia coli (E. coli)-like bacterial isolates were selected for this study. The MIC for nickel (Ni) was less than or equal to 0.6 mM for 100% of the Shigella- and Salmonella-like isolates, whereas only 3% of the E. coli-like isolates had the same MIC. The MIC for chromium (Cr) was less than or equal to 2.0 mM for 16% of the Shigella-like, 20% of the Salmonella-like, and 17% of the E. coli-like isolates. Around 60% of both the Shigella- and Salmonella-like isolates, but only 20% of the E. coli-like isolates, had an MIC of less than or equal to 1.2 mM for lead (Pb). The most prevalent resistance pattern for Shigella- and Salmonella-like isolates was to azithromycin (AZM), found in 38% and 48% of isolates, respectively; however, for E. coli-like isolates, the highest resistance (36%) was to sulfamethoxazole-trimethoprim (SXT). In the binary exposure experiment, the antibiotic zone of inhibition mostly increased in the presence of heavy metals for all types of isolates. The largest plasmids found were 21 kb and 14 kb for lactose- and non-lactose-fermenting isolates, respectively. Conclusion: Microbial resistance to antibiotics and metal ions poses potential health hazards because these traits are generally associated with transmissible plasmids. Microorganisms resistant to antibiotics and tolerant to metals emerge as a result of exposure to metal-contaminated environments.

Keywords: antibiotics, effluents, heavy metals, minimum inhibitory concentration, resistance

Procedia PDF Downloads 293
13346 Deep Learning for SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for Earth's surface monitoring. SAR systems are often considered to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings the properties of rotational invariance to geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area or swath of fully polarimetric images compared to the swath of dual or hybrid polarimetric images. The search for solutions to augment dual polarimetric data to full polarimetric data will therefore take advantage of full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images using hybrid polarimetric data can be found in the literature. Although the improvements achieved by the newly investigated and tested reconstruction techniques are undeniable, the existing methods are, however, mostly based upon model assumptions (especially the assumption of reflectance symmetry), which may limit their reliability and applicability to vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses Deep Learning solutions to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images defined by the combination of different terms in the cost or loss function. The proposed method is experimentally validated with real data sets and compared with a well-known and standard approach from the literature. From the experiments, the reconstruction performance of the proposed framework is superior to conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
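
The abstract does not disclose the network architecture or the exact loss terms. The PyTorch sketch below only illustrates the general idea of a CNN mapping dual/hybrid-polarimetric channels to pseudo fully polarimetric channels, trained with a loss that combines several terms (here a channel-wise reconstruction term and a total-power term). All layer sizes, channel counts, and weights are assumptions made for illustration, not the authors' design.

```python
import torch
import torch.nn as nn

class PolAugmentCNN(nn.Module):
    """Toy CNN mapping 2 hybrid-pol input channels to 6 pseudo full-pol channels."""
    def __init__(self, in_ch=2, out_ch=6, width=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, out_ch, 3, padding=1),
        )
    def forward(self, x):
        return self.net(x)

def composite_loss(pred, target, w_span=0.1):
    """Reconstruction term plus a term on the total backscattered power (span proxy)."""
    rec = nn.functional.l1_loss(pred, target)
    span = nn.functional.mse_loss(pred.sum(dim=1), target.sum(dim=1))
    return rec + w_span * span

# One hypothetical training step on random tensors standing in for SAR patches.
model = PolAugmentCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(4, 2, 64, 64)   # dual/hybrid-pol input patches (placeholder data)
y = torch.randn(4, 6, 64, 64)   # fully polarimetric reference channels (placeholder data)
loss = composite_loss(model(x), y)
opt.zero_grad(); loss.backward(); opt.step()
```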

Keywords: SAR image, polarimetric SAR image, convolutional neural network, deep learning, deep neural network

Procedia PDF Downloads 48
13345 Ecological Planning Method of Reclamation Area Based on Ecological Management of Spartina alterniflora: A Case Study of Xihu Harbor in Xiangshan County

Authors: Dong Yue, Hua Chen

Abstract:

The study region, Xihu Harbor in Xiangshan County, Ningbo City, is located on the central coast of Zhejiang Province. To address wave dissipation, the Ningbo government first introduced Spartina alterniflora in the 1980s. In the 1990s, S. alterniflora spread so rapidly that a 'grassland' in the sea has been created. It has become the most important invasive plant of China's coastal tidal flats. Although S. alterniflora has some ecological and economic functions, it has also brought a series of hazards. Its ecological hazards affect many aspects, including biomass and biodiversity, hydrodynamic forces and sedimentation processes, nutrient cycling of the tidal flat, and the succession sequence of soil and plants. In engineering terms, it causes problems of poor drainage and channel blocking. Economically, the hazard is mainly reflected in the threat to the aquaculture industry. The purpose of this study is to explore an ecological, feasible, and economical way to manage Spartina alterniflora and use the land formed by it, taking Xihu Harbor in Xiangshan County as a case. Comparative methods, mathematical modeling, and qualitative and quantitative analyses are utilized in the study. The main outcomes are as follows. A series of S. alterniflora management methods, including the combination of mechanical cutting and hydraulic reclamation, waterlogging, herbicide, and biological substitution, were compared from three standpoints: ecology, engineering, and economy. It is inferred that the combination of mechanical cutting and hydraulic reclamation ranks among the best S. alterniflora management methods. This combination means using large-scale mechanical equipment, such as a large screw seagoing dredger, to excavate the S. alterniflora with its roots and mud together. The mix of mud and grass is then transported by pipeline and blown onto the nearby coastal tidal zone, where it overlies the silt of the tidal zone to form land. However, as man-made land on the coast, the reclamation area's ecological sensitivity is quite high, and it faces a high possibility of flood threat. Therefore, the reclamation area has many reasonableness requirements, including those on location, specific scope, water surface ratio, direction of the main watercourse, site of the water gate, and the ratio of ecological land to urban construction land. These requirements all became an important basis when the planning was being made. The water system planning, green space system planning, road structure, and land use all need to accommodate the ecological requirements. Besides, the profit from the formed land is the management project's source of funding, so how to utilize the land efficiently is another point considered in the planning. It is concluded that, for managing a large area of S. alterniflora, the combination of mechanical cutting and hydraulic reclamation is an ecological, feasible, and economical method. Planning that makes land use efficient, reasonable, and ecological will then promote the development of the area's urban construction.

Keywords: ecological management, ecological planning method, reclamation area, Spartina alterniflora, Xihu Harbor

Procedia PDF Downloads 293
13344 Effects of Different Drying Methods on the Properties of Viscose Single Jersey Fabrics

Authors: Merve Kucukali Ozturk, Yesim Beceren, Banu Nergis

Abstract:

The study discussed in this paper was conducted in an attempt to investigate the effects of different drying methods (line drying and tumble drying) on viscose single jersey fabrics knitted with ring yarn.

Keywords: color change, dimensional properties, drying method, fabric tightness, physical properties

Procedia PDF Downloads 268
13343 A Three-Dimensional TLM Simulation Method for Thermal Effect in PV-Solar Cells

Authors: R. Hocine, A. Boudjemai, A. Amrani, K. Belkacemi

Abstract:

Temperature rise is a negative factor in almost all systems. It can be caused by self-heating or by ambient temperature. In solar photovoltaic cells, this temperature rise affects the behavior of the cells. The ability of a PV module to withstand the effects of periodic hot-spot heating that occurs when cells are operated under reverse-biased conditions is closely related to the properties of the cell semiconductor material. In addition, the thermal effect also influences the estimation of the maximum power point (MPP) and the electrical parameters of the PV modules, such as maximum output power, maximum conversion efficiency, internal efficiency, reliability, and lifetime. The cell junction temperature is a critical parameter that significantly affects the electrical characteristics of PV modules. For practical applications of PV modules, it is very important to accurately estimate the junction temperature of PV modules and analyze the thermal characteristics of the PV modules. Once the temperature variation is taken into account, a more accurate MPP for the PV modules can be acquired, and the maximum utilization efficiency of the PV modules can also be achieved. In this paper, the three-dimensional Transmission Line Matrix (3D-TLM) method was used to map the surface temperature distribution of solar cells while in the reverse-bias mode. It was observed that some cells exhibited an inhomogeneity of the surface temperature resulting in localized heating (hot spots). This hot-spot heating causes irreversible destruction of the solar cell structure. Hot spots can have a deleterious impact on entire solar modules if individual solar cells are heated. The results therefore show clearly that the solar cells are capable of self-generating considerable amounts of heat, which should be dissipated very quickly to increase the PV module's lifetime.

Keywords: thermal effect, conduction, heat dissipation, thermal conductivity, solar cell, PV module, nodes, 3D-TLM

Procedia PDF Downloads 370
13342 Eliminating Injury in the Work Place and Realizing Vision Zero Using Accident Investigation and Analysis as Method: A Case Study

Authors: Ramesh Kumar Behera, Md. Izhar Hassan

Abstract:

Accident investigation and analysis are useful to identify deficiencies in plant, process, and management practices and to formulate preventive strategies for injury elimination. In India and other parts of the world, industrial accidents are investigated to determine the causes and also to fulfill legal compliance requirements. However, the findings of investigations are seldom used appropriately to strengthen Occupational Safety and Health (OSH) as expected. The mineral-rich state of Odisha on the eastern coast of India, known as a hub for iron and steel industries, witnessed frequent accidents during 2005-2009. This article, based on a study of 982 fatal factory accidents that occurred in Odisha during the period 2001-2016, discusses the 'turnaround story' resulting in a reduction of fatal accidents from 122 in 2009 to 45 in 2016. The paper examines various factors causing incidents, accident patterns in the steel and chemical sectors, and the role of climate and harsh weather conditions in accident causation. Software such as R, SQL, MS Excel, and Tableau was used for the analysis of data. It is found that the largest share of fatalities is caused by 'fall from height' (24%), that steel industries are relatively more accident-prone, and that the harsh weather conditions of summer increase the chance of an accident by 20%. Further, the study suggests that enforcing a partial work restriction around lunchtime during peak summer, together with screening and training of employees, reduces accidents due to falls from height. The study indicates that learning from accident investigation and analysis can be used as a method to reduce work-related accidents in the journey towards 'Vision Zero'.

Keywords: accident investigation and analysis, fatal accidents in India, fall from height, vision zero

Procedia PDF Downloads 134
13341 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach

Authors: Aladdin Al-Tarawneh

Abstract:

The Quran, the sacred book of Islam, considered the literal word of God (Allah) in Arabic, has been widely translated into many languages; however, the foreignising or literal approach excessively stains the quality and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. Therefore, this study is conducted to propose a different approach, one that combines several methods according to a hybrid model. Indeed, this study challenges the binary adherence that is widely relied upon in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the meaning of the Quran can be communicated in any language, and that the translation itself is not sacred, this paper approaches the translation of the Quran by blending different methods, such as domestication and foreignisation, in a systematic way, avoiding the binary choice made by many translators. To reach this aim, the paper has a conceptual part that seeks to elucidate and clarify the main methods employed in TS and to criticise and modify them in order to propose the new hybrid approach (the hybrid model) for translating the Quran, that is, the deductive method. To support and validate the outcome of the previous part, a comparative model is employed in order to highlight the differences between the suggested translation and other widely used ones, that is, the inductive method. By applying this methodology, the paper shows that the foreignising approach is deficient in communicating the original meaning of the Quran. In conclusion, the paper suggests that producing a Quran translation has to take into account the adoption of many techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner to serve the intended target readers.

Keywords: Quran translation, hybrid approach, domestication, foreignization, hybrid model

Procedia PDF Downloads 137
13340 Doped and Co-Doped ZnO-Based Nanoparticles and Their Photocatalytic and Gas Sensing Properties

Authors: Neha Verma, Manik Rakhra

Abstract:

Statement of the Problem: Nowadays, a tremendous increase in population and advancing industrialization augment the problems related to air and water pollution. Growing industries promote environmental dangers, which are an alarming threat to the ecosystem. To safeguard the environment, detection of perilous gases and of released colored wastewater, which causes eutrophication pollution, is required. Researchers around the globe are making their best efforts to save the environment. For this remediation, advanced oxidation processes are used for potential applications. ZnO is an important semiconductor photocatalyst with high photocatalytic and gas sensing activities. For efficient photocatalytic and gas sensing properties, it is necessary to prepare doped/co-doped ZnO compounds to decrease the electron-hole recombination rates. However, lanthanide-doped and co-doped metal oxides are seldom studied for photocatalytic and gas sensing applications. The purpose of this study is to describe the best photocatalyst for the photodegradation of dyes and for gas sensing. Methodology & Theoretical Orientation: An economical framework has to be used for the synthesis of ZnO. Following an in-depth literature survey, a simple combustion method is utilized for the gas sensing and photocatalytic activities. Findings: Rare-earth-doped and co-doped ZnO nanoparticles were the best photocatalysts for the photodegradation of organic dyes and for different gas sensing applications, obtained by varying factors such as pH, aging time, and the concentrations of the doping and co-doping metals in ZnO. Complete degradation of the dye was observed within minutes. The gas sensing nanodevice showed a better response and a quick recovery time for doped/co-doped ZnO. Conclusion & Significance: In order to prevent air and water pollution, well-crystalline ZnO nanoparticles were synthesized by a rapid and economical method and used as photocatalysts for the photodegradation of organic dyes and in gas sensing applications to sense the release of hazardous gases into the environment.

Keywords: ZnO, photocatalyst, photodegradation of dye, gas sensor

Procedia PDF Downloads 130
13339 Resolution Method for Unforeseen Ground Condition Problem Case in Coal Fired Steam Power Plant Project Location Adipala, Indonesia

Authors: Andi Fallahi, Bona Ryan Situmeang

Abstract:

The construction industry is notoriously risky. Much of the preparatory paperwork that precedes a construction project can be viewed as the formulation of risk allocation between the owner and the contractor. The owner takes the risk that his project will not get built on schedule, that it will not get built for what he has budgeted, and that it will not be of the quality he expected. The contractor faces a multitude of risks. One of them is an unforeseen condition at the construction site. The owner usually has the upper hand here if an unforeseen condition occurs. Site data contained in the ground investigation report are often of significant contractual importance in disputes related to unforeseen ground conditions. A ground investigation can never fully disclose all the details of the underground conditions (the risk of an unknown ground condition can never be 100% eliminated). The Adipala Coal Fired Steam Power Plant (CSFPP) 1 x 660 project is one of the large CSFPP projects in Indonesia based on an Engineering, Procurement, and Construction (EPC) contract. The contractor's responsibility for unforeseen ground conditions is stipulated in the clauses of the contract. During implementation, an unforeseen ground condition was indicated in the Circulating Water Pump House (CWPH) area, which required the contractor to change the method of work, with a large impact on the time of completion and the project cost. This paper tries to analyze the best way of allocating the risk between the owner and the contractor. Allocating and sharing risk fairly can ultimately save time and money for all parties and get the job done on schedule for the least overall cost.

Keywords: unforeseen ground condition, coal fired steam power plant, circulating water pump house, Indonesia

Procedia PDF Downloads 310
13338 Pyrolysis of Dursunbey Lignite and Pyrolysis Kinetics

Authors: H. Sütçü, C. Efe

Abstract:

In this study, the pyrolysis characteristics of Dursunbey-Balıkesir lignite and its pyrolysis kinetics are examined. Pyrolysis experiments at three different heating rates are performed using the thermogravimetric method. Kinetic parameters are calculated with the Coats & Redfern kinetic model, and the degree of the pyrolysis process is determined for each heating rate.
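
As an illustration of the Coats & Redfern data reduction (not the authors' actual data or code), the sketch below fits ln[g(α)/T²] against 1/T for an assumed first-order mechanism, g(α) = −ln(1 − α); the slope of the fit equals −Ea/R. The temperature and conversion arrays are hypothetical placeholders.

```python
import numpy as np

R = 8.314  # J/(mol*K)

# Hypothetical TGA conversion data at one heating rate (placeholders, not study data).
T = np.array([550., 600., 650., 700., 750.])       # temperature, K
alpha = np.array([0.10, 0.25, 0.45, 0.65, 0.80])   # conversion

g = -np.log(1.0 - alpha)                 # first-order mechanism g(alpha)
y = np.log(g / T**2)                     # Coats & Redfern left-hand side
x = 1.0 / T

slope, intercept = np.polyfit(x, y, 1)   # y ~ intercept + slope * (1/T)
Ea = -slope * R                          # activation energy, J/mol
print(f"Ea ~ {Ea/1000:.1f} kJ/mol")
```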

Keywords: lignite, thermogravimetric analysis, pyrolysis, kinetics

Procedia PDF Downloads 339
13337 Parametric Appraisal of Robotic Arc Welding of Mild Steel Material by Principal Component Analysis-Fuzzy with Taguchi Technique

Authors: Amruta Rout, Golak Bihari Mahanta, Gunji Bala Murali, Bibhuti Bhusan Biswal, B. B. V. L. Deepak

Abstract:

The use of industrial robots for performing welding operations is one of the chief signs of contemporary welding these days. Modeling of weld joint parameters and weld process parameters is one of the most crucial aspects of robotic welding. As weld process parameters affect the weld joint parameters differently, a multi-objective optimization technique has to be utilized to obtain the optimal setting of weld process parameters. In this paper, a hybrid optimization technique, i.e., Principal Component Analysis (PCA) combined with fuzzy logic, is proposed to obtain the optimal setting of weld process parameters such as wire feed rate, welding current, gas flow rate, welding speed, and nozzle-tip-to-plate distance. The weld joint parameters considered for optimization are the depth of penetration, yield strength, and ultimate strength. PCA is a very efficient multi-objective technique for converting correlated and dependent responses, such as the weld joint parameters, into uncorrelated and independent variables. Also, in this approach there is no need to check the correlation among responses, as no individual weights are assigned to the responses. A fuzzy inference engine can efficiently incorporate these aspects into its internal hierarchy, thereby overcoming various limitations of existing optimization approaches. Finally, the Taguchi method is used to obtain the optimal setting of the weld process parameters. It is therefore concluded that the hybrid technique has its own advantages and can be used for quality improvement in industrial applications.
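
A minimal sketch of the PCA step described above is shown below: correlated weld joint responses (depth of penetration, yield strength, ultimate strength) are standardized and projected onto the eigenvectors of their correlation matrix, giving uncorrelated principal components that could then feed the fuzzy and Taguchi stages. The response values are hypothetical placeholders, not experimental data from this paper.

```python
import numpy as np

# Hypothetical response matrix: rows = experimental runs,
# columns = depth of penetration (mm), yield strength (MPa), ultimate strength (MPa).
Y = np.array([
    [3.1, 310., 455.],
    [3.6, 325., 470.],
    [2.8, 300., 440.],
    [4.0, 335., 485.],
    [3.4, 318., 462.],
])

Z = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)   # standardize responses
corr = np.corrcoef(Z, rowvar=False)                # correlation matrix of responses
eigval, eigvec = np.linalg.eigh(corr)              # eigenvalues in ascending order
order = eigval.argsort()[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

scores = Z @ eigvec                                # uncorrelated principal components
weights = eigval / eigval.sum()                    # explained-variance weights
composite = scores @ weights                       # one composite index per run, for ranking
print(np.round(composite, 3))
```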

Keywords: robotic arc welding, weld process parameters, weld joint parameters, principal component analysis, fuzzy logic, Taguchi method

Procedia PDF Downloads 161
13336 Management of the Experts in the Research Evaluation System of the University: Based on National Research University Higher School of Economics Example

Authors: Alena Nesterenko, Svetlana Petrikova

Abstract:

Research evaluation is one of the most important elements of self-regulation and development of researchers, as it is an impartial and independent process of assessment. The method of expert evaluations, as a scientific instrument for solving complicated non-formalized problems, offers, firstly, a scientifically sound way to conduct the assessment with maximum effectiveness of work at every step and, secondly, the usage of quantitative methods for evaluation, assessment of expert opinion, and collective processing of the results. These two features distinguish the method of expert evaluations from long-known expertise widespread in many areas of knowledge. Different typical problems require different types of expert evaluation methods. Several issues arise with these methods: expert selection, management of the assessment procedure, processing of the results, and remuneration of the experts. To address these issues, an online system was created with the primary purpose of developing a versatile application for many workgroups with matching approaches to scientific work management. The online documentation assessment and statistics system allows: realizing within one platform the independent activities of different workgroups (e.g., expert officers, managers); establishing different workspaces for the corresponding workgroups, where custom user databases can be created according to particular needs; forming the required output documents for each workgroup; configuring information gathering for each workgroup (forms of assessment, tests, inventories); creating and operating personal databases of remote users; and setting up automatic notification through e-mail. The next stage is the development of quantitative and qualitative criteria to form a database of experts. The inventory was designed so that the experts submit not only their personal data, place of work, and scientific degree but also keywords describing their expertise, academic interests, ORCID, Researcher ID, SPIN-code RSCI, Scopus AuthorID, knowledge of languages, and primary scientific publications. For each project, competition assessments are processed in accordance with the ordering party's demands in the form of appraised inventories, commentaries (50-250 characters), and an overall review (1,500 characters) in which the expert states the absence of a conflict of interest. Evaluation is conducted as follows: as applications are added to the database, the expert officer selects experts, generally two persons per application. Experts are selected according to the keywords; this method proved to work well, unlike the OECD classifier. In the last stage, the choice of experts is approved by the supervisor, and e-mails with an invitation to assess the project are sent to the experts. An expert supervisor oversees the experts writing the reports so that all formalities are in place (time frame, propriety, correspondence). If the difference in assessments exceeds four points, a third evaluation is appointed. When the expert finishes work on the expert opinion, the system shows a contract marked 'new', the managers process the contract, and the expert receives an e-mail that the contract is formed and ready to be signed. Once all formalities are concluded, the expert receives remuneration for the work. The specifics of the interaction of the examination officer with other experts will be presented in the report.

Keywords: expertise, management of research evaluation, method of expert evaluations, research evaluation

Procedia PDF Downloads 189
13335 Important Factors Affecting the Effectiveness of Quality Control Circles

Authors: Sogol Zarafshan

Abstract:

The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital and to rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five experts in the field of nursing working in a hospital, who were selected using a purposive sampling method. Also, a sample of 107 nurses was selected through a simple random sampling method using their employee codes and a random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed by the opinions of the experts and academic members who participated in the present study, as well as by confirmatory factor analysis. Its reliability was also verified (α = 0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, as well as the VIKOR-GRA and IPA methods. The results of ranking the factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks were related to 'Managers' and supervisors' support' and 'Group leadership', respectively. Also, the highest hospital performance was for factors such as 'Clear goals and objectives' and 'Group cohesiveness and homogeneity', and the lowest for 'Reward system' and 'Feedback system', respectively. The results showed that although 'Training the members', 'Using the right tools', and 'Reward system' were factors of great importance, the organization's performance on these factors was poor. Therefore, these factors should be paid more attention by the studied hospital's managers and should be improved as soon as possible.
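
To make the Grey Relational Analysis step concrete, the sketch below computes grey relational coefficients and grades for a small, hypothetical decision matrix (factors × criteria, larger-is-better). The distinguishing coefficient ζ = 0.5 is the usual default, and the numbers are placeholders, not the study's survey data.

```python
import numpy as np

def grey_relational_grade(X, zeta=0.5):
    """X: alternatives x criteria, larger-is-better. Returns one grade per alternative."""
    X = np.asarray(X, dtype=float)
    norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))  # normalize to [0, 1]
    delta = np.abs(1.0 - norm)                                    # deviation from the ideal (=1)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                                     # equal criterion weights

# Hypothetical scores of four factors on three criteria.
scores = [[4.2, 3.8, 4.0],
          [3.1, 3.5, 2.9],
          [4.8, 4.1, 4.4],
          [2.7, 3.0, 3.2]]
print(np.round(grey_relational_grade(scores), 3))   # higher grade = higher rank
```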

Keywords: Quality control circles, Fuzzy VIKOR, Grey Relational Analysis, Importance–Performance Analysis

Procedia PDF Downloads 114
13334 Delamination Fracture Toughness Benefits of Inter-Woven Plies in Composite Laminates Produced through Automated Fibre Placement

Authors: Jayden Levy, Garth M. K. Pearce

Abstract:

An automated fibre placement method has been developed to build through-thickness reinforcement into carbon fibre reinforced plastic laminates during their production, with the goal of increasing delamination fracture toughness while circumventing the additional costs and defects imposed by post-layup stitching and z-pinning. Termed ‘inter-weaving’, the method uses custom placement sequences of thermoset prepreg tows to distribute regular fibre link regions in traditionally clean ply interfaces. Inter-weaving’s impact on mode I delamination fracture toughness was evaluated experimentally through double cantilever beam tests (ASTM standard D5528-13) on [±15°]₉ laminates made from Park Electrochemical Corp. E-752-LT 1/4” carbon fibre prepreg tape. Unwoven and inter-woven automated fibre placement samples were compared to those of traditional laminates produced from standard uni-directional plies of the same material system. Unwoven automated fibre placement laminates were found to suffer a mostly constant 3.5% decrease in mode I delamination fracture toughness compared to flat uni-directional plies. Inter-weaving caused significant local fracture toughness increases (up to 50%), though these were offset by a matching overall reduction. These positive and negative behaviours of inter-woven laminates were respectively found to be caused by fibre breakage and matrix deformation at inter-weave sites, and the 3D layering of inter-woven ply interfaces providing numerous paths of least resistance for crack propagation.
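
For reference, a sketch of the simple beam-theory data reduction used for mode I fracture toughness in DCB testing (one of the reduction methods in ASTM D5528) is given below; the load, opening displacement, crack length, and width values are hypothetical placeholders, not the measured data of this study.

```python
def g1_beam_theory(P, delta, a, b):
    """Mode I strain energy release rate from simple beam theory (ASTM D5528):
    G_I = 3*P*delta / (2*b*a), with P in N and delta, a, b in m, giving G_I in J/m^2."""
    return 3.0 * P * delta / (2.0 * b * a)

# Hypothetical DCB data point: 60 N load, 4 mm opening, 50 mm crack, 25 mm width.
print(round(g1_beam_theory(P=60.0, delta=0.004, a=0.050, b=0.025), 1), "J/m^2")
```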

Keywords: AFP, automated fibre placement, delamination, fracture toughness, inter-weaving

Procedia PDF Downloads 167
13333 Regret-Regression for Multi-Armed Bandit Problem

Authors: Deyadeen Ali Alshibani

Abstract:

In the literature, the multi-armed bandit problem is treated as a statistical decision model of an agent trying to optimize his decisions while improving his information at the same time. There are several different algorithmic models and applications for this problem. In this paper, we evaluate regret-regression by comparing it with the Q-learning method. A simulation on the determination of an optimal treatment regime is presented in detail.
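
Since the abstract does not spell out the regret-regression or Q-learning implementations, the sketch below only illustrates the quantity being compared: cumulative regret of a simple epsilon-greedy agent on a Bernoulli multi-armed bandit. The arm probabilities, epsilon, and horizon are hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
p = np.array([0.3, 0.5, 0.7])      # hypothetical Bernoulli arm means
eps, T = 0.1, 5000                 # exploration rate and horizon
counts, values = np.zeros(3), np.zeros(3)
regret = 0.0

for t in range(T):
    arm = rng.integers(3) if rng.random() < eps else int(values.argmax())
    reward = float(rng.random() < p[arm])
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]   # incremental mean estimate
    regret += p.max() - p[arm]                            # expected regret of this choice

print(f"cumulative expected regret after {T} rounds: {regret:.1f}")
```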

Keywords: optimal, bandit problem, optimization, dynamic programming

Procedia PDF Downloads 429
13332 Optimum Dewatering Network Design Using Firefly Optimization Algorithm

Authors: S. M. Javad Davoodi, Mojtaba Shourian

Abstract:

A groundwater table close to the ground surface causes major problems in construction and mining operations. One of the methods to control groundwater in such cases is using pumping wells. These pumping wells remove excess water from the project site and lower the water table to a desirable value. Although the efficiency of this method is acceptable, it is expensive to apply. This means that even a small improvement in the design of the pumping wells can lead to substantial cost savings. In order to minimize the total cost of the pumping well method, a simulation-optimization approach is applied. The proposed model integrates MODFLOW as the simulation model with the Firefly algorithm as the optimization algorithm. In fact, MODFLOW computes the drawdown due to pumping in an aquifer, and the Firefly algorithm finds the optimum values of the design parameters, which are the number, pumping rates, and layout of the wells. The developed Firefly-MODFLOW model is applied to minimize the cost of the dewatering project for the ancient mosque of Kerman city in Iran. Repetitive runs of the Firefly-MODFLOW model indicate that drilling two wells with a total pumping rate of 5503 m³/day is the solution to the minimization problem. Results show that implementing the proposed solution leads to at least 1.5 m of drawdown in the aquifer beneath the mosque region. Also, the subsidence due to groundwater depletion is less than 80 mm. Sensitivity analyses indicate that the desired groundwater depletion has an enormous impact on the total cost of the project. Besides, in a hypothetical aquifer, decreasing the hydraulic conductivity contributes to a decrease in the total water extraction for dewatering.
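
The MODFLOW coupling is beyond a short example, but the core firefly update used in such simulation-optimization loops can be sketched as below. The toy objective merely stands in for the dewatering cost that the groundwater simulation would return, and the values of β0, γ, α, the population size, and the variable bounds are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(x):
    """Toy stand-in for the dewatering cost a MODFLOW-based simulation would return."""
    return np.sum((x - 3.0) ** 2)

n, dim, iters = 15, 2, 100              # fireflies, decision variables, iterations
beta0, gamma, alpha = 1.0, 1.0, 0.2     # attractiveness, absorption, randomization
X = rng.uniform(0.0, 10.0, (n, dim))    # e.g. pumping rates of two wells (assumed bounds)

for _ in range(iters):
    f = np.array([cost(x) for x in X])
    for i in range(n):
        for j in range(n):
            if f[j] < f[i]:                          # firefly j is "brighter" (lower cost)
                r2 = np.sum((X[i] - X[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
    alpha *= 0.98                                    # slowly reduce the random step

best = X[np.argmin([cost(x) for x in X])]
print("best design ~", np.round(best, 3))
```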

Keywords: groundwater dewatering, pumping wells, simulation-optimization, MODFLOW, firefly algorithm

Procedia PDF Downloads 275
13331 Some Extreme Halophilic Microorganisms Produce Extracellular Proteases with Long Lasting Tolerance to Ethanol Exposition

Authors: Cynthia G. Esquerre, Amparo Iris Zavaleta

Abstract:

Extremophiles constitute a potentially valuable source of proteases for the development of biotechnological processes; however, the number of available studies in the literature is limited compared to their mesophilic counterparts. Therefore, in this study, Peruvian halophilic microorganisms were characterized to select suitable proteolytic strains that produce active proteases under exigent conditions. Proteolysis was screened using the streak plate method with gelatin or skim milk as substrates. After that, proteolytic microorganisms were selected for phenotypic characterization and screened by a semi-quantitative proteolytic test using a modified agar diffusion method. Finally, proteolysis was evaluated using extracts partially purified by ice-cold ethanol precipitation and dialysis. All analyses were carried out over a wide range of NaCl concentrations, pH values, temperatures, and substrates. Of a total of 60 strains, 21 proteolytic strains were selected; of these, 19 were extreme halophiles and 2 were moderate halophiles. Most proteolytic strains showed differences in their biochemical patterns, particularly in sugar fermentation. A total of 14 microorganisms produced extracellular proteases; 13 were neutral, and one was alkaline, showing activity up to pH 9.0. The proteases hydrolyzed gelatin as the most specific substrate. In general, catalytic activity was efficient over a wide range of NaCl concentrations (1 to 4 M NaCl) and temperatures (37 to 55 °C) and after an ethanol exposition performed at -20 °C for 24 hours. In conclusion, this study reported 14 extremely halophilic candidates producing extracellular proteases capable of remaining stable and active over a wide range of NaCl concentrations and temperatures, and even after long-lasting ethanol exposition.

Keywords: biotechnological processes, ethanol exposition, extracellular proteases, extremophiles

Procedia PDF Downloads 268
13330 Two Component Source Apportionment Based on Absorption and Size Distribution Measurement

Authors: Tibor Ajtai, Noémi Utry, Máté Pintér, Gábor Szabó, Zoltán Bozóki

Abstract:

Beyond its climate- and health-related issues, ambient light-absorbing carbonaceous particulate matter (LAC) has recently also attracted great scientific interest in terms of its regulation. It has been experimentally demonstrated in recent studies that LAC is dominantly composed of traffic and wood burning aerosol, particularly under wintertime urban conditions, when photochemical and biological activities are negligible. Several methods have been introduced to quantitatively apportion the aerosol fractions emitted by wood burning and traffic, but most of them require costly and time-consuming off-line chemical analysis. As opposed to chemical features, the microphysical properties of airborne particles, such as optical absorption and size distribution, can be easily measured on-line, with high accuracy and sensitivity, especially under highly polluted urban conditions. Recently, a new method has been proposed for the apportionment of wood burning and traffic aerosols based on the spectral dependence of their absorption, quantified by the Aerosol Angström Exponent (AAE). In this approach the absorption coefficient is deduced from a transmission measurement on a filter-accumulated aerosol sample, and the conversion factors between the measured optical absorption and the corresponding mass concentration (the specific absorption cross sections) are determined by on-site chemical analysis. The recently developed multi-wavelength photoacoustic instruments provide a novel, in-situ approach towards the reliable and quantitative characterization of carbonaceous particulate matter. Therefore, this also opens up novel possibilities for source apportionment through the measurement of light absorption. In this study, we demonstrate an in-situ spectral characterization method for the ambient carbon fraction based on light absorption and size distribution measurements using our state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS) and a Scanning Mobility Particle Sizer (SMPS). The carbonaceous-particulate-selective source apportionment study was performed on ambient particulate matter in the city center of Szeged, Hungary, where the dominance of traffic and wood burning aerosol has been experimentally demonstrated earlier. The proposed model is based on the parallel, in-situ measurement of optical absorption and size distribution. AAEff and AAEwb were deduced from the measured data using the defined correlation between the AOC(1064nm)/AOC(266nm) and N100/N20 ratios. σff(λ) and σwb(λ) were determined with the help of the independently measured temporal mass concentrations in the PM1 mode. Furthermore, the proposed optical source apportionment is based on the assumption that the light-absorbing fraction of PM is exclusively related to traffic and wood burning. This assumption is indirectly confirmed here by the fact that the measured size distribution is composed of two unimodal size distributions identified to correspond to traffic and wood burning aerosols. The method offers the possibility of replacing laborious chemical analysis with the simple in-situ measurement of aerosol size distribution data. The results of the proposed novel optical-absorption-based source apportionment method prove its applicability whenever measurements are performed at an urban site where traffic and wood burning are the dominant carbonaceous emission sources.
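
A minimal sketch of the two-component optical apportionment implied above is given below: with assumed Ångström exponents for the traffic/fossil-fuel (ff) and wood-burning (wb) components, the absorption measured at two wavelengths (here 266 nm and 1064 nm, as used by the 4λ-PAS) is split into the two source contributions by solving a 2×2 linear system. All numerical values are placeholders, not measured data from this study.

```python
import numpy as np

# Assumed Angstrom exponents and measured absorption coefficients (placeholders).
AAE_ff, AAE_wb = 1.0, 2.0
lam_uv, lam_ir = 266.0, 1064.0      # wavelengths, nm
b_uv, b_ir = 120.0, 25.0            # absorption coefficients, arbitrary units (e.g. 1/Mm)

# Unknowns: ff and wb contributions at the IR wavelength, using
# b(l_uv) = b_ff(l_ir)*(l_uv/l_ir)**(-AAE_ff) + b_wb(l_ir)*(l_uv/l_ir)**(-AAE_wb)
# b(l_ir) = b_ff(l_ir)               + b_wb(l_ir)
A = np.array([[(lam_uv / lam_ir) ** -AAE_ff, (lam_uv / lam_ir) ** -AAE_wb],
              [1.0,                          1.0]])
b_ff_ir, b_wb_ir = np.linalg.solve(A, np.array([b_uv, b_ir]))
print(f"traffic (ff) share at 1064 nm: {b_ff_ir / b_ir:.2%}, wood burning: {b_wb_ir / b_ir:.2%}")
```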

Keywords: absorption, size distribution, source apportionment, wood burning, traffic aerosol

Procedia PDF Downloads 210
13329 Most Recent Lifespan Estimate for the Itaipu Hydroelectric Power Plant Computed by Using Borland and Miller Method and Mass Balance in Brazil, Paraguay

Authors: Anderson Braga Mendes

Abstract:

The Itaipu Hydroelectric Power Plant is located on the Paraná River, which forms a natural boundary between Brazil and Paraguay; thus, the facility is shared by both countries. Itaipu is the biggest hydroelectric generator in the world and provides clean and renewable electrical energy, supplying 17% and 76% of the electricity consumed in Brazil and Paraguay, respectively. The plant started generating in 1984. It counts on 20 Francis turbines and has an installed capacity of 14,000 MW. Its historic generation record occurred in 2016 (103,098,366 MWh), and from the beginning of its operation until the last day of 2016 the plant had generated a cumulative total of 2,415,789,823 MWh. The distinct sedimentologic aspects of the drainage area of the Itaipu Power Plant, from its stretch upstream (Porto Primavera and Rosana dams) to downstream (the Itaipu dam itself), were taken into account in order to best estimate the increase/decrease in the sediment yield using data from 2001 to 2016. Such data are collected through a network of 14 automatic sedimentometric stations managed by the company itself and operating on an hourly basis, covering an area of around 136,000 km² (92% of the incremental drainage area of the undertaking). Since 1972, a series of lifespan studies for the Itaipu Power Plant has been made, the first assessed by Hans Albert Einstein at the time of the feasibility studies for the enterprise. From that date onwards, eight further studies were made over the last 44 years, aiming to confer more precision upon the estimates based on more updated data sets. From the analysis of each monitoring station, strong increasing tendencies in the sediment yield were clearly noticed over the last 14 years, mainly in the Iguatemi, Ivaí, São Francisco Falso, and Carapá Rivers, the latter situated in Paraguay, whereas the others are entirely in Brazilian territory. Five lifespan scenarios considering different sediment yield tendencies were simulated with the aid of the software packages SEDIMENT and DPOSIT, both developed by the author of the present work. Both packages thoroughly follow the Borland & Miller methodology (the empirical area-reduction method). The soundest of the five scenarios under analysis indicated a lifespan forecast of 168 years, with the reservoir only 1.8% silted by the end of 2016, after 32 years of operation. Besides, the mass balance in the reservoir (water inflows minus outflows) between 1986 and 2016 shows that 2% of the whole Itaipu lake is silted nowadays. Owing to the convergence of both results, which were acquired by using different methodologies and independent input data, it is worth concluding that the mathematical modeling is satisfactory and calibrated, thus lending credibility to this most recent lifespan estimate.
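
The SEDIMENT and DPOSIT packages referenced above are the author's own tools and are not described here; purely as an illustration of the mass-balance side of a lifespan estimate, the sketch below steps year by year, converting an assumed annual sediment inflow and trap efficiency into deposited volume until a chosen capacity-loss threshold is reached. All inputs are placeholders, not Itaipu data, and this is not the Borland & Miller area-reduction calculation itself.

```python
def lifespan_years(capacity_m3, annual_sediment_t, trap_eff, bulk_density_t_m3,
                   stop_fraction=0.5):
    """Years until the reservoir loses `stop_fraction` of its capacity (simple mass balance)."""
    remaining = capacity_m3
    threshold = capacity_m3 * (1.0 - stop_fraction)
    years = 0
    while remaining > threshold and years < 10000:
        deposited_m3 = annual_sediment_t * trap_eff / bulk_density_t_m3
        remaining -= deposited_m3
        years += 1
    return years

# Placeholder inputs: 1e10 m3 reservoir, 3e7 t/yr sediment inflow, 90% trap efficiency,
# 1.1 t/m3 deposit bulk density, lifespan defined as a 50% capacity loss.
print(lifespan_years(1e10, 3e7, 0.9, 1.1), "years")
```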

Keywords: Borland and Miller method, hydroelectricity, Itaipu Power Plant, lifespan, mass balance

Procedia PDF Downloads 255
13328 Research of Seepage Field and Slope Stability Considering Heterogeneous Characteristics of Waste Piles: A Less Costly Way to Reduce High Leachate Levels and Avoid Accidents

Authors: Serges Mendomo Meye, Li Guowei, Shen Zhenzhong, Gan Lei, Xu Liqun

Abstract:

Due to their high-heap and large-volume characteristics, the complex layering of waste, and the high leachate water level, landfills easily produce environmental pollution and slope instability. It is therefore of great significance to research the heterogeneous seepage field and the stability of landfills. This paper focuses on the heterogeneous characteristics of the waste piles and analyzes the seepage field and slope stability of the landfill using statistical and numerical analysis methods. The calculated results are compared with field measurements and literature data to verify the reliability of the model, which may provide a basis for the design and the safe, eco-friendly operation of the landfill. The main innovations are as follows: (1) The saturated-unsaturated seepage equation of heterogeneous soil is derived theoretically. The heterogeneous landfill is regarded as being composed of infinitely many layers of homogeneous waste, and a method for establishing the heterogeneous seepage model is proposed. The formation law of the stagnant water level of heterogeneous landfills is then studied. It is found that the maximum stagnant water level of landfills is higher when the heterogeneous seepage characteristics are considered, which harms the stability of landfills. (2) Considering the heterogeneous weight and strength characteristics of waste, a method for establishing a heterogeneous stability model is proposed and extended to a three-dimensional stability study. It is found that the distribution of heterogeneous characteristics has a great influence on the stability of the landfill slope. During the operation and management of the landfill, the reservoir bank should also be considered along with the capacity of the landfill.
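
The abstract does not reproduce the derived equation. For orientation only, one standard head-based form of the saturated-unsaturated (Richards-type) seepage equation, with the hydraulic conductivity varying in space for a heterogeneous medium, is sketched below; this is a textbook form and not necessarily the exact expression derived in the paper.

```latex
% h: pressure head, z: elevation head, K(h, x): unsaturated hydraulic conductivity
% (spatially varying in a heterogeneous landfill), C(h) = d(theta)/dh: specific
% moisture capacity of the waste.
C(h)\,\frac{\partial h}{\partial t}
  = \nabla \cdot \left[ K(h,\mathbf{x})\, \nabla \left( h + z \right) \right]
```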

Keywords: heterogeneous characteristics, leachate levels, saturated-unsaturated seepage, seepage field, slope stability

Procedia PDF Downloads 228
13327 Reaching a Mobile and Dynamic Nose after Rhinoplasty: A Pilot Study

Authors: Guncel Ozturk

Abstract:

Background: Rhinoplasty is one of the most commonly performed cosmetic operations in plastic surgery. Maneuvers used in rhinoplasty lead to a firm and stiff nasal tip in the early postoperative months. This unnatural stability of the nose may easily cause distortion of the reshaped nose after severe trauma. Moreover, a firm nasal tip may cause difficulties in performing activities such as touching, hugging, or kissing. Decreasing the stability and increasing the mobility of the nasal tip would help rhinoplasty patients avoid these small but relatively important problems. Methods: We used the delivery approach with closed rhinoplasty and changed the positions of the intranasal incisions to achieve a dynamic and mobile nose. A total of 203 patients who had undergone primary closed rhinoplasty in private practice were reviewed retrospectively. A posterior strut flap connected with the connective tissues at the caudal end of the septum and the medial crura was formed. The cartilage of the posterior strut graft was left 2 mm thick in the distal part of the septum, it was cut vertically, and the connective tissue in the distal part was preserved. Results: The median patient age was 24 (range 17-42) years. The median follow-up period was 15.2 (range 12-26) months. Patient satisfaction was assessed with the 'Rhinoplasty Outcome Evaluation' (ROE) questionnaire. Twelve months after surgery, 87.5% of patients reported excellent outcomes according to the ROE. Conclusion: With this method, the soft tissue connections between that segment and the surrounding structures should be preserved to retain tip support while keeping the tip mobile at the same time. These modifications provide a mobile, non-stiff, and dynamic nasal tip in the early postoperative months. Further prospective studies should be performed to support this method.

Keywords: closed rhinoplasty, dynamic, mobile, tip

Procedia PDF Downloads 113
13326 Crop Leaf Area Index (LAI) Inversion and Scale Effect Analysis from Unmanned Aerial Vehicle (UAV)-Based Hyperspectral Data

Authors: Xiaohua Zhu, Lingling Ma, Yongguang Zhao

Abstract:

Leaf Area Index (LAI) is a key structural characteristic of crops and plays a significant role in precision agricultural management and farmland ecosystem modeling. However, LAI retrieved from data of different resolutions contains a scaling bias due to spatial heterogeneity and model non-linearity; that is, a scale effect arises in multi-scale LAI estimation. In this article, a typical farmland in the semi-arid region of Inner Mongolia, China, is taken as the study area. Based on the coupled PROSPECT and SAIL models, a multi-dimensional Look-Up-Table (LUT) is generated for LAI estimation of multiple crops from unmanned aerial vehicle (UAV) hyperspectral data. Based on the Taylor expansion method and a computational geometry model, a scale transfer model accounting for both inter-class and intra-class differences is constructed for scale effect analysis of LAI inversion over inhomogeneous surfaces. The results indicate that (1) the LUT method based on classification and parameter sensitivity analysis is useful for LAI retrieval of corn, potato, sunflower, and melon on the typical farmland, with an R² of 0.82 and a root mean square error (RMSE) of 0.43 m²/m²; (2) the scale effect of LAI becomes more obvious as image resolution decreases, with a maximum scale bias of more than 45%; (3) the inter-class scale effect is larger than the intra-class one and can be corrected efficiently by the scale transfer model based on Taylor expansion and computational geometry; after correction, the maximum scale bias is reduced to 1.2%.
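As a rough illustration of the LUT-based retrieval workflow described above, the sketch below builds a table of simulated spectra over a grid of candidate LAI values and inverts an observation by minimum spectral RMSE. The `canopy_reflectance` function is only a toy stand-in for a coupled PROSPECT+SAIL (PROSAIL) simulation, and the band set and coefficients are hypothetical.

```python
import numpy as np

def canopy_reflectance(lai, wavelengths):
    """Stand-in for a PROSAIL run: a saturating NIR response and a decaying
    red response with LAI. Purely illustrative, not a radiative transfer model."""
    nir = 0.45 * (1.0 - np.exp(-0.55 * lai))          # fake NIR reflectance
    red = 0.25 * np.exp(-0.60 * lai) + 0.03           # fake red reflectance
    return np.where(wavelengths > 700, nir, red)

# 1) Build the Look-Up-Table over a grid of candidate LAI values.
wavelengths = np.array([550.0, 670.0, 800.0, 860.0])  # nm, hypothetical bands
lai_grid = np.linspace(0.1, 6.0, 200)
lut = np.stack([canopy_reflectance(lai, wavelengths) for lai in lai_grid])

# 2) Invert: pick the LUT entry with minimum spectral RMSE to the observation.
def invert_lai(observed_spectrum):
    rmse = np.sqrt(np.mean((lut - observed_spectrum) ** 2, axis=1))
    return lai_grid[np.argmin(rmse)]

observed = canopy_reflectance(2.7, wavelengths) + np.random.normal(0, 0.005, 4)
print("Retrieved LAI:", round(float(invert_lai(observed)), 2))
```

In practice the LUT would be stratified by crop class and restricted to the sensitive parameters identified in the sensitivity analysis, rather than varying LAI alone.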

Keywords: leaf area index (LAI), scale effect, UAV-based hyperspectral data, look-up-table (LUT), remote sensing

Procedia PDF Downloads 423
13325 Deep Learning Based Polarimetric SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for monitoring the Earth's surface. SAR systems are often designed to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization for both the transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are clearly more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings rotational invariance with respect to the geometrical orientation of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area, or swath, of fully polarimetric images compared with that of dual or hybrid polarimetric images. Solutions that augment dual polarimetric data to fully polarimetric data would therefore allow full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images from hybrid polarimetric data can be found in the literature. Although the improvements achieved by recently investigated reconstruction techniques are undeniable, the existing methods are mostly based on model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome the problems of these techniques, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem by focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by combining different terms in the cost or loss function. The proposed method is experimentally validated on real data sets and compared with a well-known, standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to that of conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
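A hedged sketch of the augmentation idea: a small convolutional network maps hybrid or dual polarimetric channels to pseudo fully polarimetric covariance elements, trained with a loss that combines pixel fidelity with a term on one polarimetric characteristic (here, total backscattered power). The channel counts, architecture, and loss weighting are assumptions chosen for illustration, not the paper's actual design.

```python
import torch
import torch.nn as nn

class PolAugmentCNN(nn.Module):
    """Minimal CNN sketch: two acquisition channels (real/imaginary parts) in,
    nine covariance-element channels out. Dimensions are illustrative."""
    def __init__(self, in_ch=4, out_ch=9):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def composite_loss(pred, target, span_weight=0.1):
    """Pixel-wise fidelity plus a term on total power (span), standing in for
    a cost function built from several polarimetric characteristics."""
    fidelity = nn.functional.l1_loss(pred, target)
    span_pred = pred[:, :3].sum(dim=1)    # assume first 3 channels are diagonal terms
    span_true = target[:, :3].sum(dim=1)
    return fidelity + span_weight * nn.functional.l1_loss(span_pred, span_true)

# Toy forward/backward pass on random tensors standing in for SAR patches.
model = PolAugmentCNN()
hybrid = torch.randn(2, 4, 64, 64)        # hybrid polarimetric input patches
full = torch.randn(2, 9, 64, 64)          # target fully polarimetric elements
loss = composite_loss(model(hybrid), full)
loss.backward()
print("loss:", float(loss))
```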

Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry

Procedia PDF Downloads 65
13324 Event Data Representation Based on Time Stamp for Pedestrian Detection

Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita

Abstract:

With the wave of electric vehicles (EVs), low-energy-consumption systems have become increasingly important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several notable features, such as a high temporal resolution that can achieve 1 Mframe/s and a high dynamic range (120 dB). However, the property that contributes most to low energy consumption is its sparsity: the sensor captures only pixels whose intensity changes, so no signal is produced in areas without intensity change. This makes it more energy efficient than conventional sensors such as RGB cameras, because redundant data are removed. On the other hand, the data are difficult to handle because the format is completely different from an RGB image: acquired signals are asynchronous and sparse, and each signal consists of an x-y coordinate, a polarity (+1 or -1), and a timestamp, with no intensity such as RGB values. Since existing algorithms cannot be used straightforwardly, a new processing algorithm must be designed to cope with DVS data. To overcome the difficulties caused by these format differences, most prior art converts the events into frame data and feeds them to deep learning models such as Convolutional Neural Networks (CNNs) for object detection and recognition. However, even when the data can be fed in this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as a substitute for intensity in place of RGB pixel values, polarity information alone is clearly not rich enough. In this context, we propose using timestamp information as the data representation fed to deep learning. Concretely, we first build frame data divided by a fixed time period and then assign each pixel an intensity value according to its timestamp within the frame; for example, a high value is given to a recent signal. We expect this data representation to capture features, especially of moving objects, because the timestamps encode movement direction and speed. Using this proposed method, we built our own dataset with a DVS fixed on a parked car to develop a surveillance application that detects persons around the car. We consider the DVS one of the ideal sensors for surveillance because it can run for a long time with low energy consumption in a static situation. For comparison, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way but feeds polarity information to the CNN, and measured the object detection performance of both methods on the same dataset. Our method achieved an F1 score up to 7 points higher than the benchmark.
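The timestamp-based representation lends itself to a compact sketch: events within a time window are rendered into a frame whose pixel values encode how recent the latest event at each pixel is, so recent signals receive high values. The function name, window length, and the linear recency decay below are illustrative choices rather than the authors' exact recipe.

```python
import numpy as np

def events_to_timestamp_frame(events, height, width, window_us):
    """Encode a window of DVS events (x, y, polarity, timestamp_us) as a frame
    whose pixel values reflect how recent the latest event at that pixel is:
    1.0 for the newest events, decaying linearly to 0 at the window start.
    Pixels with no events stay 0, mirroring the sensor's sparsity."""
    frame = np.zeros((height, width), dtype=np.float32)
    if not events:
        return frame
    t_end = max(e[3] for e in events)
    t_start = t_end - window_us
    for x, y, polarity, t in events:
        if t < t_start:
            continue                                    # outside the window
        recency = (t - t_start) / window_us             # 0 (old) .. 1 (recent)
        frame[y, x] = max(frame[y, x], recency)         # keep the latest event
    return frame

# Hypothetical events: (x, y, polarity, timestamp in microseconds).
events = [(10, 5, +1, 100), (10, 5, -1, 900), (20, 7, +1, 450)]
frame = events_to_timestamp_frame(events, height=16, width=32, window_us=1000)
print(frame[5, 10], frame[7, 20])   # newer event -> value closer to 1
```

A frame like this can then be fed to a standard CNN detector in place of the polarity-only frames used by the benchmark.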

Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption

Procedia PDF Downloads 75
13323 Telemedicine in Physician Assistant Education: A Partnership with Community Agency

Authors: Martina I. Reinhold, Theresa Bacon-Baguley

Abstract:

A core challenge of physician assistant education is preparing professionals for lifelong learning. While this has conventionally encompassed scientific advances, students must also embrace new care delivery models and technologies. Telemedicine, the provision of care via two-way audio and video, is an example of a technological advance reforming health care. During a three-semester sequence of Hospital Community Experiences, physician assistant students were assigned experiences with Answer Health on Demand, a telemedicine collaborative. Preceding the experiences, the agency lectured on the application of telemedicine. Students were then introduced to the technology and partnered with a provider. Prior to observing the patient-provider interaction, patient consent was obtained. Afterwards, students completed a reflection paper on lessons learned and the potential impact of telemedicine on their careers. Thematic analysis was completed on the students' reflection papers (n=13). Prior to the lecture and experience, over 75% of students (10/13) were unaware of telemedicine. Several stated they were 'skeptical' about the effectiveness of 'impersonal' health care appointments. After the experience, all students remarked that telemedicine will play a large role in the future of healthcare and will provide benefits by improving access in rural areas, decreasing wait times, and reducing costs. More importantly, 30% of students (4/13) commented that telemedicine is a technology they can see themselves using in their future practice. Initial results indicate that collaborative interaction between students and telemedicine providers enhanced student learning and exposed students to technological advances in the delivery of care. Further, the results indicate that students perceived telemedicine more favorably as a viable delivery method after the experience.

Keywords: collaboration, physician assistant education, teaching innovative health care delivery method, telemedicine

Procedia PDF Downloads 179
13322 Self-Supervised Attributed Graph Clustering with Dual Contrastive Loss Constraints

Authors: Lijuan Zhou, Mengqi Wu, Changyong Niu

Abstract:

Attributed graph clustering can utilize graph topology and node attributes to uncover hidden community structures and patterns in complex networks, aiding the understanding and analysis of complex systems. Using contrastive learning for attributed graph clustering can effectively exploit meaningful implicit relationships between data points. However, existing attributed graph clustering methods based on contrastive learning suffer from the following drawbacks: 1) complex data augmentation increases computational cost, and inappropriate data augmentation may lead to semantic drift; 2) the selection of positive and negative samples neglects the intrinsic cluster structure learned from graph topology and node attributes. Therefore, this paper proposes a method called self-supervised Attributed Graph Clustering with Dual Contrastive Loss constraints (AGC-DCL). Firstly, Siamese Multilayer Perceptron (MLP) encoders are employed to generate two views separately, avoiding complex data augmentation. Secondly, a neighborhood contrastive loss is introduced to constrain the node representations using the local topological structure while effectively embedding attribute information through attribute reconstruction. Additionally, a clustering-oriented contrastive loss is applied to exploit the clustering information contained in the global semantics for discriminative node representations, regarding the cluster centers from the two views as negative samples to leverage effective clustering information from different views. Comparative clustering results against existing attributed graph clustering algorithms on six datasets demonstrate the superiority of the proposed method.
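A minimal sketch of the dual-loss idea, under stated assumptions: two MLP encoders produce two views without data augmentation, a node-level InfoNCE term stands in for the neighborhood contrastive loss, and a center-level term stands in for the clustering-oriented loss. The attribute-reconstruction branch and the exact positive/negative definitions of AGC-DCL are omitted, and all dimensions and the use of the first few embeddings as "centers" are illustrative.

```python
import torch
import torch.nn.functional as F

class TwoViewEncoder(torch.nn.Module):
    """Two separate MLP encoders generating two views of the node attributes
    without explicit data augmentation; dimensions are illustrative."""
    def __init__(self, in_dim, hid_dim=64, out_dim=32):
        super().__init__()
        def mlp():
            return torch.nn.Sequential(
                torch.nn.Linear(in_dim, hid_dim), torch.nn.ReLU(),
                torch.nn.Linear(hid_dim, out_dim))
        self.enc1, self.enc2 = mlp(), mlp()

    def forward(self, x):
        return self.enc1(x), self.enc2(x)

def node_contrastive_loss(z1, z2, tau=0.5):
    """InfoNCE-style term: the same node in the other view is the positive,
    all other nodes act as negatives (a stand-in for the neighborhood loss)."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return F.cross_entropy(logits, labels)

def cluster_contrastive_loss(c1, c2, tau=0.5):
    """Clustering-oriented term: matching cluster centers across views are
    positives, the remaining centers act as negatives."""
    c1, c2 = F.normalize(c1, dim=1), F.normalize(c2, dim=1)
    logits = c1 @ c2.t() / tau
    labels = torch.arange(c1.size(0))
    return F.cross_entropy(logits, labels)

# Toy run on random node attributes; a real model would derive the centers
# from soft cluster assignments and also reconstruct the attributes.
x = torch.randn(100, 16)
enc = TwoViewEncoder(16)
z1, z2 = enc(x)
loss = node_contrastive_loss(z1, z2) + cluster_contrastive_loss(z1[:5], z2[:5])
loss.backward()
print("dual contrastive loss:", float(loss))
```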

Keywords: attributed graph clustering, contrastive learning, clustering-oriented, self-supervised learning

Procedia PDF Downloads 20