Search results for: conventional techniques
8012 Application of GIS Techniques for Analysing Urban Built-Up Growth of Class-I Indian Cities: A Case Study of Surat
Authors: Purba Biswas, Priyanka Dey
Abstract:
Worldwide, rapid urbanisation has accelerated city expansion in both developed and developing nations. This unprecedented urbanisation trend, driven by increasing population and economic growth, has created challenges for decision-makers in city planning and urban management. Metropolitan cities, class-I towns, and major urban centres undergo a continuous process of evolution due to the interaction between socio-cultural and economic attributes. This constant evolution leads to urban expansion in all directions. Understanding the patterns and dynamics of urban built-up growth is crucial for policymakers, urban planners, and researchers, as it aids resource management, decision-making, and the development of sustainable strategies to address the complexities associated with rapid urbanisation. Identifying spatio-temporal patterns of urban growth has emerged as a crucial challenge in monitoring and assessing present and future trends in urban development. Analysing urban growth patterns and tracking changes in land use is an important aspect of urban studies. This study analyses spatio-temporal urban transformations and land-use and land-cover changes using remote sensing and GIS techniques. Built-up growth analysis was carried out for the city of Surat as a case example, using the Normalised Difference Built-up Index (NDBI) and GIS models of the Built-up Urban Density Index and the Shannon Entropy Index to identify trends and the geographical direction of transformation from 2005 to 2020. Surat is one of the fastest-growing urban centres in both the state and the nation, ranking as the 4th fastest-growing city globally. This study analyses the dynamics of urban built-up area transformations both zone-wise and by geographical direction, calculating their trend, rate, and magnitude over the 15-year period.
This study also highlights the need for analysing and monitoring the urban growth pattern of class-I cities in India using spatio-temporal and quantitative techniques like GIS for improved urban management.
Keywords: urban expansion, built-up, geographic information system, remote sensing, Shannon’s entropy
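The Shannon Entropy Index used in sprawl analyses like this one reduces zone-wise built-up areas to a single dispersion measure: values near 1 indicate dispersed (sprawling) growth, values near 0 compact growth. A minimal sketch with invented zone areas (the study's actual data and zoning are not reproduced here):

```python
import math

def shannon_entropy(built_up_areas):
    """Relative Shannon entropy of built-up area across n zones."""
    total = sum(built_up_areas)
    # p_i: share of the total built-up area falling in zone i
    probs = [a / total for a in built_up_areas if a > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(len(built_up_areas))  # normalise by ln(n)

# Hypothetical zone-wise built-up areas (km^2) for two epochs:
# the 2020 distribution is more even, i.e. growth has dispersed
h_2005 = shannon_entropy([12.0, 3.5, 8.2, 1.1, 6.7, 2.4, 9.9, 4.2])
h_2020 = shannon_entropy([15.3, 9.8, 12.1, 7.4, 11.0, 8.6, 13.2, 10.5])
```

Comparing the normalised entropy across epochs (here, a rising value) is what reveals the trend toward dispersed growth.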
Procedia PDF Downloads 72
8011 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during the interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved classification accuracies of 80% for interaction events and 95% for non-interaction events. In summary, we have explored a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated into an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatching, fighting).
Keywords: motion detection, motion tracking, trajectory analysis, video surveillance
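The occlusion-bridging role of the Kalman filter described above can be sketched with a constant-velocity model: while the detector loses the object, the prediction step alone carries the trajectory forward, so the track stays complete. All matrices, noise levels, and the simulated occlusion window below are illustrative assumptions, not the paper's actual parameters:

```python
import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],   # state transition; state = [x, y, vx, vy]
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],    # we only observe the (x, y) centroid
              [0, 1, 0, 0]], float)
Q = np.eye(4) * 1e-2           # process noise covariance
R = np.eye(2) * 1.0            # measurement noise covariance

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

x = np.array([0.0, 0.0, 1.0, 0.5])       # initial state estimate
P = np.eye(4)
track = []
for t in range(10):
    x, P = predict(x, P)
    # object moves along (t+1, 0.5*(t+1)); frames 4..6 are "occluded"
    z = None if 4 <= t <= 6 else np.array([t + 1.0, 0.5 * (t + 1)])
    if z is not None:                    # measurement available
        x, P = update(x, P, z)
    track.append(x[:2].copy())           # trajectory stays complete
```

During the occluded frames the predicted positions fill the gap, which is exactly what lets the downstream trajectory-derivative analysis run on an unbroken track.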
Procedia PDF Downloads 548
8010 Integral Image-Based Differential Filters
Authors: Kohei Inoue, Kenji Hara, Kiichi Urahama
Abstract:
We describe a relationship between integral images and differential images. First, we derive a simple difference filter from the conventional integral image. In the derivation, we show that an integral image and the corresponding differential image are related to each other by simultaneous linear equations, where the numbers of unknowns and equations are the same; therefore, we can execute the integration and differentiation by solving the simultaneous equations. We applied this relationship to an image fusion problem and experimentally verified the effectiveness of the proposed method.
Keywords: integral images, differential images, differential filters, image fusion
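The integral image the abstract builds on obeys the standard four-lookup box-sum identity, from which difference (box) filters follow by subtracting adjacent box sums. A sketch of that identity (the paper's actual derivation via simultaneous linear equations is not reproduced):

```python
import numpy as np

def integral_image(img):
    # S[i, j] = sum of img[:i, :j]; padded with a zero row and column
    S = np.zeros((img.shape[0] + 1, img.shape[1] + 1), img.dtype)
    S[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return S

def box_sum(S, r0, c0, r1, c1):
    # Sum over img[r0:r1, c0:c1] from just four integral-image lookups
    return S[r1, c1] - S[r0, c1] - S[r1, c0] + S[r0, c0]

img = np.arange(16, dtype=np.int64).reshape(4, 4)
S = integral_image(img)
# A crude horizontal difference filter: right box minus left box
diff = box_sum(S, 0, 2, 4, 4) - box_sum(S, 0, 0, 4, 2)
```

Because every box sum costs four lookups regardless of box size, filters built this way run in constant time per pixel.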
Procedia PDF Downloads 506
8009 Design of an Ensemble Learning Behavior Anomaly Detection Framework
Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia
Abstract:
Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts. They are mainly individuals with legitimate access to company information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to effectively counter the threat of malicious user activity. In this paper, we present a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context. We present test results for several detection methods on a representative access control dataset. Some of the explored classifiers achieve up to 99% accuracy.
Keywords: cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing
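An ensemble classifier of the kind described can be sketched with scikit-learn. The access-control dataset is not public, so the three features below (logins per day, resources touched, off-hours activity ratio) and the separation between "normal" and "malicious" users are invented stand-ins:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for per-user access-control features:
# [logins/day, distinct resources touched, off-hours ratio]
normal = rng.normal([5, 10, 0.05], [2, 4, 0.05], size=(500, 3))
malicious = rng.normal([20, 60, 0.6], [5, 15, 0.2], size=(50, 3))
X = np.vstack([normal, malicious])
y = np.array([0] * 500 + [1] * 50)       # 1 = anomalous behavior
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# Soft-voting ensemble over heterogeneous base classifiers
ensemble = VotingClassifier([
    ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(max_depth=5, random_state=0)),
], voting="soft")
ensemble.fit(Xtr, ytr)
acc = ensemble.score(Xte, yte)
```

Soft voting averages the base classifiers' predicted probabilities, which typically smooths out the individual models' mistakes; this is one common realisation of the ensemble idea, not the paper's exact pipeline.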
Procedia PDF Downloads 128
8008 Metal Binding Phage Clones in a Quest for Heavy Metal Recovery from Water
Authors: Tomasz Łęga, Marta Sosnowska, Mirosława Panasiuk, Lilit Hovhannisyan, Beata Gromadzka, Marcin Olszewski, Sabina Zoledowska, Dawid Nidzworski
Abstract:
Toxic heavy metal ion contamination of industrial wastewater has recently become a significant environmental concern in many regions of the world. Although the majority of heavy metals are naturally occurring elements found on the earth's surface, anthropogenic activities such as mining and smelting, industrial production, and agricultural use of metals and metal-containing compounds are responsible for the majority of environmental contamination and human exposure. The permissible limits (ppm) for heavy metals in food, water, and soil are frequently exceeded, which is considered hazardous to humans, other organisms, and the environment as a whole. Human exposure to highly nickel-polluted environments causes a variety of pathologic effects. In 2008, nickel received the shameful name of “Allergen of the Year” (Gillette 2008). According to dermatologists, the frequency of nickel allergy is still growing, and it cannot be explained only by fashionable piercings and the nickel devices used in medicine (such as coronary stents and endoprostheses). Effective remediation methods for removing heavy metal ions from soil and water are becoming increasingly important. Methods such as chemical precipitation, micro- and nanofiltration, membrane separation, conventional coagulation, electrodialysis, ion exchange, reverse and forward osmosis, photocatalysis, and polymer or carbon nanocomposite absorbents have all been investigated so far. The importance of environmentally sustainable industrial production processes and the conservation of dwindling natural resources has highlighted the need for affordable, innovative biosorptive materials capable of recovering specific chemical elements from dilute aqueous solutions.
The use of combinatorial phage display techniques for selecting and recognizing material-binding peptides with a selective affinity for any target, particularly inorganic materials, has gained considerable interest in the development of advanced bio- and nano-materials. However, due to the limitations of phage display libraries and the biopanning process, the accuracy of molecular recognition for inorganic materials remains a challenge. This study presents the isolation, identification and characterisation of metal-binding phage clones that preferentially recover nickel.
Keywords: heavy metal recovery, cleaning water, phage display, nickel
Procedia PDF Downloads 99
8007 Segmentation of Liver Using Random Forest Classifier
Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir
Abstract:
Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation is performed to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient liver shape and size variability. In this paper, we present a technique for automatically segmenting the liver from CT images using a random forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the random forest classifier provides better segmentation results with respect to accuracy and speed. We validated our results using various techniques, obtaining above 89% accuracy in all cases.
Keywords: CT images, image validation, random forest, segmentation
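Random-forest segmentation of this kind is typically cast as per-pixel classification. A sketch on a synthetic "slice" (the bright ellipse, intensity values, and intensity-plus-position features below are invented stand-ins for the paper's CT data and feature set):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic 64x64 "CT slice": a bright elliptical "liver" on a darker
# background, with additive noise
yy, xx = np.mgrid[0:64, 0:64]
mask = (((yy - 30) / 18.0) ** 2 + ((xx - 28) / 14.0) ** 2) <= 1.0
image = (np.where(mask, 120.0, 40.0)
         + np.random.default_rng(0).normal(0, 10, (64, 64)))

# Per-pixel feature vector: intensity plus (row, col) position
features = np.column_stack([image.ravel(), yy.ravel(), xx.ravel()])
labels = mask.ravel().astype(int)

# Train on half the pixels, predict the rest (a stand-in for training
# on annotated slices and segmenting unseen ones)
idx = np.random.default_rng(1).permutation(64 * 64)
train, test = idx[:2048], idx[2048:]
clf = RandomForestClassifier(n_estimators=30, random_state=0)
clf.fit(features[train], labels[train])
accuracy = (clf.predict(features[test]) == labels[test]).mean()
```

Real pipelines would use richer texture features and train across patients, but the per-pixel classify-then-reshape structure is the same.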
Procedia PDF Downloads 313
8006 Application of Compressed Sensing Method for Compression of Quantum Data
Authors: M. Kowalski, M. Życzkowski, M. Karol
Abstract:
Current quantum key distribution (QKD) systems offer low bit rates, up to single megabits per second. Compared to conventional optical fiber links with multi-gigabit bitrates, the parameters of recent QKD systems are significantly lower. In this article, we present the concept of applying the compressed sensing method to the compression of quantum information. The compression methodology, the signal reconstruction method, and initial results on improving the throughput of a quantum information link are presented.
Keywords: quantum key distribution systems, fiber optic system, compressed sensing
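The compressed sensing principle, recovering a sparse signal from far fewer linear measurements than its length, can be sketched with orthogonal matching pursuit. This is a generic illustration under invented dimensions and a random Gaussian sensing matrix; the paper's actual quantum-data compression and reconstruction details are not reproduced:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 128, 64, 4          # signal length, measurements, sparsity
x = np.zeros(n)
support = rng.choice(n, k, replace=False)
# k-sparse signal with coefficients of magnitude >= 1
x[support] = rng.choice([-1.0, 1.0], k) * (1.0 + rng.random(k))

A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random sensing matrix
y = A @ x                                   # m << n measurements

# Greedy sparse reconstruction from the compressed measurements
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(A, y)
x_hat = omp.coef_
error = np.linalg.norm(x_hat - x) / np.linalg.norm(x)
```

The signal is recovered from half as many measurements as samples, which is the throughput gain compressed sensing offers when the underlying data admit a sparse representation.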
Procedia PDF Downloads 694
8005 Developing a Cause-Effect Model of Urban Resilience versus Flood in Karaj City Using TOPSIS and Shannon Entropy Techniques
Authors: Mohammad Saber Eslamlou, Manouchehr Tabibian, Mahta Mirmoghtadaei
Abstract:
The history of urban development and the increasing complexities of urban life have long been intertwined with different natural and man-made disasters. Sometimes, these unpleasant events have destroyed cities forever. The growth of the urban population and the increase of social and economic resources in cities have increased the importance of developing a holistic approach to dealing with unknown urban disasters. As a result, interest in resilience has increased in most scientific fields, and the urban planning literature has been enriched with studies of the social, economic, infrastructural, and physical abilities of cities. In this regard, different conceptual frameworks and patterns have been developed focusing on dimensions of resilience and different kinds of disasters. As the most frequent and likely natural disaster in Iran is flooding, the present study aims to develop a cause-effect model of urban resilience against floods in Karaj City. In this theoretical study, desk research and documentary studies were used to find the elements and dimensions of urban resilience. In this regard, 6 dimensions and 32 elements were found for urban resilience, and a questionnaire was designed considering the requirements of the TOPSIS technique (pairwise comparison). The sample of the research consisted of 10 participants: faculty members, academicians, board members of research centers, managers of the Ministry of Road and Urban Development, board members of the New Towns Development Company, and experts and practitioners of consulting companies with scientific and research backgrounds. The data gathered in this survey were analyzed using the TOPSIS and Shannon entropy techniques. The results show that the Infrastructural/Physical, Social, Organizational/Institutional, Structural/Physical, Economic, and Environmental dimensions are the most effective factors in urban resilience against floods in Karaj, in that order.
Finally, a comprehensive model and a systematic framework of the factors that affect the urban resilience of Karaj against floods were developed. This cause-effect model shows how different factors are related and influence each other, based on their connected structure and preferences.
Keywords: urban resilience, TOPSIS, Shannon entropy, cause-effect model of resilience, flood
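The TOPSIS step used in the study follows a standard recipe: vector-normalise the decision matrix, weight it, then score each alternative by its relative closeness to the positive-ideal solution. A minimal sketch; the scores, weights, and number of alternatives below are invented, not the study's survey data:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness coefficients for a TOPSIS decision matrix.

    matrix: alternatives x criteria scores; weights should sum to 1;
    benefit[j] is True if criterion j is better when larger.
    """
    M = matrix / np.linalg.norm(matrix, axis=0)    # vector-normalise columns
    V = M * weights                                # weighted normalised matrix
    ideal = np.where(benefit, V.max(0), V.min(0))  # positive-ideal solution
    nadir = np.where(benefit, V.min(0), V.max(0))  # negative-ideal solution
    d_pos = np.linalg.norm(V - ideal, axis=1)      # distance to ideal
    d_neg = np.linalg.norm(V - nadir, axis=1)      # distance to nadir
    return d_neg / (d_pos + d_neg)                 # closeness in [0, 1]

# Hypothetical scores of three resilience dimensions on four criteria
scores = np.array([[7.0, 9.0, 9.0, 8.0],
                   [8.0, 7.0, 8.0, 7.0],
                   [9.0, 6.0, 8.0, 9.0]])
weights = np.array([0.3, 0.3, 0.2, 0.2])   # e.g. derived via Shannon entropy
cc = topsis(scores, weights, np.array([True, True, True, True]))
ranking = np.argsort(-cc)                  # best alternative first
```

Pairing TOPSIS with Shannon-entropy-derived weights, as the study does, simply means `weights` is computed from the dispersion of each criterion's column rather than assigned by hand.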
Procedia PDF Downloads 58
8004 Intensified Electrochemical H₂O₂ Synthesis and Highly Efficient Pollutant Removal Enabled by Nickel Oxides with Surface Engineered Facets and Vacancies
Authors: Wenjun Zhang, Thao Thi Le, Dongyup Shin, Jong Min Kim
Abstract:
Electrochemical hydrogen peroxide (H₂O₂) synthesis holds significant promise for decentralized environmental remediation through the electro-Fenton process. However, challenges persist, such as the absence of robust electrocatalysts for the selective two-electron oxygen reduction reaction (2e⁻ ORR) and the high cost and sluggish kinetics of conventional electro-Fenton systems in treating highly concentrated wastewater. This study introduces an efficient water treatment system for removing substantial quantities of organic pollutants using an advanced electro-Fenton system coupled with a high-valent NiO catalyst. By employing a precipitation method involving crystal facet and cation vacancy engineering, a trivalent Ni (Ni³⁺)-rich NiO catalyst with a (111)-domain-exposed crystal facet, named {111}-NivO, was synthesized. This catalyst exhibited a remarkable 96% selectivity and a high mass activity of 59 A g⁻¹ for H₂O₂ production, outperforming all previously reported Ni-based catalysts. Furthermore, an advanced electro-Fenton system, integrated with a flow cell for electrochemical H₂O₂ production, was utilized to achieve 100% removal of 50 ppm bisphenol A (BPA) in 200 mL of wastewater under heavy-duty conditions, reaching a superior rapid degradation rate (4 min, k = 1.125 min⁻¹), approximately 102 times faster than the conventional electro-Fenton system. The hyper-efficiency is attributed to the continuous and appropriate supply of H₂O₂, the provision of O₂, and the timely recycling of the electrolyte under high current density operation. This catalyst also demonstrated a 93% removal of total organic carbon after 2 hours of operation and can be applied for efficient removal of highly concentrated phenol pollutants from aqueous systems, which opens new avenues for wastewater treatment.
Keywords: hydrogen peroxide production, nickel oxides, crystal facet and cation vacancy engineering, wastewater treatment, flow cell, electro-Fenton
Procedia PDF Downloads 59
8003 A Compact Ultra-Wide Band Antenna with C-Shaped Slot for WLAN Notching
Authors: Maryam Rasool, Farhan Munir, Fahad Nawaz, Saad Ahmad
Abstract:
A patch antenna operating in the ultra-wide band of frequencies (3.1 GHz – 10.6 GHz) is designed with enhanced protection from interference from other applications by incorporating a notching technique. Patch antennas in the ultra-wide band are becoming widely popular due to their low power, light weight, and high data rate capability. A microstrip patch antenna's patch can be altered to increase its bandwidth and introduce UWB character into it. The designed antenna is a patch antenna consisting of a conductive sheet of metal mounted over a large sheet of metal called the ground plane, with a substrate separating the two. The notched bands are public safety WLAN, WLAN, and FSS. Different techniques used to implement the UWB antenna were individually implemented, and their results were examined. A V-shaped patch was then chosen and modified into an arrow-shaped patch to give optimized results over the entire UWB region with considerable return loss. The frequency notch prevents the operation of the antenna in a particular range of frequencies, hence minimizing interference from other systems. There are countless techniques for introducing the notch, but we have used inverted C-shaped slots in the UWB patch to obtain the notch characteristics, as well as wavelength resonators to introduce a notch in the UWB band. The designed antenna is simulated in High Frequency Structural Simulator (HFSS) 13.0 by Ansoft.
Keywords: HFSS, notch, UWB, WLAN
Procedia PDF Downloads 417
8002 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases
Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar
Abstract:
The farming community in India, as well as in other parts of the world, is one of the most highly stressed communities, due to increasing input costs (seeds, fertilizers, pesticides), droughts, and reduced revenue leading to farmer suicides. The lack of an integrated farm advisory system in India adds to the farmers' problems. Farmers need the right information during the early stages of the crop lifecycle to prevent damage and loss of revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases using images taken by farmers with their smartphones. The research work leads to building a smart assistant using analytics and big data that could help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, ImageNet classification using convolutional neural networks (CNNs) has been successfully used to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers, and dropout (to avoid overfitting). The models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to adapt the weights learnt on the ImageNet dataset to crop diseases, which reduces the number of epochs needed to learn. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift, and blurring are used to improve accuracy with images taken from farms. Models built using a combination of these techniques are more robust for deployment in the real world. Our model is validated using the tomato crop. In India, tomato is affected by 10 different diseases. Our model achieves an accuracy of more than 95% in correctly classifying the diseases.
The main contribution of our research is to create a personal assistant for farmers for managing plant disease; although the model was validated using the tomato crop, it can be easily extended to other crops. Advances in computing technology and the availability of large datasets have made possible the success of deep learning applications in computer vision, natural language processing, image recognition, etc. With these robust models and huge smartphone penetration, the feasibility of implementing these models is high, resulting in timely advice to farmers and thus increasing farmers' income and reducing input costs.
Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one shot learning, transfer learning
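The augmentation step described above (rotation, shift, zoom, blur) can be illustrated framework-free: each training photo yields several transformed variants, multiplying the effective dataset collected from farms. The transformations below are simplified stand-ins for the augmentation pipelines deep learning frameworks typically provide:

```python
import numpy as np

def augment(image):
    """Return the image plus rotated, shifted, and blurred variants."""
    variants = [image]
    variants.append(np.rot90(image))                  # 90-degree rotation
    variants.append(np.roll(image, shift=2, axis=1))  # horizontal shift
    # 3x3 mean blur (same shape; edges handled by edge padding)
    padded = np.pad(image, 1, mode="edge")
    h, w = image.shape
    blurred = sum(padded[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    variants.append(blurred)
    return variants

# A stand-in for one grayscale leaf photo taken on a farm
leaf = np.random.default_rng(0).random((32, 32))
batch = augment(leaf)
```

Training on such variants teaches the network that a diseased leaf photographed at a different angle, framing, or focus is still the same disease, which is why augmentation improves robustness on field images.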
Procedia PDF Downloads 120
8001 Vertical and Lateral Vibration Analysis of Conventional Elevator
Authors: Mohammadreza Saviz, Sina Najafian
Abstract:
This paper presents an analytical study of the vibration of a moving elevator and develops a 2D dynamic model of the elevator to evaluate its vertical and lateral motion. Most elevators in tall buildings include compensating ropes to maintain balanced rope tension between the car and the counterweight. The elasticity of these ropes and of the spring sets that connect the cabin to the ropes causes the elevator car to vibrate. A two-dimensional model is derived to calculate vibrations and displacements. The simulation results were validated against the results of similar works.
Keywords: elevator, vibration, simulation, analytical solution, 2D modeling
Procedia PDF Downloads 305
8000 Methodology: A Review in Modelling and Predictability of Embankment in Soft Ground
Authors: Bhim Kumar Dahal
Abstract:
Transportation network development in developing countries is proceeding at a rapid pace. The majority of such networks are railways and expressways, which pass through diverse topography, landforms, and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper focuses on the various advanced ground improvement techniques used to improve soft soil, the modelling approaches, and their predictability for embankment construction. Ground improvement techniques can be broadly classified into three groups, i.e., the densification group, the drainage and consolidation group, and the reinforcement group, which are discussed with some case studies. Various methods have been used in the modelling of embankments, from simple 1-dimensional to complex 3-dimensional models using a variety of constitutive models. However, the reliability of the predictions does not improve systematically with the level of sophistication, and sometimes the predictions deviate by more than 60% from the monitored values despite the same level of sophistication. This deviation is mainly due to the selection of the constitutive model, assumptions made during different stages, deviations in the selection of model parameters, and simplifications made during physical modelling of the ground conditions. The deviation can be reduced by using optimization processes, optimization tools, and sensitivity analysis of the model parameters, which guide the selection of appropriate model parameters.
Keywords: cement, improvement, physical properties, strength
Procedia PDF Downloads 174
7999 Noise Barrier Technique as a Way to Improve the Sonic Urban Environment along Existing Roadways Assessment: El-Gish Road Street, Alexandria, Egypt
Authors: Nihal Atif Salim
Abstract:
To improve the quality of life in cities, a variety of interventions are used. Noise is a substantial and important form of pollution that has a negative impact on the urban environment and human health. According to a complaint survey conducted by the EEAA in 2019, it ranks second among environmental contamination complaints. The most significant source of noise in the city is traffic noise. To improve the sonic urban environment, many physical techniques are applied; in the local area, noise barriers are considered one of the most appropriate physical techniques along existing traffic routes. Alexandria is Egypt's second-largest city after Cairo. It is located along the Mediterranean Sea, and El-Gish Road is one of the city's main arteries; its high level of traffic noise affects the waterfront promenade that extends along the city. The purpose of this paper is to clarify the design considerations for the most appropriate noise barrier type along the promenade, with the goal of improving quality of life (QOL) and the sonic urban environment specifically. The proposed methodology first focuses on how noise affects human perception and the environment, then delves into the various physical noise control approaches, then discusses sustainable design decision-making, and finally looks into the importance of incorporating sustainability into design decisions. The case study follows three stages. The first stage involves a site inspection and the use of sound measurement equipment (a noise level meter) to measure the noise level at many sites along the promenade; the findings are shown on a noise map. The second stage is to survey the site's users about their experience. The third stage is to investigate the various types of noise barriers and their effects on QOL along existing routes in order to select the most appropriate type.
The goal of this research is to evaluate a suitable noise barrier design that satisfies environmental and social requirements while maintaining a balanced approach to the noise issue, in order to improve QOL along existing roadways in the local area.
Keywords: noise pollution, sonic urban environment, traffic noise, noise barrier, acoustic sustainability, noise reduction techniques
Procedia PDF Downloads 138
7998 A Wide View Scheme for Automobile's Black Box
Authors: Jaemyoung Lee
Abstract:
We propose a wide-view camera scheme for an automobile's black box. The proposed scheme uses commercially available camera lenses whose view angles are about 120°. In the proposed scheme, we extend the view angle to approximately 200° using two cameras at the front side, instead of the three lenses used in conventional black boxes.
Keywords: camera, black box, view angle, automobile
Procedia PDF Downloads 413
7997 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach
Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika
Abstract:
Both society and education teach good communication in order to build up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or poor understanding. Poor understanding produces language errors when interactions take place between people meeting for the first time who do not know each other beforehand because of the distance between them. The movie “The Space between Us” delivers a love-adventure story between a Mars boy and an Earth girl, with many failed conversations caused by their different climates and environments. Moviegoers must also focus on the subtitles in order to fully enjoy the movie. Furthermore, the Indonesian subtitles and the English conversation in the movie still show overlapping understanding in the translation. Translation here consists of the source language, SL (the English conversation), and the target language, TL (the Indonesian subtitles). This research gap is formulated into the research question of how the language errors occur in the movie and what their effects on translation quality are, analyzed in depth through a translation study with a discourse analysis approach. The research goal is to examine the language errors and their translation qualities in order to create a good atmosphere in movie media. The study uses an embedded qualitative research design. The research locations consist of the setting, participants, and events as a focused, determined boundary. The sources of data are the movie “The Space between Us” and informants (translation quality raters). The sampling is criterion-based (purposive) sampling. Data collection techniques use content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis comprises domain, taxonomy, componential, and cultural theme analysis. The findings show that the language errors occurring in the movie are referential, register, society, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors.
The discussion of their effects on translation quality concentrates on the translation techniques found in the data: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.
Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments
Procedia PDF Downloads 219
7996 Climate Change and Soil Degradation in the Czech Republic
Authors: Miroslav Dumbrovsky
Abstract:
The paper deals with the impacts of climate change, with the main emphasis on land degradation and on agriculture and forestry management in the landscape. Land degradation due to the adverse effects of farming activities, resulting from inappropriate conventional technologies, was a major issue in the Czech Republic during the 20th century and will remain to be solved in the 21st century. The importance of land degradation is very high because of its impact on crop productivity and its many other adverse effects. Land degradation through soil degradation causes losses in crop productivity and in the quality of the environment, by decreasing the quality of soil and water (especially water resources). The negative effects of conventional farming practices include increased water erosion, as well as crusting and compaction of the topsoil and subsoil. Soil erosion caused by water destroys the soil's structure and reduces crop productivity through the deterioration of physical and chemical soil properties such as infiltration rate and water-holding capacity, the loss of nutrients needed for crop production, and the loss of soil carbon. Water erosion occurs on fields with row crops (maize, sunflower), especially during the rainfall period from April to October. Recently, the greatly expanded production of biofuels and bioenergy from field crops has become a serious problem, resulting in accelerated soil degradation; the damages (on- and off-site) are greater than the benefits. Effective soil conservation requires an appropriate, complex system of measures in the landscape. It is also important to continue developing new, sophisticated methods and technologies for decreasing land degradation. Any system of soil conservation addressing land degradation depends on the ability and the willingness of land users to apply it. Land degradation is thus not just a technical issue but also an economic and political one.
From a technical point of view, we have already made many positive steps, but to successfully solve the problem of land degradation, it is necessary to develop suitable economic and political tools to increase the willingness and ability of land users to adopt conservation measures.
Keywords: land degradation, soil erosion, soil conservation, climate change
Procedia PDF Downloads 375
7995 A Seven-Year Single-Centre Study of Dental Implant Survival in Head and Neck Oncology Patients
Authors: Sidra Suleman, Maliha Suleman, Stephen Brindley
Abstract:
Oral rehabilitation of head and neck cancer patients plays a crucial role in the quality of life of such individuals post-treatment. Placement of dental implants or implant-retained prostheses can help restore oral function and aesthetics, which are often compromised following surgery. Conventional prosthodontic techniques can be insufficient for rehabilitating such patients due to their altered anatomy and reduced oral competence. Hence, there is a strong clinical need for the placement of dental implants. With an increasing incidence of head and neck cancer, the demand for such treatment is rising. Aim: The aim of the study was to determine the survival rate of dental implants in head and neck cancer patients placed at the Restorative and Maxillofacial Department, Royal Stoke University Hospital (RSUH), United Kingdom. Methodology: All patients who received dental implants between January 1, 2013 and December 31, 2020 were identified. Patients were excluded based on three criteria: 1) non-head and neck cancer patients, 2) no outpatient follow-up post-implant placement, 3) provision of non-dental implants. Scanned paper notes and electronic records were extracted and analyzed. Implant survival was defined as fixtures remaining in situ without requiring removal. Sample: Overall, 61 individuals were recruited from the 143 patients identified. The mean age was 64.9 years, with a range of 35–89 years. The sample included 37 (60.7%) males and 24 (39.3%) females. In total, 211 implants were placed, of which 40 (19.0%) were in the maxilla, 152 (72.0%) in the mandible and 19 (9.0%) in autogenous bone graft sites. Histologically, 57 (93.4%) patients had squamous cell carcinoma, with 43 (70.5%) patients having either stage IVA or IVB disease. As part of treatment, 42 (68.9%) patients received radiotherapy, which was carried out post-operatively in 29 (69.0%) cases, whereas 21 (34.4%) patients underwent chemotherapy, 13 (61.9%) of which were post-operative.
The median follow-up period was 21.9 months, with a range of 0.9 – 91.4 months. During the study, 23 (37.7%) patients died, and their data were censored beyond the date of death. Results: In total, four patients who had received radiotherapy had one implant failure each. Two mandibular implants failed secondary to osteoradionecrosis, and two maxillary implants did not survive as a result of failure to osseointegrate. The overall implant survival rates were 99.1% at three years and 98.1% at both 5 and 7 years. Conclusions: Although these data show that implant failure rates are low, they highlight the difficulty in predicting which patients will be affected. Future studies involving larger cohorts are warranted to further analyze factors affecting outcomes.
Keywords: oncology, dental implants, survival, restorative
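The censoring scheme described above corresponds to a standard Kaplan-Meier survival analysis. As a hedged illustration (the small cohort below is synthetic, not the study's patient data), a minimal pure-Python estimator might look like:

```python
# Sketch of the Kaplan-Meier estimator with right-censoring, the kind of
# analysis behind the reported 3-, 5- and 7-year implant survival rates.
# The input data here are illustrative, NOT the study's records.

def kaplan_meier(times, events):
    """times: follow-up in months; events: 1 = implant failure, 0 = censored.
    Returns (time, survival) pairs at each observed failure time."""
    pairs = sorted(zip(times, events))      # ascending follow-up time
    n_at_risk = len(pairs)
    survival = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        failures = sum(1 for tt, e in pairs if tt == t and e == 1)
        removed = sum(1 for tt, e in pairs if tt == t)
        if failures:
            survival *= (n_at_risk - failures) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed                # failures and censored leave risk set
        i += removed
    return curve

# Illustrative cohort: 10 implants, one failure at 12 months, rest censored.
times  = [12, 24, 24, 36, 36, 48, 48, 60, 72, 90]
events = [1,  0,  0,  0,  0,  0,  0,  0,  0,  0]
print(kaplan_meier(times, events))
```

In practice a library such as lifelines would be used, but the estimator above shows how censored patients drop out of the risk set without counting as failures.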
Procedia PDF Downloads 234
7994 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing
Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero
Abstract:
Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for works of art. The surplus value derived from the prestige of the author, the exclusivity of the product, or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops has been able to favor their permanence by offering advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. It is possible to use pieces modeled by computer and made with FDM (Fused Deposition Modeling) 3D printers that use PLA (polylactic acid) in artistic casting procedures. However, models printed in PLA are limited to approximate minimum sizes of 3 cm, and the optimal layer height resolution is 0.1 mm. Due to these limitations, FDM is not the most suitable technology for artistic casting of smaller pieces. One alternative that overcomes the size limitation is Selective Laser Sintering (SLS); another is Direct Metal Laser Sintering (DMLS), in which a laser hardens metal powder layer by layer. However, due to their high cost, these technologies are difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolution for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes) and can print models with castable resins that allow subsequent direct artistic casting in precious metals or their adaptation to processes such as electroforming.
In this work, the design of a DLP 3D printer is detailed, using backlit LCD screens with ultraviolet light. Its development is totally open source, and it is proposed as a kit made up of electronic components based on Arduino and mechanical components that are easy to find on the market. The CAD files of its components can be manufactured on low-cost FDM 3D printers. The result is a printer costing less than 500 Euros, with high resolution and an open design with free access that allows not only its manufacture but also its improvement. In future works, we intend to carry out different comparative analyses that will allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with it.
Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming
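As a rough illustration of what the stated 0.02 mm Z-axis resolution implies for build time, the sketch below counts layers for a part of a given height; the per-layer exposure and lift times are assumed values for illustration, not specifications of this printer:

```python
# Back-of-the-envelope relation between DLP layer height and build time.
# exposure_s and lift_s are ASSUMED values, not measured on the printer.
import math

def layers_and_time(height_mm, layer_mm=0.02, exposure_s=8.0, lift_s=4.0):
    layers = math.ceil(height_mm / layer_mm)
    hours = layers * (exposure_s + lift_s) / 3600.0
    return layers, hours

# A 3 cm tall casting model at the printer's finest Z resolution:
layers, hours = layers_and_time(30.0)
print(layers, round(hours, 2))  # 1500 layers
```

Because DLP exposes a whole layer at once, build time scales with height and layer count, not with the cross-sectional area, unlike FDM.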
Procedia PDF Downloads 142
7993 Performance Evaluation and Economic Analysis of Minimum Quantity Lubrication with Pressurized/Non-Pressurized Air and Nanofluid Mixture
Authors: M. Amrita, R. R. Srikant, A. V. Sita Rama Raju
Abstract:
Water-miscible cutting fluids are conventionally used to lubricate and cool the machining zone, but issues related to health hazards, maintenance, and disposal costs have limited their usage, leading to the application of Minimum Quantity Lubrication (MQL). To increase the effectiveness of MQL, nanocutting fluids are proposed. In the present work, water-miscible nanographite cutting fluids of varying concentration are applied at the cutting zone by two systems, A and B. System A utilizes high-pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low-pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their machining performance is evaluated by measuring cutting temperatures, tool wear, cutting forces, and surface roughness, and compared with dry machining and flood machining. Application of nanocutting fluid using both systems showed better performance than dry machining. Cutting temperatures and cutting forces obtained with both techniques are higher than in flood machining, but tool wear and surface roughness showed improvement compared to flood machining. An economic analysis has been carried out in all cases to decide the applicability of the techniques.
Keywords: economic analysis, machining, minimum quantity lubrication, nanofluid
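The gap between the two MQL flow rates (1 ml/min and 5 ml/min, from the text) and flood cooling can be made concrete with a simple consumption estimate; the flood flow rate and shift length below are illustrative assumptions, not measurements from this study:

```python
# Fluid-consumption comparison behind the economic case for MQL.
# The 3 l/min flood rate and 8 h shift are ASSUMED for illustration.

def fluid_used_ml(flow_ml_per_min, minutes):
    return flow_ml_per_min * minutes

shift_min = 8 * 60
usage = {
    "system A (MQL, high-pressure air)": fluid_used_ml(1, shift_min),
    "system B (MQL, low-pressure air)":  fluid_used_ml(5, shift_min),
    "flood cooling (assumed 3 l/min)":   fluid_used_ml(3000, shift_min),
}
for name, ml in usage.items():
    print(f"{name}: {ml / 1000:.1f} litres per 8 h shift")
```

Even the higher-flow MQL system consumes orders of magnitude less fluid than flood cooling, which is where the maintenance and disposal savings originate.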
Procedia PDF Downloads 380
7992 A Dual-Mode Infinite Horizon Predictive Control Algorithm for Load Tracking in PUSPATI TRIGA Reactor
Authors: Mohd Sabri Minhat, Nurul Adilla Mohd Subha
Abstract:
The PUSPATI TRIGA Reactor (RTP), Malaysia, reached its first criticality on June 28, 1982, with a thermal power capacity of 1 MW. The Feedback Control Algorithm (FCA), a conventional Proportional-Integral (PI) controller, is the present power control method used to control the fission process in RTP. It is important to ensure that the core power is always stable and follows load tracking within an acceptable steady-state error and with minimum settling time to reach steady-state power. At present, the system cannot be considered well-posed in terms of power tracking performance, and there is still potential to improve it by developing the next generation of a novel nuclear core power control design. In this paper, a dual-mode prediction scheme, formulated as an Optimal Model Predictive Control (OMPC) problem, is presented in a state-space model to control the core power. The model for core power control was based on mathematical models of the reactor core, OMPC, and a control rod selection algorithm. The mathematical models of the reactor core were based on neutronic, thermal-hydraulic, and reactivity models. The dual-mode prediction in OMPC, covering the transient and terminal modes, was based on the implementation of a Linear Quadratic Regulator (LQR) in the core power control design. The combination of dual-mode prediction and a Lyapunov approach, which handles the summation in the cost function over an infinite horizon, is intended to eliminate some of the fundamental weaknesses of MPC. This paper shows the behaviour of OMPC in dealing with tracking, the regulation problem, disturbance rejection, and parameter uncertainty. The tracking and regulating performance of the conventional controller and of OMPC is compared by numerical simulations.
In conclusion, the proposed OMPC has shown significant performance in load tracking and in regulating core power for a nuclear reactor, with guaranteed closed-loop stability.
Keywords: core power control, dual-mode prediction, load tracking, optimal model predictive control
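The LQR used for the terminal mode of a dual-mode MPC can be sketched by iterating the discrete Riccati recursion to convergence; the scalar system below is a toy example, not the RTP reactor model:

```python
# Minimal sketch of the LQR terminal-mode design behind dual-mode MPC:
# iterate the discrete algebraic Riccati recursion for a scalar system
# x+ = a*x + b*u with stage cost q*x^2 + r*u^2. Numbers are illustrative.

def dlqr_scalar(a, b, q, r, iters=500):
    p = q                                   # initial cost-to-go guess
    for _ in range(iters):
        k = (b * p * a) / (r + b * p * b)   # feedback gain, u = -k*x
        p = q + a * p * a - a * p * b * k   # Riccati recursion
    return k, p

# An open-loop unstable plant (a = 1.1 > 1) stabilized by the LQR gain:
k, p = dlqr_scalar(a=1.1, b=1.0, q=1.0, r=1.0)
print(round(k, 4), abs(1.1 - k) < 1.0)     # closed-loop pole inside unit circle
```

In the dual-mode scheme, this gain governs the terminal (infinite-horizon) mode, while the transient mode optimizes the first few control moves freely; the converged `p` provides the Lyapunov terminal cost that sums the tail of the infinite horizon.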
Procedia PDF Downloads 162
7991 Rheolaser: Light Scattering Characterization of Viscoelastic Properties of Hair Cosmetics That Are Related to Performance and Stability of the Respective Colloidal Soft Materials
Authors: Heitor Oliveira, Gabriele De-Waal, Juergen Schmenger, Lynsey Godfrey, Tibor Kovacs
Abstract:
Rheolaser MASTER™ makes use of multiple scattering of light, caused by scattering objects in a continuous medium (such as droplets and particles in colloids), to characterize the viscoelasticity of soft materials. It offers an alternative to conventional rheometers for characterizing the viscoelasticity of products such as hair cosmetics. Up to six measurements at controlled temperature can be carried out simultaneously (10-15 min), and the method requires only minor sample preparation work. In contrast to conventional rheometer-based methods, no mechanical stress is applied to the material during the measurements. Therefore, the properties of the exact same sample can be monitored over time, as in aging and stability studies. We determined the elastic index (EI) of water/emulsion mixtures (1 ≤ fat alcohols (FA) ≤ 5 wt%) and emulsion/gel-network mixtures (8 ≤ FA ≤ 17 wt%) and compared it with the elastic/storage modulus (G') of the respective samples measured using a TA conventional rheometer with flat-plate geometry. As expected, it was found that log(EI) vs log(G') presents a linear behavior. Moreover, log(EI) increased in a linear fashion with solids level over the entire range of compositions (1 ≤ FA ≤ 17 wt%), while rheometer measurements were limited to samples down to a 4 wt% solids level; a concentric cylinder geometry would be required for more dilute samples (FA < 4 wt%), and rheometer results from different sample holder geometries are not comparable. Plots of the Rheolaser output parameter solid-liquid balance (SLB) vs EI were suitable for monitoring product aging processes. These data could quantitatively describe observations such as the formation of lumps over aging time. Moreover, this method allowed us to identify that different specifications of a key raw material (RM < 0.4 wt%) in the respective gel-network (GN) product have only a minor impact on the product's viscoelastic properties, not perceivable by the consumer after a short aging time.
Broadening of an RM spec range typically has a positive impact on cost savings. Furthermore, the photon path length (λ*), which is proportional to droplet size and inversely proportional to the volume fraction of scattering objects according to Mie theory, together with the EI, was suitable for characterizing product destabilization processes (e.g., coalescence and creaming) and for predicting product stability about eight times faster than our standard methods. Using these parameters, we could successfully identify formulation and process parameters that resulted in unstable products. In conclusion, Rheolaser allows quick and reliable characterization of the viscoelastic properties of hair cosmetics that are related to their performance and stability. It operates over a broad range of product compositions and has applications spanning from the formulation of our hair cosmetics to fast release criteria at our production sites. Last but not least, this powerful tool has a positive impact on R&D development time, enabling faster delivery of new products to the market and, consequently, cost savings.
Keywords: colloids, hair cosmetics, light scattering, performance and stability, soft materials, viscoelastic properties
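The reported linear log(EI) vs log(G') behavior amounts to a power-law relation, which can be recovered with an ordinary least-squares fit in log-log space; the data points below are synthetic, chosen to follow an exact power law, not the paper's measurements:

```python
# Sketch of the log-log calibration between the Rheolaser elastic index
# (EI) and the rheometer storage modulus G'. Data are SYNTHETIC.
import math

def loglog_fit(x, y):
    """Ordinary least squares on (log10 x, log10 y); returns slope, intercept."""
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(lx, ly))
             / sum((a - mx) ** 2 for a in lx))
    return slope, my - slope * mx

g_prime = [10, 100, 1000, 10000]      # Pa, synthetic rheometer values
ei      = [2e-3, 2e-2, 2e-1, 2.0]     # synthetic EI, exact power law (slope 1)
slope, intercept = loglog_fit(g_prime, ei)
print(round(slope, 3))
```

Once such a calibration is established, EI readings from the optical instrument can stand in for G' in the dilute range where the flat-plate rheometer cannot measure.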
Procedia PDF Downloads 172
7990 Scar Removal Strategy for Fingerprint Using Diffusion
Authors: Mohammad A. U. Khan, Tariq M. Khan, Yinan Kong
Abstract:
Fingerprint image enhancement is one of the most important steps in an automatic fingerprint identification system (AFIS), directly affecting the overall efficiency of the AFIS. Conventional fingerprint enhancement methods such as Gabor and anisotropic filters do fill the gaps in ridge lines, but they fail to tackle scar lines. To deal with this problem, we propose a method for enhancing ridges and valleys affected by scars so that true minutiae points can be extracted accurately. Our results have shown improved performance in terms of enhancement.
Keywords: fingerprint image enhancement, removing noise, coherence, enhanced diffusion
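The enhanced-diffusion idea can be sketched with a single explicit diffusion (heat-equation) step on a toy image; a real scar-removal filter would steer the diffusion tensor along the local ridge orientation (coherence), which is omitted here for brevity:

```python
# Minimal explicit isotropic diffusion step, the building block underlying
# coherence-enhanced diffusion filters. Toy 2D image, not a fingerprint.

def diffuse(img, dt=0.2, steps=1):
    h, w = len(img), len(img[0])
    for _ in range(steps):
        out = [row[:] for row in img]       # borders kept fixed
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                lap = (img[y-1][x] + img[y+1][x] + img[y][x-1]
                       + img[y][x+1] - 4 * img[y][x])   # discrete Laplacian
                out[y][x] = img[y][x] + dt * lap
        img = out
    return img

# A single bright pixel (a break in a ridge, like a scar) is smoothed
# into its neighborhood, closing the gap.
img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 1.0
out = diffuse(img, steps=5)
print(out[2][2] < 1.0 and out[2][1] > 0.0)
```

Replacing the scalar diffusivity with a tensor aligned to the ridge flow turns this into the coherence-enhanced diffusion that can bridge scar lines without blurring across ridges.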
Procedia PDF Downloads 517
7989 Mapping of Alteration Zones in Mineral Rich Belt of South-East Rajasthan Using Remote Sensing Techniques
Authors: Mrinmoy Dhara, Vivek K. Sengar, Shovan L. Chattoraj, Soumiya Bhattacharjee
Abstract:
Remote sensing techniques have emerged as an asset for various geological studies. Satellite images obtained by different sensors contain plenty of information related to the terrain. Digital image processing further helps in customized ways for the prospecting of minerals. In this study, an attempt has been made to map hydrothermally altered zones using multispectral and hyperspectral datasets of South East Rajasthan. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Hyperion (Level 1R) datasets have been processed to generate different Band Ratio Composites (BRCs). For this study, ASTER-derived BRCs were generated to delineate the alteration zones, gossans, abundant clays, and host rocks. ASTER and Hyperion images were further processed to extract mineral end members, and classified mineral maps were produced using the Spectral Angle Mapper (SAM) method. Results were validated against the geological map of the area, which shows positive agreement with the image processing outputs. Thus, this study concludes that band ratios and image processing in combination play a significant role in the demarcation of alteration zones, which may provide pathfinders for mineral prospecting studies.
Keywords: ASTER, Hyperion, band ratios, alteration zones, SAM
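The SAM rule used for the classified mineral maps compares each pixel spectrum with a reference end member by their spectral angle, which is insensitive to overall brightness (illumination); the spectra below are made-up examples, not ASTER or Hyperion data:

```python
# Sketch of the Spectral Angle Mapper (SAM) classification rule:
# the angle between a pixel spectrum and a reference end member.
# Small angle = good match. Spectra here are HYPOTHETICAL 4-band values.
import math

def spectral_angle(pixel, reference):
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    # clamp for floating-point safety before acos
    return math.acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))  # radians

clay_ref = [0.30, 0.45, 0.42, 0.25]   # hypothetical clay end member
pixel_a  = [0.60, 0.90, 0.84, 0.50]   # same spectral shape, brighter pixel
pixel_b  = [0.50, 0.20, 0.30, 0.60]   # different spectral shape
print(spectral_angle(pixel_a, clay_ref) < spectral_angle(pixel_b, clay_ref))
```

Because only the angle matters, `pixel_a` (a brighter but identically shaped spectrum) matches the clay end member perfectly, which is why SAM tolerates variable illumination across a scene.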
Procedia PDF Downloads 279
7988 Gadolinium-Based Polymer Nanostructures as Magnetic Resonance Imaging Contrast Agents
Authors: Franca De Sarno, Alfonso Maria Ponsiglione, Enza Torino
Abstract:
Recent advances in diagnostic imaging technology have significantly contributed to a better understanding of the specific changes associated with disease progression. Among the different imaging modalities, Magnetic Resonance Imaging (MRI) is a noninvasive medical diagnostic technique that, despite its low sensitivity and long acquisition times, can discriminate between healthy and diseased tissues by providing 3D data. In order to improve the enhancement of MRI signals, some imaging exams require intravenous administration of contrast agents (CAs). Recently, emerging research has reported progressive deposition of these drugs, in particular gadolinium-based contrast agents (GBCAs), in the body many years after multiple MRI scans. These findings confirm the need for a biocompatible system able to boost a clinically relevant Gd-chelate. To this aim, several approaches based on engineered nanostructures have been proposed to overcome the common limitations of conventional CAs, such as insufficient signal-to-noise ratios due to relaxivity and a poor safety profile. In particular, nanocarriers labeled or loaded with CAs and capable of carrying high payloads of CAs have been developed. Currently, there is no comprehensive understanding of the thermodynamic contributions that make it possible to boost the efficacy of conventional CAs with a biopolymer matrix. Thus, considering the importance of MRI in diagnosing diseases, a successful example of the next generation of these drugs is reported here: a commercial gadolinium chelate incorporated into a biopolymer nanostructure, formed by cross-linked hyaluronic acid (HA), with improved relaxation properties. In addition, the basic principles governing biopolymer-CA interactions are highlighted from the perspective of their influence on the relaxometric properties of the CA, adopting a multidisciplinary experimental approach.
On the basis of these findings, it is clear that the main point consists in increasing the rigidification of readily available Gd-CAs within the biopolymer matrix by controlling the water dynamics, the physicochemical interactions, and the polymer conformations. In the end, the knowledge acquired about polymer-CA systems was applied to develop Gd-based HA nanoparticles with enhanced relaxometric properties.
Keywords: biopolymers, MRI, nanoparticles, contrast agent
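The relaxometric boost can be illustrated with the standard linear relaxivity relation R1 = R1,0 + r1·[Gd]; the relaxivity values below are assumptions chosen for illustration, not measurements from this work:

```python
# Sketch of the linear relaxivity relation used to compare a free
# Gd-chelate with a polymer-rigidified one. The r1 values are ASSUMED:
# binding to a slowly tumbling HA matrix typically raises relaxivity.

def r1_rate(r1_baseline_s, relaxivity_per_mM_s, conc_mM):
    """Longitudinal relaxation rate R1 = R1,0 + r1 * [Gd], in 1/s."""
    return r1_baseline_s + relaxivity_per_mM_s * conc_mM

conc = 0.5                                  # mM Gd, illustrative dose
free_chelate  = r1_rate(0.4, 4.0, conc)     # assumed r1 of free chelate
polymer_bound = r1_rate(0.4, 12.0, conc)    # assumed boosted r1 when HA-bound
print(polymer_bound / free_chelate)
```

The ratio of the two R1 values shows how the same administered Gd concentration yields stronger contrast when the chelate's rotational motion is restricted by the polymer matrix.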
Procedia PDF Downloads 149
7987 Vehicular Speed Detection Camera System Using Video Stream
Authors: C. A. Anser Pasha
Abstract:
In this paper, a new Vehicular Speed Detection Camera System (SDCS) is presented that is applicable as an alternative to traditional radars with the same accuracy or even better. The real-time measurement and analysis of various traffic parameters, such as speed and number of vehicles, are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing techniques have been applied to detect multiple vehicles and track them. The SDCS process can be divided into three successive phases. The first phase is object detection, which uses a hybrid algorithm combining adaptive background subtraction with a three-frame differencing algorithm, rectifying the major drawback of using adaptive background subtraction alone. The second phase is object tracking, which consists of three successive operations: object segmentation, object labeling, and object center extraction. The tracking operation takes into consideration the different possible scenarios of a moving object: simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving as another enters the scene. The third phase is speed calculation, in which speed is computed from the number of frames the object consumes to pass through the scene.
Keywords: radar, image processing, detection, tracking, segmentation
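The third phase reduces to a distance-over-time computation once the scene length and frame rate are known; both values in the sketch below are assumptions for illustration, not the system's actual calibration:

```python
# Sketch of the SDCS speed-calculation phase: speed from the number of
# frames an object needs to cross a scene of known length.
# Scene length and frame rate are ASSUMED values.

def speed_kmh(scene_length_m, frames_consumed, fps=30):
    seconds = frames_consumed / fps
    return (scene_length_m / seconds) * 3.6   # m/s -> km/h

# A car crossing a 20 m camera field of view in 24 frames at 30 fps:
print(round(speed_kmh(20.0, 24), 1))  # 90.0 km/h
```

Note that the measurement granularity is one frame, so at higher speeds (fewer frames consumed) the quantization error grows, which is why a sufficiently high frame rate matters for radar-equivalent accuracy.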
Procedia PDF Downloads 467
7986 The Application of Sensory Integration Techniques in Science Teaching Students with Autism
Authors: Joanna Estkowska
Abstract:
The Sensory Integration Method is aimed primarily at children with learning disabilities. It can also be used as a complementary method in the treatment of children with cerebral palsy, autism, intellectual disabilities, blindness, and deafness. Autism is a holistic developmental disorder that manifests itself in the specific functioning of a child. The most characteristic features are disorders in communication, difficulties in social relations, rigid patterns of behavior, and impairment in sensory processing. In addition to these disorders, abnormal intellectual development, attention deficit disorders, perceptual disorders, and others may occur. This study focused on the application of sensory integration techniques in the science education of autistic students. The lack of proper sensory integration causes problems with complicated processes such as motor coordination, movement planning, visual or auditory perception, speech, writing, reading, or counting. Good functioning and cooperation of the proprioceptive, tactile, and vestibular senses affect the child's mastery of skills that require coordination of both sides of the body and synchronization of the cerebral hemispheres. These include, for example, all sports activities, precise manual skills such as writing, as well as reading and counting skills. All this takes place in stages: achieving the skills of one stage determines the development of abilities at the next level, and any deficit within the first three stages can affect the development of new skills. This is ultimately reflected in achievements at school and in later professional and personal life. After careful analysis, symptoms in the emotional and social spheres appear to be secondary to deficits of sensory integration. During our research, the students gained knowledge and skills in experience-based classes, learning biology, chemistry, and physics with the application of sensory integration techniques.
Sensory integration therapy aims to teach the child an adequate response to stimuli coming both from the outside world and from the body. Thanks to properly selected exercises, a child can improve perception and interpretation skills, motor skills, coordination of movements, attention and concentration, and self-awareness, as well as social and emotional functioning.
Keywords: autism spectrum disorder, science education, sensory integration, special educational needs
Procedia PDF Downloads 184
7985 Identifying the Determinants of the Shariah Non-Compliance Risk via Principal Axis Factoring
Authors: Muhammad Arzim Naim, Saiful Azhar Rosly, Mohamad Sahari Nordin
Abstract:
The objective of this study is to investigate the factors behind the rise of Shariah non-compliance risk, which can cause Islamic banks to succumb to monetary loss. The prior literature has never analyzed this risk in detail, despite much of it arguing over the validity of some Shariah-compliant products. Shariah non-compliance risk in this context refers to the potential failure of a facility to withstand a court test, for example, if a bank brings it to court to seek compensation from defaulted clients. The risk may also arise if customers refuse to make the financing payments on the grounds of the validity of the contracts, for instance, when a critical requirement of an Islamic contract, such as ownership, is relinquished; the bank may then suffer loss when the customer invalidates the contract through the court. The impact of Shariah non-compliance risk on Islamic banks is similar to that of the legal risks faced by conventional banks: both result in monetary losses to the respective banks. In a conventional banking environment, losses can take the form of damages paid to customers if they win their case, which can amount to very large sums. For Islamic banks, however, the subsequent impact can be considerably bigger because it affects their reputation: if customers do not perceive them to be Shariah compliant, they will take their money and bank elsewhere. This paper provides new insights into the risks faced by credit-intensive Islamic banks by extending the knowledge on Shariah non-compliance risk, identifying the individual components that directly affect the risk, together with empirical evidence. Beyond the Islamic banking fraternity, regulators and policymakers should be able to use the findings in this paper to evaluate the components of Shariah non-compliance risk and take the necessary actions.
The paper is written based on Malaysia's Islamic banking practices, which may not apply directly to other jurisdictions. Even though this study focuses on the Bay Bithaman Ajil, popularly known as BBA (i.e., sale with deferred payments), financing modality, its results may be applicable to other Islamic financing vehicles.
Keywords: Islamic banking, Islamic finance, Shariah non-compliance risk, Bay Bithaman Ajil (BBA), principal axis factoring
Procedia PDF Downloads 302
7984 Increased Stability of Rubber-Modified Asphalt Mixtures to Swelling, Expansion and Rebound Effect during Post-Compaction
Authors: Fernando Martinez Soto, Gaetano Di Mino
Abstract:
The application of rubber in bituminous mixtures requires attention and care during mixing and compaction. Rubber modifies the properties of the mixture because it reacts within the internal structure of the bitumen at high temperatures, changing the performance of the mixture (an interaction process between solvents and the binder-rubber aggregate). The main change is an increase in the viscosity and elasticity of the binder due to the larger sizes of the rubber particles in the dry process; however, this positive effect is counteracted by short mixing times, compared to the wet technology, and by the transport processes, curing time, and post-compaction of the mixtures. As a result, negative effects such as swelling of the rubber particles, rebound of the specimens, and thermal changes caused by differential expansion of the structure inside the mixtures can alter the mechanical properties of the rubberized blends. Based on the dry technology, different asphalt-rubber binders using devulcanized or natural rubber (truck and bus tread rubber) have served to demonstrate these effects and how to solve them in two dense-gap-graded rubber-modified asphalt concrete mixes (RUMAC), in order to enhance the stability, workability, and durability of samples compacted by the Superpave gyratory compactor method. This paper specifies the procedures developed in the Department of Civil Engineering of the University of Palermo from September 2016 to March 2017 for characterizing the post-compaction and mix stability of one conventional mixture (hot mix asphalt without rubber) and two gap-graded rubberized asphalt mixes, graded for rail sub-ballast layers with a nominal aggregate size of Ø22.4 mm according to the European standard.
Thus, the main purpose of this laboratory research is the application of ambient ground rubber from scrap tires, processed at conventional temperature (20 ºC), inside hot bituminous mixtures (160-220 ºC) as a substitute for 1.5%, 2%, and 3% by weight of the total aggregates (3.2%, 4.2%, and 6.2%, respectively, by volume of the limestone aggregates, whose bulk density equals 2.81 g/cm³), considered as aggregate rather than as part of the asphalt binder. The reference bituminous mixture was designed with 4% binder and ± 3% air voids, manufactured with a conventional B50/70 bitumen at 160 ºC-145 ºC mixing-compaction temperatures to guarantee the workability of the mixes. The rubber size proportions proposed are 60-40% for the mixtures with 1.5 to 2% rubber and 20-80% for the mixture with 3% rubber (for example, 60% of Ø0.4-2 mm and 40% of Ø2-4 mm). The temperature of the asphalt cement is between 160-180 ºC for mixing and 145-160 ºC for compaction, according to the optimal viscosity values obtained with a Brookfield viscometer and from ring-and-ball and penetration tests. These crumb rubber particles act as a rubber aggregate in the mixture, with sizes between 0.4 mm and 2 mm in the first fraction and 2-4 mm in the second. Ambient ground rubber with a specific gravity of 1.154 g/cm³ is used; the rubber is free of loose fabric, wire, and other contaminants. Optimal results, reducing the swelling effect, were found in real beams and cylindrical specimens of each HMA mixture. The different factors affecting the interaction process, such as temperature, rubber particle sizes, and the number of compaction cycles and pressures, are explained.
Keywords: crumb-rubber, gyratory compactor, rebounding effect, superpave mix-design, swelling, sub-ballast railway
Procedia PDF Downloads 243
7983 Laser Paint Stripping on Large Zones on AA 2024 Based Substrates
Authors: Selen Unaldi, Emmanuel Richaud, Matthieu Gervais, Laurent Berthe
Abstract:
Aircraft are painted with several layers to guarantee protection from external attack. For aluminum AA 2024-T3 (the metallic structural part of the plane), a protective primer is applied to ensure corrosion protection, and on top of this layer the top coat is applied for aesthetic purposes. During the lifetime of an aircraft, top coat stripping plays an essential role and is performed on average every four years. However, since conventional stripping processes create hazardous waste and require long hours of labor, alternative methods have been investigated. Among them, laser stripping appears to be one of the most promising techniques, not only for the reasons mentioned above but also because the process can be controlled and monitored. Applying the laser beam from the coated side produces stripping, but the depth of the process must be well controlled to prevent damage to the substrate and the anticorrosion primer; thermal effects on the painted layers must also be taken into account. As an alternative, we worked on developing a process that uses shock wave propagation to produce stripping via mechanical effects, with the beam applied from the substrate side (back face) of the samples. Laser stripping was applied to thickness-specified samples with a thickness deviation of 10-20%. First, the stripping threshold was determined as a function of power density, defined as the first fly-off of the top coat. After obtaining threshold values, the same power densities were applied to specimens to create large stripped zones with a spot overlap of 10-40%. Layer characteristics were determined on specimens in terms of physicochemical properties and thickness range, both before and after laser stripping, in order to validate the health of the substrate material and the coating properties.
The substrate health is monitored through roughness measurements of the laser-impacted zones and free surface energy tests (both before and after laser stripping). In addition, the Hugoniot Elastic Limit (HEL) of the AA 2024-T3 substrates is determined from VISAR diagnostics (for the back-face surface deformations). The coating properties are investigated in terms of adhesion levels and anticorrosion performance (neutral salt spray test). The influence of the polyurethane top-coat thickness is studied in order to verify the laser stripping process window for industrial aircraft applications.
Keywords: aircraft coatings, laser stripping, laser adhesion tests, epoxy, polyurethane
Procedia PDF Downloads 78