Search results for: helicopter techniques
5866 Evaluation and Assessment of Bioinformatics Methods and Their Applications
Authors: Fatemeh Nokhodchi Bonab
Abstract:
Bioinformatics, in its broad sense, involves the application of computer processes to solve biological problems. A wide range of computational tools is needed to effectively and efficiently process the large amounts of data being generated by recent technological innovations in biology and medicine. A number of computational tools have been developed or adapted to deal with the experimental riches of complex and multivariate data and to support the transition from data collection to information or knowledge. These bioinformatics tools are being evaluated and applied in various medical areas including early detection, risk assessment, classification, and prognosis of cancer. The goal of these efforts is to develop and identify bioinformatics methods with optimal sensitivity, specificity, and predictive capabilities. The recent flood of data from genome sequences and functional genomics has given rise to a new field, bioinformatics, which combines elements of biology and computer science. Bioinformatics conceptualizes biology in terms of macromolecules (in the sense of physical chemistry) and then applies "informatics" techniques (derived from disciplines such as applied maths, computer science, and statistics) to understand and organize the information associated with these molecules on a large scale. Here we propose a definition for this new field and review some of the research that is being pursued, particularly in relation to transcriptional regulatory systems.
Keywords: methods, applications, transcriptional regulatory systems, techniques
Procedia PDF Downloads 127
5865 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing mortality. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation
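A minimal sketch of the contrasting feature-difference idea: features are computed for the lesion and for a surrounding ring of normal tissue, and their difference becomes the descriptor fed to a classifier. The ring width, feature set and classifier below are illustrative assumptions, not the authors' exact configuration:

```python
import numpy as np
from scipy import ndimage
from sklearn.svm import SVC

def region_features(img, mask):
    """Simple intensity/texture features of a masked region."""
    vals = img[mask].astype(float)
    grad = ndimage.gaussian_gradient_magnitude(img.astype(float), sigma=1)[mask]
    return np.array([vals.mean(), vals.std(), grad.mean(), grad.std()])

def contrast_descriptor(img, lesion_mask, ring_px=10):
    """Descriptor = features(lesion) - features(surrounding normal tissue)."""
    lesion = lesion_mask.astype(bool)
    ring = ndimage.binary_dilation(lesion, iterations=ring_px) & ~lesion
    return region_features(img, lesion) - region_features(img, ring)

# Training would use descriptors from annotated scans (hypothetical here):
# X = np.array([contrast_descriptor(im, m) for im, m in annotated_lesions])
# clf = SVC().fit(X, labels)        # labels: 0 = benign, 1 = malignant
```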
Procedia PDF Downloads 325
5864 Detecting Earnings Management via Statistical and Neural Networks Techniques
Authors: Mohammad Namazi, Mohammad Sadeghzadeh Maharluie
Abstract:
Predicting earnings management is vital for capital market participants, financial analysts and managers. The aim of this research is to respond to this query: Is there a significant difference between the regression model and neural network models in predicting earnings management, and which one leads to a superior prediction of it? In approaching this question, a Linear Regression (LR) model was compared with two neural networks: a Multi-Layer Perceptron (MLP) and a Generalized Regression Neural Network (GRNN). The population of this study includes 94 listed companies in the Tehran Stock Exchange (TSE) market from 2003 to 2011. After the results of all models were acquired, ANOVA was applied to test the hypotheses. In general, the summary of statistical results showed that the precision of GRNN did not exhibit a significant difference in comparison with MLP. In addition, the mean square errors of the MLP and GRNN showed a significant difference from the multivariable LR model. These findings support the notion of nonlinear behavior of earnings management. Therefore, it is more appropriate for capital market participants to analyze earnings management based upon neural network techniques, and not to adopt linear regression models.
Keywords: earnings management, generalized linear regression, neural networks multi-layer perceptron, Tehran stock exchange
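A sketch of the three-way model comparison. GRNN has no off-the-shelf scikit-learn implementation, so it is written out directly as Nadaraya-Watson kernel regression, which is the computation a GRNN performs; the synthetic data, network size and smoothing parameter are illustrative stand-ins for the study's accrual dataset:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def grnn_predict(Xtr, ytr, Xte, sigma=0.5):
    """GRNN = Nadaraya-Watson kernel regression with a Gaussian kernel."""
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    return (K @ ytr) / K.sum(axis=1)

# Synthetic, nonlinear stand-in for firm-level data (94 firms, 5 predictors)
rng = np.random.default_rng(0)
X = rng.normal(size=(94, 5))
y = np.tanh(X[:, 0]) + 0.3 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=94)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for name, pred in [
    ("LR",   LinearRegression().fit(Xtr, ytr).predict(Xte)),
    ("MLP",  MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                          random_state=0).fit(Xtr, ytr).predict(Xte)),
    ("GRNN", grnn_predict(Xtr, ytr, Xte))]:
    print(name, mean_squared_error(yte, pred))   # nonlinear models win here
```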
Procedia PDF Downloads 421
5863 Performance Enhancement of Autopart Manufacturing Industry Using Lean Manufacturing Strategies: A Case Study
Authors: Raman Kumar, Jasgurpreet Singh Chohan, Chander Shekhar Verma
Abstract:
Today, manufacturing industries respond rapidly to new demands and compete in a continuously changing environment, thus seeking out new methods that allow them to remain competitive and flexible at the same time. The aim of manufacturing organizations is to reduce manufacturing costs and wastes through system simplification, organizational potential, and proper infrastructural planning by using modern techniques like lean manufacturing. In India, a large number of medium and large scale manufacturing industries have successfully implemented lean manufacturing techniques. Keeping in view the above-mentioned facts, different tools are involved in the successful implementation of the lean approach. The present work is focused on the auto part manufacturing industry, with the aim of improving the performance of a recliner assembly line. A number of lean manufacturing tools are available, but experience and complete knowledge of the manufacturing processes are required to select an appropriate tool for a specific process. Fishbone diagrams have been drawn to identify the root causes of different wastes (scrap, inventory, and waiting). The effect of cycle time reduction on scrap and inventory is analyzed thoroughly in the case company. Results have shown a decrease in inventory cost of 7 percent after the successful implementation of the lean tool.
Keywords: lean tool, fish-bone diagram, cycle time reduction, case study
Procedia PDF Downloads 127
5862 Lip Localization Technique for Myanmar Consonants Recognition Based on Lip Movements
Authors: Thein Thein, Kalyar Myo San
Abstract:
A lip reading system is one of several supportive technologies for hearing-impaired people, elderly people, and non-native speakers. For normal-hearing persons in noisy environments, or in conditions where the audio signal is not available, lip reading techniques can be used to increase their understanding of spoken language. Hearing-impaired persons have used lip reading techniques as important tools to find out what was said by other people without hearing their voices. Thus, visual speech information is important and has become an active research area. Using visual information from lip movements can improve the accuracy and robustness of a speech recognition system, and the need for lip reading systems is ever increasing for every language. However, the recognition of lip movement is a difficult task because the region of interest (ROI) is nonlinear and noisy. Therefore, this paper proposes a method to detect the accurate lip shape and to localize lip movement for automatic lip tracking by using a combination of the Otsu global thresholding technique and the Moore neighborhood tracing algorithm. The proposed method achieves accurate lip localization and tracking, which is useful for speech recognition. In this work, experiments are carried out to automatically localize the lip shape for Myanmar consonants using only visual information from lip movements, which is useful for visual speech recognition of the Myanmar language.
Keywords: lip reading, lip localization, lip tracking, Moore neighborhood tracing algorithm
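A sketch of the proposed two-step pipeline in Python with OpenCV: Otsu's global threshold binarizes the mouth region of interest, and Moore neighborhood tracing follows the boundary of the lip blob. The input file is hypothetical, and this compact tracer uses the simple return-to-start stopping criterion:

```python
import numpy as np
import cv2

def moore_trace(mask):
    """Trace the outer boundary of the foreground region in a binary image
    using Moore neighborhood tracing. Returns a list of (row, col) points."""
    H, W = mask.shape
    # clockwise Moore neighborhood, starting from the west neighbor
    offs = [(0, -1), (-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1)]
    rs, cs = np.nonzero(mask)
    if len(rs) == 0:
        return []
    first = np.lexsort((cs, rs))[0]          # first foreground pixel in raster order
    start = (int(rs[first]), int(cs[first]))
    boundary, cur = [start], start
    back = (start[0], start[1] - 1)          # entered from the west (background)
    for _ in range(4 * H * W):               # safety bound
        i = offs.index((back[0] - cur[0], back[1] - cur[1]))
        for k in range(1, 9):                # sweep clockwise from the backtrack
            dr, dc = offs[(i + k) % 8]
            nb = (cur[0] + dr, cur[1] + dc)
            if 0 <= nb[0] < H and 0 <= nb[1] < W and mask[nb]:
                back = (cur[0] + offs[(i + k - 1) % 8][0],
                        cur[1] + offs[(i + k - 1) % 8][1])
                cur = nb
                break
        else:
            break                            # isolated single pixel
        if cur == start:                     # back at the start: contour closed
            break
        boundary.append(cur)
    return boundary

roi = cv2.imread("mouth_roi.png", cv2.IMREAD_GRAYSCALE)   # hypothetical lip ROI
_, binary = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
lip_contour = moore_trace(binary)
```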
Procedia PDF Downloads 352
5861 Score to Screen: A Study of Emotional and Dramatic Elevation in Films Through Mychael Danna’s Scores
Authors: Namrata Hangala
Abstract:
This paper dives into the powerful intersection between film music and storytelling and how it elevates the visuals, focusing primarily on Mychael Danna’s compositions. Danna, an Academy Award-winning composer, is known for his brilliant ability to mix non-Western and culturally rich instruments with minimalist techniques. This unique approach forms the backbone of the analysis here. We take a close look at key scenes from films like Life of Pi, Moneyball, The Good Dinosaur, and Little Miss Sunshine, where Danna’s music plays a crucial role in shaping the story. By breaking down how these scores impact the scenes emotionally and dramatically, we can see how his music becomes part of the narrative itself. The paper blends different approaches to get to the heart of this: scene-by-scene breakdowns, music theory, audience surveys, and even insights directly from Danna. It discusses how his scores deepen the emotional connection and give more weight to the visual storytelling. The research also dives into the use of leitmotifs, cultural authenticity, and how his music can psychologically impact the viewer, making the story even more powerful. This study reveals how film music, especially Danna’s, doesn’t just sit in the background. It’s often the driving force behind the emotional and narrative core of the film, anchoring the visuals and shaping the way viewers experience the story.
Keywords: ethnomusicology, psychological impact, film scores, cultural music, compositional techniques, emotional storytelling
Procedia PDF Downloads 11
5860 Reinforcement Learning Optimization: Unraveling Trends and Advancements in Metaheuristic Algorithms
Authors: Rahul Paul, Kedar Nath Das
Abstract:
The field of machine learning (ML) is experiencing rapid development, resulting in a multitude of theoretical advancements and extensive practical implementations across various disciplines. The objective of ML is to facilitate the ability of machines to perform cognitive tasks by leveraging knowledge gained from prior experiences and effectively addressing complex problems, even in situations that deviate from previously encountered instances. Reinforcement Learning (RL) has emerged as a prominent subfield within ML and has recently gained considerable attention from researchers. This surge in interest can be attributed to the practical applications of RL, the increasing availability of data, and the rapid advancements in computing power. At the same time, optimization algorithms play a pivotal role in the field of ML and have attracted considerable interest from researchers. A multitude of proposals have been put forth to address optimization problems or improve optimization techniques within the domain of ML. A thorough examination and implementation of optimization algorithms within the context of ML is of utmost importance in order to guide the advancement of research in both optimization and ML. This article provides a comprehensive overview of the application of metaheuristic evolutionary optimization algorithms in conjunction with RL to address a diverse range of scientific challenges. Furthermore, this article delves into the various challenges and unresolved issues pertaining to the optimization of RL models.
Keywords: machine learning, reinforcement learning, loss function, evolutionary optimization techniques
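To make the pairing concrete, here is a minimal sketch of one common combination: a metaheuristic (a simple (mu + lambda) evolution strategy) searching directly over the parameters of an RL policy, so no gradient of the reward is needed. The toy dynamics, population size and mutation scale are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def episode_return(theta, steps=50):
    """Toy episodic task: a linear policy tries to hold a 1-D state near zero."""
    s, total = 1.0, 0.0
    for _ in range(steps):
        a = float(np.clip(theta[0] * s + theta[1], -1, 1))   # policy action
        s = 0.9 * s + 0.1 * a + rng.normal(scale=0.01)       # toy dynamics
        total += -abs(s)                                     # reward: stay near 0
    return total

# (mu + lambda) evolution strategy over policy parameters, gradient-free
pop = rng.normal(size=(20, 2))
for gen in range(50):
    fitness = np.array([episode_return(t) for t in pop])
    elites = pop[np.argsort(fitness)[-5:]]                   # mu = 5 survivors
    offspring = elites[rng.integers(5, size=15)] + rng.normal(scale=0.1, size=(15, 2))
    pop = np.vstack([elites, offspring])                     # lambda = 15 children

best = pop[np.argmax([episode_return(t) for t in pop])]
print("evolved policy parameters:", best)
```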
Procedia PDF Downloads 75
5859 Crowdsourced Economic Valuation of the Recreational Benefits of Constructed Wetlands
Authors: Andrea Ghermandi
Abstract:
Constructed wetlands have long been recognized as sources of ancillary benefits such as support for recreational activities. To date, there is a lack of quantitative understanding of the extent and welfare impact of such benefits. Here, it is shown how geotagged, passively crowdsourced data from online social networks (e.g., Flickr and Panoramio) and Geographic Information Systems (GIS) techniques can: (1) be used to infer annual recreational visits to 273 engineered wetlands worldwide; and (2) be integrated with non-market economic valuation techniques (e.g., travel cost method) to infer the monetary value of recreation in these systems. Counts of social media photo-user-days are highly correlated with the number of observed visits in 62 engineered wetlands worldwide (Pearson’s r = 0.811; p-value < 0.001). The estimated, mean willingness to pay for access to 115 wetlands ranges between $5.3 and $374. In 50% of the investigated wetlands providing polishing treatment to advanced municipal wastewater, the present value of such benefits exceeds that of the capital, operation and maintenance costs (lifetime = 45 years; discount rate = 6%), indicating that such systems are sources of net societal benefits even before factoring in benefits derived from water quality improvement and storage. Based on the above results, it is argued that recreational benefits should be taken into account in the design and management of constructed wetlands, as well as when such green infrastructure systems are compared with conventional wastewater treatment solutions.
Keywords: constructed wetlands, cultural ecosystem services, ecological engineering, social media
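The cost-benefit comparison can be reproduced with a standard present-value (ordinary annuity) calculation using the stated lifetime of 45 years and discount rate of 6%; the visit count and willingness-to-pay below are hypothetical, chosen only to illustrate the arithmetic:

```python
def present_value(annual_benefit, years=45, rate=0.06):
    """Present value of a constant annual benefit stream (ordinary annuity)."""
    return annual_benefit * (1 - (1 + rate) ** -years) / rate

# Hypothetical wetland: 10,000 annual visits at a mean WTP of $20 per visit
pv_recreation = present_value(10_000 * 20)   # ~ $3.09 million
lifetime_costs = 2_500_000                   # hypothetical capital + O&M costs
print(pv_recreation, pv_recreation > lifetime_costs)
```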
Procedia PDF Downloads 129
5858 The Type II Immune Response in Acute and Chronic Pancreatitis Mediated by STAT6 in Murine
Authors: Hager Elsheikh
Abstract:
Context: Pancreatitis is a condition characterized by inflammation in the pancreas, which can lead to serious complications if untreated. Both acute and chronic pancreatitis are associated with immune reactions and fibrosis, which further damage the pancreas. The type 2 immune response, primarily driven by alternatively activated macrophages (AAMs), plays a significant role in the development of fibrosis. The IL-4/STAT6 pathway is a crucial signaling pathway for the activation of M2 macrophages. Pancreatic fibrosis is induced by dysregulated inflammatory responses and can result in the autodigestion and necrosis of pancreatic acinar cells. Research Aim: The aim of this study is to investigate the impact of STAT6, a crucial molecule in the IL-4/STAT6 pathway, on the severity and development of fibrosis during acute and chronic pancreatitis. The research also aims to understand the influence of the JAK/STAT6 signaling pathway on the balance between fibrosis and regeneration in the presence of different macrophage populations. Methodology: The research utilizes murine models of acute and chronic pancreatitis induced by caerulein injection. Animal models will be employed to study the effect of STAT6 knockout on disease severity and fibrosis. Isolation of acinar cells and cell culture techniques will be used to assess the impact of different macrophage populations on wound healing and regeneration. Various techniques such as PCR, histology, immunofluorescence, and transcriptomics will be employed to analyze the tissues and cells. Findings: The research aims to provide insights into the mechanisms underlying tissue fibrosis and wound healing during acute and chronic pancreatitis. By investigating the influence of the JAK/STAT6 signaling pathway and different macrophage populations, the study aims to understand their impact on tissue fibrosis, disease severity, and pancreatic regeneration. Theoretical Importance: This research contributes to our understanding of the role of specific signaling pathways, macrophage polarization, and the type 2 immune response in pancreatitis. It provides insights into the molecular mechanisms underlying tissue fibrosis and the potential for targeted therapies. Data Collection and Analysis Procedures: Data will be collected through the use of murine models, isolation and culture of acinar cells, and various experimental techniques such as PCR, histology, immunofluorescence, and transcriptomics. Data will be analyzed using appropriate statistical methods and techniques, and the findings will be interpreted in the context of the research objectives. Conclusion: By investigating the mechanisms of tissue fibrosis and wound healing during acute and chronic pancreatitis, this research aims to enhance our understanding of the disease progression and potential therapeutic targets. The findings have theoretical importance in expanding our knowledge of pancreatic fibrosis and the role of macrophage polarization in the context of the type 2 immune response.
Keywords: immunity in chronic diseases, pancreatitis, macrophages, immune response
Procedia PDF Downloads 34
5857 A Risk Management Framework for Selling a Mega Power Plant Project in a New Market
Authors: Negar Ganjouhaghighi, Amirali Dolatshahi
Abstract:
The origin of most risks of a mega project usually lies in the phases before closing the contract. From a practical point of view, using project risk management techniques to prepare a proposal is not a complete solution for managing the risks of a contract. The objective of this paper is to cover all those activities associated with risk management of a mega project's sales process, from entering a new market, through award activities, to the review of contract performance. In this study, risk management happens in six consecutive steps that are divided into three distinct but interdependent phases upstream of the award of the contract: pre-tendering, tendering and closing. In the first step, by preparing a standard market risk report, the risks of the new market are identified. The next step is the bid/no-bid decision, made on the basis of the previously gathered data. During the next three steps, in the tendering phase, project risk management techniques are applied to determine how much contingency reserve must be added to, or removed from, the estimated cost in order to bring the residual risk to an acceptable level. Finally, the last step, which happens in the closing phase, is an overview of the project risks and a final clarification of residual risks. The sales experience of more than 20,000 MW of turn-key power plant projects, alongside this framework, is used to develop software that assists the sales team in better project risk management.
Keywords: project marketing, risk management, tendering, project management, turn-key projects
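One standard way to compute the contingency reserve mentioned in the tendering phase is the expected monetary value (EMV) technique; the risk register below is hypothetical, used only to illustrate the calculation (the framework's actual method and figures are not given in the abstract):

```python
# Hypothetical tender risk register: (risk, probability, cost impact in k$)
risks = [("currency fluctuation",  0.30, 4000),
         ("local content penalty", 0.15, 2500),
         ("logistics delay",       0.40, 1200)]

# Contingency reserve as the sum of expected monetary values (EMV = p * impact)
contingency = sum(p * impact for _, p, impact in risks)
print(f"Contingency reserve to add to the estimated cost: {contingency:.0f} k$")
# -> 0.30*4000 + 0.15*2500 + 0.40*1200 = 2055 k$
```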
Procedia PDF Downloads 329
5856 Knowledge Management in the Interactive Portal for Decision Makers on InKOM Example
Authors: K. Marciniak, M. Owoc
Abstract:
Managers, as decision-makers present in different sectors, should be supported in an efficient and increasingly sophisticated way. There is a huge number of software tools developed for such users, ranging from simple tools registering data from the business area (typical for the operational level of management) up to intelligent techniques delivering knowledge (for the tactical and strategic levels of management). There is a big challenge for software developers to create intelligent management dashboards that support different decisions. More advanced solutions even include an option for selecting the intelligent techniques useful to managers in a particular decision-making phase in order to deliver a valid knowledge base. Such a tool (called Intelligent Dashboard for SME Managers, InKOM) is prepared within the Business Intelligence framework of Teta products. The aim of the paper is to present the solutions assumed for InKOM concerning the management of stored knowledge bases offered to business managers. The paper is organized as follows. After a short introduction establishing the research context of supporting managers via information systems, the InKOM platform is presented. In the crucial part of the paper, the process of knowledge transformation and validation is demonstrated. We focus on potential and real ways of acquiring, storing and validating knowledge bases. This allows for the formulation of conclusions interesting from a knowledge engineering point of view.
Keywords: business intelligence, decision support systems, knowledge management, knowledge transformation, knowledge validation, managerial systems
Procedia PDF Downloads 513
5855 Performance Evaluation of Wideband Code Division Multiplication Network
Authors: Osama Abdallah Mohammed Enan, Amin Babiker A/Nabi Mustafa
Abstract:
The aim of this study is to evaluate and analyze different parameters of WCDMA (Wideband Code Division Multiple Access). Moreover, this study also incorporates a brief yet thorough analysis of WCDMA's components as well as its internal architecture. This study also examines different power controls, including open loop power control, closed (inner) loop power control, and outer loop power control. Different handover techniques or methods of WCDMA are also illustrated in this study, including hard handover, inter-system handover, and soft and softer handover. Different duplexing techniques are also described in the paper. This study also presents an idea of the different parameters of WCDMA that lead the system towards QoS issues, which may help the operator in designing and developing an adequate network configuration. In addition to this, the study has also investigated various parameters including Bit Energy per Noise Spectral Density (Eb/No), noise rise, and Bit Error Rate (BER). After simulating these parameters in the MATLAB environment, it was found that, for a given Eb/No value, the system capacity increases as the reuse factor increases. Besides that, it was also found that noise rise decreases for lower data rates and for lower interference levels. Finally, it was observed that BER differs between modulation techniques, increasing when one type of modulation technique is used rather than another.
Keywords: duplexing, handover, loop power control, WCDMA
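The study's simulations were run in MATLAB; the short Python sketch below reproduces two of the textbook relationships involved, the theoretical BPSK bit error rate over AWGN and the WCDMA uplink noise rise as a function of cell load:

```python
import numpy as np
from scipy.special import erfc

def ber_bpsk(ebno_db):
    """Theoretical BER of coherent BPSK (and QPSK per bit) over AWGN:
    Pb = Q(sqrt(2*Eb/No)) = 0.5*erfc(sqrt(Eb/No))."""
    ebno = 10 ** (np.asarray(ebno_db, float) / 10)
    return 0.5 * erfc(np.sqrt(ebno))

def noise_rise_db(load):
    """WCDMA uplink noise rise over thermal noise: NR = -10*log10(1 - load)."""
    return -10 * np.log10(1 - np.asarray(load, float))

for ebno in (0, 4, 8, 12):
    print(f"Eb/No = {ebno:2d} dB -> BER = {ber_bpsk(ebno):.2e}")
print("noise rise at 50% / 75% load:",
      noise_rise_db(0.5), noise_rise_db(0.75), "dB")   # ~3.0 dB and ~6.0 dB
```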
Procedia PDF Downloads 215
5854 Effect of Weed Control and Different Plant Densities on the Yield and Quality of Safflower (Carthamus tinctorius L.)
Authors: Hasan Dalgic, Fikret Akinerdem
Abstract:
This trial was conducted to determine the effect of different plant densities and weed control treatments on the yield and quality of winter-sown safflower (Carthamus tinctorius L.) in the trial fields of the Agricultural Faculty, Selcuk University; the active substance trifluralin was used as the herbicide. The field trial was carried out during the 2009-2010 vegetation period with three replications according to a 'Split Plots in Randomized Blocks' design. The weed control techniques were assigned to the main plots and the row distances were set up on the sub-plots. The trial treatments consisted of three weed control techniques, as follows: herbicide application (trifluralin), hoeing, and an untreated control, beside row distances of 15 cm and 30 cm. The results ranged between 59.0-76.73 cm in plant height, 40.00-47.07 cm in first branch height, 5.00-7.20 in number of branches per plant, 6.00-14.73 in number of heads per plant, 19.57-21.87 mm in head diameter, 2125.0-3968.3 kg ha-1 in seed yield, 27.10-28.08% in crude oil rate, and 531.7-1070.3 kg ha-1 in crude oil yield. According to the results, the Remzibey safflower cultivar showed the highest seed yield at 30 cm row distance with herbicide application, through the direct effects of plant height, first branch height, number of branches per plant, number of heads per plant, head diameter, crude oil rate, and crude oil yield.
Keywords: safflower, herbicide, row spacing, seed yield, oil ratio, oil yield
Procedia PDF Downloads 333
5853 Evolving Convolutional Filter Using Genetic Algorithm for Image Classification
Authors: Rujia Chen, Ajit Narayanan
Abstract:
Convolutional neural networks (CNNs), as typically applied in deep learning, use layer-wise backpropagation (BP) to construct filters and kernels for feature extraction. Such filters are 2D or 3D groups of weights for constructing feature maps at subsequent layers of the CNN and are shared across the entire input. BP, as a gradient descent algorithm, has well-known problems of getting stuck at local optima. The use of genetic algorithms (GAs) for evolving weights between layers of standard artificial neural networks (ANNs) is a well-established area of neuroevolution. In particular, the use of crossover techniques when optimizing weights can help to overcome problems of local optima. However, the application of GAs for evolving the weights of filters and kernels in CNNs is not yet an established area of neuroevolution. In this paper, a GA-based filter development algorithm is proposed. The results of the proof-of-concept experiments described in this paper show that the proposed GA can find filter weights through evolutionary techniques rather than BP learning. For some simple classification tasks like geometric shape recognition, the proposed algorithm can achieve 100% accuracy. The results for MNIST classification, while not as good as those achievable through standard filter learning via BP, show that filter and kernel evolution warrants further investigation as a new subarea of neuroevolution for deep architectures.
Keywords: neuroevolution, convolutional neural network, genetic algorithm, filters, kernels
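A proof-of-concept sketch in the spirit of the paper: a GA with one-point crossover and Gaussian mutation evolves a 3x3 convolution filter to minimize the error against a target response, with no backpropagation. The toy fitness (recovering a Sobel filter), population size and mutation rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, k):
    """Valid 2-D convolution of img with a 3x3 kernel (no padding)."""
    H, W = img.shape
    out = np.zeros((H - 2, W - 2))
    for i in range(H - 2):
        for j in range(W - 2):
            out[i, j] = np.sum(img[i:i+3, j:j+3] * k)
    return out

# Toy task: recover a vertical-edge (Sobel) filter from input/target pairs
img = rng.random((16, 16))
target = conv2d(img, np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float))

def fitness(genome):
    return -np.mean((conv2d(img, genome.reshape(3, 3)) - target) ** 2)

pop = rng.normal(size=(30, 9))                       # 30 candidate filters
for gen in range(200):
    scores = np.array([fitness(g) for g in pop])
    parents = pop[np.argsort(scores)[-10:]]          # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = parents[rng.integers(10, size=2)]
        cut = rng.integers(1, 9)                     # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child += rng.normal(scale=0.1, size=9) * (rng.random(9) < 0.2)  # mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(g) for g in pop])].reshape(3, 3)
print(best.round(2))                                 # approaches the Sobel kernel
```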
Procedia PDF Downloads 186
5852 Development of in vitro Fertilization and Emerging Legal Issues
Authors: Malik Imtiaz Ahmad
Abstract:
The development of In Vitro Fertilization (IVF) has revolutionized the field of reproductive medicine, offering hope to myriad individuals and couples facing infertility issues. IVF, a process involving the fertilization of eggs with sperm outside the body, has evolved over decades from an experimental procedure to a mainstream medical practice. The study sought to understand the evolution of IVF from its early stages to its present status as a groundbreaking fertility treatment. It also aimed to analyze the legal complexities surrounding IVF, including issues like embryo ownership, surrogacy agreements, and custody disputes. This research takes a multidisciplinary approach involving both the medical and legal fields. It explores the historical evolution of IVF, its techniques, and the legal challenges it has given rise to in modern times concerning family law, health law, and privacy policies. This research aimed to provide insights into the intersection of medical technology and the law, offering valuable knowledge for policymakers, legal experts, and individuals involved in IVF. The study utilized various methods, including a thorough literature review, a historical analysis of IVF's evolution, an examination of legal cases, and a review of emerging regulations. These approaches aimed to provide a comprehensive understanding of IVF and its modern legal issues, facilitating a holistic exploration of the subject matter.
Keywords: in vitro fertilization development, IVF techniques evolution, legal issues in IVF, IVF legal frameworks, ethical dilemmas in IVF
Procedia PDF Downloads 35
5851 Application of GIS Techniques for Analysing Urban Built-Up Growth of Class-I Indian Cities: A Case Study of Surat
Authors: Purba Biswas, Priyanka Dey
Abstract:
Worldwide, rapid urbanisation has accelerated city expansion in both developed and developing nations. This unprecedented urbanisation trend, due to increasing population and economic growth, has caused challenges for decision-makers in city planning and urban management. Metropolitan cities, class-I towns, and major urban centres undergo a continuous process of evolution due to the interaction between socio-cultural and economic attributes. This constant evolution leads to urban expansion in all directions. Understanding the patterns and dynamics of urban built-up growth is crucial for policymakers, urban planners, and researchers, as it aids in resource management, decision-making, and the development of sustainable strategies to address the complexities associated with rapid urbanisation. Identifying spatio-temporal patterns of urban growth has emerged as a crucial challenge in monitoring and assessing present and future trends in urban development. Analysing urban growth patterns and tracking changes in land use is an important aspect of urban studies. This study analyses spatio-temporal urban transformations and land-use and land-cover changes using remote sensing and GIS techniques. Built-up growth analysis has been done for the city of Surat as a case example, using the GIS tool of NDBI and the GIS models of the Built-up Urban Density Index and the Shannon Entropy Index to identify trends and the geographical direction of transformation from 2005 to 2020. Surat is one of the fastest-growing urban centres in both the state and the nation, ranking as the 4th fastest-growing city globally. This study analyses the dynamics of urban built-up area transformations both zone-wise and geographical-direction-wise, for which the trend, rate, and magnitude were calculated over the period of 15 years. This study also highlights the need for analysing and monitoring the urban growth patterns of class-I cities in India using spatio-temporal and quantitative techniques like GIS for improved urban management.
Keywords: urban expansion, built-up, geographic information system, remote sensing, Shannon’s entropy
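Two of the named tools are simple to state precisely. NDBI contrasts the SWIR and NIR bands, and relative Shannon entropy measures how evenly built-up area is spread across zones (values near 1 indicate dispersed, sprawling growth). A sketch with hypothetical sector areas:

```python
import numpy as np

def ndbi(swir, nir):
    """Normalized Difference Built-up Index: (SWIR - NIR) / (SWIR + NIR)."""
    swir, nir = swir.astype(float), nir.astype(float)
    return (swir - nir) / (swir + nir + 1e-9)

def shannon_entropy(builtup_by_zone):
    """Relative Shannon entropy of built-up shares across n zones.
    Near 1: dispersed (sprawling) growth; near 0: compact growth."""
    x = np.asarray(builtup_by_zone, float)
    p = x / x.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum() / np.log(len(x))

# Hypothetical built-up area (km^2) in 8 directional sectors of the city
print(shannon_entropy([12.1, 8.4, 15.0, 6.2, 9.8, 11.3, 7.7, 10.5]))
```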
Procedia PDF Downloads 72
5850 Automatic Motion Trajectory Analysis for Dual Human Interaction Using Video Sequences
Authors: Yuan-Hsiang Chang, Pin-Chi Lin, Li-Der Jeng
Abstract:
Advances in image and video processing techniques have enabled the development of intelligent video surveillance systems. This study aimed to automatically detect moving human objects and to analyze events of dual human interaction in a surveillance scene. Our system was developed in four major steps: image preprocessing, human object detection, human object tracking, and motion trajectory analysis. Adaptive background subtraction and image processing techniques were used to detect and track moving human objects. To solve the occlusion problem during interaction, a Kalman filter was used to retain a complete trajectory for each human object. Finally, motion trajectory analysis was developed to distinguish between interaction and non-interaction events based on derivatives of the trajectories related to the speed of the moving objects. Using a database of 60 video sequences, our system achieved a classification accuracy of 80% for interaction events and 95% for non-interaction events, respectively. In summary, we have explored the idea of a system for the automatic classification of interaction and non-interaction events using surveillance cameras. Ultimately, this system could be incorporated in an intelligent surveillance system for the detection and/or classification of abnormal or criminal events (e.g., theft, snatch, fighting, etc.).
Keywords: motion detection, motion tracking, trajectory analysis, video surveillance
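A toy sketch of the trajectory-derivative idea in the last step: speeds are taken as discrete derivatives of the tracked center trajectories, and an interaction is flagged when two objects come close while both slow down. The distance and speed thresholds are illustrative assumptions, not the paper's tuned values:

```python
import numpy as np

def speeds(traj, fps=30.0):
    """Per-frame speed (pixels/s) from an (N, 2) trajectory of object centers."""
    return np.linalg.norm(np.diff(traj, axis=0), axis=1) * fps

def is_interaction(traj_a, traj_b, dist_thresh=50.0, slow_thresh=20.0):
    """Toy rule: objects interact if they come close while both slow down."""
    n = min(len(traj_a), len(traj_b))
    dist = np.linalg.norm(traj_a[:n] - traj_b[:n], axis=1)
    close = dist < dist_thresh
    slow = (speeds(traj_a[:n]) < slow_thresh) & (speeds(traj_b[:n]) < slow_thresh)
    return bool(np.any(close[1:] & slow))   # align: speeds have length n-1
```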
Procedia PDF Downloads 548
5849 Design of an Ensemble Learning Behavior Anomaly Detection Framework
Authors: Abdoulaye Diop, Nahid Emad, Thierry Winter, Mohamed Hilia
Abstract:
Data asset protection is a crucial issue in the cybersecurity field. Companies use logical access control tools to vault their information assets and protect them against external threats, but they lack solutions to counter insider threats. Nowadays, insider threats are the most significant concern of security analysts. They are mainly individuals with legitimate access to companies' information systems who use their rights with malicious intent. In several fields, behavior anomaly detection is the method used by cyber specialists to counter the threats of malicious user activities effectively. In this paper, we present a step toward the construction of a user and entity behavior analysis framework by proposing a behavior anomaly detection model. This model combines machine learning classification techniques and graph-based methods, relying on linear algebra and parallel computing techniques. We show the utility of an ensemble learning approach in this context. We present test results of some detection methods on a representative access control dataset. The use of some of the explored classifiers gives results of up to 99% accuracy.
Keywords: cybersecurity, data protection, access control, insider threat, user behavior analysis, ensemble learning, high performance computing
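A minimal illustration of the ensemble-learning component, using scikit-learn's soft-voting combination of heterogeneous classifiers on a synthetic, imbalanced stand-in for an access-log feature matrix (the framework's real features and graph-based components are not reproduced here):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic, imbalanced stand-in for an access-log feature matrix (5% anomalies)
X, y = make_classification(n_samples=5000, n_features=20,
                           weights=[0.95], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
                ("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier())],
    voting="soft")                      # average predicted probabilities
ensemble.fit(Xtr, ytr)
print(classification_report(yte, ensemble.predict(Xte)))
```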
Procedia PDF Downloads 128
5848 Segmentation of Liver Using Random Forest Classifier
Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir
Abstract:
Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To deeply study and diagnose the liver, segmentation of the liver is done to identify which part of the liver is most affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient variability in liver shape and size. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparison with various other techniques, it was found that the Random Forest classifier provides better segmentation results with respect to accuracy and speed. We have validated our results using various techniques, showing above 89% accuracy in all cases.
Keywords: CT images, image validation, random forest, segmentation
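A compact sketch of pixel-wise liver segmentation with a Random Forest, in Python with scikit-learn. The per-pixel feature set (intensity plus local mean/std) and the synthetic slice stand in for the paper's actual features and annotated CT data:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.ensemble import RandomForestClassifier

def pixel_features(ct_slice):
    """Per-pixel features: intensity plus local mean and std in a 5x5 window."""
    f = ct_slice.astype(float)
    mean = uniform_filter(f, 5)
    sq = uniform_filter(f ** 2, 5)
    std = np.sqrt(np.maximum(sq - mean ** 2, 0))
    return np.stack([f, mean, std], axis=-1).reshape(-1, 3)

# Synthetic stand-in for an annotated CT slice (real data: abdominal CT + mask)
rng = np.random.default_rng(0)
ct = rng.normal(60, 20, (128, 128))
mask = np.zeros((128, 128), int)
mask[30:90, 40:100] = 1            # toy liver region
ct[mask == 1] += 40                # liver rendered brighter than background

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
clf.fit(pixel_features(ct), mask.ravel())       # sketch: trains on one slice
pred = clf.predict(pixel_features(ct)).reshape(128, 128)
print("pixel accuracy:", (pred == mask).mean())
```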
Procedia PDF Downloads 313
5847 Developing Cause-effect Model of Urban Resilience versus Flood in Karaj City using TOPSIS and Shannon Entropy Techniques
Authors: Mohammad Saber Eslamlou, Manouchehr Tabibian, Mahta Mirmoghtadaei
Abstract:
The history of urban development and the increasing complexities of urban life have long been intertwined with different natural and man-made disasters. Sometimes, these unpleasant events have destroyed cities forever. The growth of the urban population and the increase of social and economic resources in cities have increased the importance of developing a holistic approach to dealing with unknown urban disasters. As a result, interest in resilience has increased in most scientific fields, and the urban planning literature has been enriched with studies of the social, economic, infrastructural, and physical abilities of cities. In this regard, different conceptual frameworks and patterns have been developed focusing on dimensions of resilience and different kinds of disasters. As the most frequent and likely natural disaster in Iran is flooding, the present study aims to develop a cause-effect model of urban resilience against floods in Karaj City. In this theoretical study, desk research and documentary studies were used to find the elements and dimensions of urban resilience. In this regard, 6 dimensions and 32 elements were found for urban resilience, and a questionnaire was constructed considering the requirements of the TOPSIS technique (pairwise comparison). The sample of the research consisted of 10 participants who were faculty members, academicians, board members of research centers, managers of the Ministry of Road and Urban Development, board members of New Towns Development Company, and experts and practitioners of consulting companies with scientific and research backgrounds. The data gathered in this survey were analyzed using TOPSIS and Shannon entropy techniques. The results show that the Infrastructural/Physical, Social, Organizational/Institutional, Structural/Physical, Economic, and Environmental dimensions are the most effective factors in urban resilience against floods in Karaj, in that order. Finally, a comprehensive model and a systematic framework of factors that affect the urban resilience of Karaj against floods were developed. This cause-effect model shows how different factors are related and influence each other, based on their connected structure and preferences.
Keywords: urban resilience, TOPSIS, Shannon entropy, cause-effect model of resilience, flood
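The two techniques named in the title compose naturally: Shannon entropy derives objective criteria weights from the decision matrix, and TOPSIS ranks alternatives by closeness to the ideal solution. A sketch with a hypothetical decision matrix (the study's actual survey data are not published in the abstract):

```python
import numpy as np

def entropy_weights(X):
    """Objective criteria weights via Shannon entropy (X: alternatives x criteria)."""
    P = X / X.sum(axis=0)
    E = -(P * np.log(P + 1e-12)).sum(axis=0) / np.log(len(X))
    d = 1 - E                                   # degree of diversification
    return d / d.sum()

def topsis(X, w, benefit):
    """TOPSIS closeness scores; benefit[j]=True if criterion j is maximized."""
    V = w * X / np.linalg.norm(X, axis=0)       # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# Hypothetical scores of 6 resilience dimensions on 4 benefit criteria
X = np.array([[7, 8, 6, 9], [6, 9, 7, 7], [8, 6, 8, 6],
              [7, 7, 9, 8], [5, 8, 7, 7], [6, 6, 6, 8]], float)
scores = topsis(X, entropy_weights(X), benefit=np.array([True] * 4))
print(np.argsort(scores)[::-1])                 # ranking of the dimensions
```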
Procedia PDF Downloads 58
5846 GC and GCxGC-MS Composition of Volatile Compounds from Cuminum cyminum and Carum carvi by Using Techniques Assisted by Microwaves
Authors: F. Benkaci-Ali, R. Mékaoui, G. Scholl, G. Eppe
Abstract:
A new method, accelerated steam distillation assisted by microwave (ASDAM), is a combination of microwave heating and steam distillation performed at atmospheric pressure with a very short extraction time. Isolation and concentration of volatile compounds are performed in a single stage. ASDAM has been compared with ASDAM with cryogrinding of seeds (CG) and with conventional techniques, hydrodistillation assisted by microwave (HDAM) and hydro-distillation (HD), for the extraction of essential oil from aromatic herbs, namely caraway and cumin seeds. The essential oils extracted by ASDAM for 1 min were quantitatively (yield) and qualitatively (aromatic profile) not similar to those obtained by ASDAM-CG (1 min) and HD (3 h). The accelerated microwave extraction with cryogrinding inhibits numerous enzymatic reactions, such as hydrolysis of oils. Microwave radiation constitutes an adequate means for extraction operations from the point of view of yield and high content of major components, and allows considerable minimization of energy consumption, and especially of heating time, which is one of the essential parameters in artifact formation. ASDAM and ASDAM-CG are green techniques, yield an essential oil with higher amounts of the more valuable oxygenated compounds comparable to the biosynthesized compounds, and allow substantial cost savings in terms of time, energy and plant material.
Keywords: microwave, steam distillation, caraway, cumin, cryogrinding, GC-MS, GCxGC-MS
Procedia PDF Downloads 258
5845 A Compact Ultra-Wide Band Antenna with C-Shaped Slot for WLAN Notching
Authors: Maryam Rasool, Farhan Munir, Fahad Nawaz, Saad Ahmad
Abstract:
A patch antenna operating in the ultra-wide band of frequency (3.1 GHz – 10.6 GHz) is designed with enhanced protection from interference from other applications by incorporating a notching technique. Patch antennas in the ultra-wide band are becoming widely popular due to their low power, light weight and high data rate capability. A microstrip patch antenna's patch can be altered to increase its bandwidth and introduce UWB character into it. The designed antenna is a patch antenna consisting of a conductive sheet of metal mounted over a large sheet of metal called the ground plane, with a substrate separating the two. The notched bands are public safety WLAN, WLAN and FSS. Different techniques used to implement UWB antennas were individually implemented and their results were examined. A V-shaped patch was then chosen and modified into an arrow-shaped patch to give optimized results operating over the entire UWB region with considerable return loss. The frequency notch prevents the operation of the antenna over a particular range of frequencies, hence minimizing interference from other systems. There are countless techniques for introducing the notch, but we have used inverted C-shaped slots in the UWB patch to obtain the notch characteristics, as well as wavelength resonators to introduce a notch in the UWB band. The designed antenna is simulated in the High Frequency Structure Simulator (HFSS) 13.0 by Ansoft.
Keywords: HFSS, Notch, UWB, WLAN
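A common first-cut design rule for slot-based band notching (an assumption here, not the paper's reported dimensions) sizes the total slot length at half a guided wavelength at the notch frequency. A quick calculator, assuming an FR4-like substrate:

```python
def notch_slot_length_mm(f_notch_ghz, eps_r=4.4):
    """Half-guided-wavelength slot: L ~ c / (2 * f * sqrt(eps_eff)),
    with the common approximation eps_eff ~ (eps_r + 1) / 2."""
    c = 3e8
    eps_eff = (eps_r + 1) / 2
    return c / (2 * f_notch_ghz * 1e9 * eps_eff ** 0.5) * 1e3

print(notch_slot_length_mm(5.5))   # WLAN notch near 5.5 GHz -> about 16.6 mm
```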
Procedia PDF Downloads 416
5844 Application of Deep Learning Algorithms in Agriculture: Early Detection of Crop Diseases
Authors: Manaranjan Pradhan, Shailaja Grover, U. Dinesh Kumar
Abstract:
The farming community in India, as well as in other parts of the world, is one of the most highly stressed communities due to reasons such as increasing input costs (cost of seeds, fertilizers, pesticides), droughts, and reduced revenue leading to farmer suicides. The lack of an integrated farm advisory system in India adds to farmers' problems. Farmers need the right information during the early stages of a crop's lifecycle to prevent damage and loss in revenue. In this paper, we use deep learning techniques to develop an early warning system for the detection of crop diseases using images taken by farmers with their smartphones. The research work leads to a smart assistant using analytics and big data that could help farmers with early diagnosis of crop diseases and corrective actions. The classical approach to crop disease management has been to identify diseases at the crop level. Recently, ImageNet classification using convolutional neural networks (CNNs) has been successfully used to identify diseases at the individual plant level. Our model uses convolution filters, max pooling, dense layers and dropouts (to avoid overfitting). Models are built for binary classification (healthy or not healthy) and multi-class classification (identifying which disease). Transfer learning is used to modify the weights of parameters learnt on the ImageNet dataset and apply them to crop diseases, which reduces the number of epochs needed for learning. One-shot learning is used to learn from very few images, while data augmentation techniques such as rotation, zoom, shift and blurring are used to improve accuracy with images taken from farms. Models built using a combination of these techniques are more robust for deployment in the real world. Our model is validated using the tomato crop. In India, tomato is affected by 10 different diseases. Our model achieves an accuracy of more than 95% in correctly classifying the diseases. The main contribution of our research is to create a personal assistant for farmers for managing plant disease; although the model was validated using the tomato crop, it can be easily extended to other crops. The advancement of technology in computing and the availability of large data have made possible the success of deep learning applications in computer vision, natural language processing, image recognition, etc. With these robust models and huge smartphone penetration, the feasibility of implementing these models is high, resulting in timely advice to farmers, increasing farmers' income and reducing input costs.
Keywords: analytics in agriculture, CNN, crop disease detection, data augmentation, image recognition, one shot learning, transfer learning
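A sketch of the described recipe in Keras: ImageNet transfer learning with a frozen convolutional base, on-the-fly augmentation (rotation, zoom, shift), global pooling, dropout and a dense softmax head. MobileNetV2 and the dataset objects are illustrative choices; the paper does not name its base network:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Transfer learning: reuse ImageNet weights, retrain only the classifier head
base = tf.keras.applications.MobileNetV2(input_shape=(224, 224, 3),
                                         include_top=False, weights="imagenet")
base.trainable = False                       # freeze the pretrained filters

augment = models.Sequential([                # on-the-fly data augmentation
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
    layers.RandomTranslation(0.1, 0.1),
])

model = models.Sequential([
    augment,
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),                     # regularization against overfitting
    layers.Dense(10, activation="softmax"),  # e.g., 10 tomato disease classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # hypothetical datasets
```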
Procedia PDF Downloads 119
5843 Methodology: A Review in Modelling and Predictability of Embankment in Soft Ground
Authors: Bhim Kumar Dahal
Abstract:
Transportation network development in developing countries is proceeding at a rapid pace. The majority of the networks belong to railways and expressways, which pass through diverse topography, landforms and geological conditions despite the avoidance principle applied during route selection. Construction of such networks demands many low to high embankments, which require improvement of the foundation soil. This paper is mainly focused on the various advanced ground improvement techniques used to improve soft soil, the modelling approaches and their predictability for embankment construction. The ground improvement techniques can be broadly classified into three groups, i.e. the densification group, the drainage and consolidation group and the reinforcement group, which are discussed with some case studies. Various methods have been used in modelling embankments, from simple 1-dimensional to complex 3-dimensional models, using a variety of constitutive models. However, the reliability of the predictions is not found to improve systematically with the level of sophistication, and sometimes the predictions deviate by more than 60% from the monitored values despite using the same level of sophistication. This deviation is found to be mainly due to the selection of the constitutive model, assumptions made during different stages, deviations in the selection of model parameters and simplifications during physical modelling of the ground conditions. This deviation can be reduced by using an optimization process, optimization tools and sensitivity analysis of the model parameters, which will guide the selection of appropriate model parameters.
Keywords: cement, improvement, physical properties, strength
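One standard way to use optimization tools to select appropriate model parameters is back-analysis: calibrating a settlement model against monitoring data by least squares. The sketch below fits the coefficient of consolidation and ultimate settlement of a Terzaghi 1-D consolidation curve to synthetic observations; the model choice, drainage path length and data are illustrative assumptions, not from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def settlement(t, cv, s_ult, H=5.0):
    """Terzaghi 1-D consolidation settlement: s(t) = U(Tv) * s_ult,
    with Tv = cv*t/H^2 (cv in m^2/yr, drainage path H in m)."""
    Tv = cv * t / H ** 2
    U = np.where(Tv < 0.2827,                       # U < 60%: parabolic branch
                 np.sqrt(4 * Tv / np.pi),
                 1 - (8 / np.pi ** 2) * np.exp(-np.pi ** 2 * Tv / 4))
    return s_ult * U

# Synthetic "monitoring data" generated from the model plus noise
rng = np.random.default_rng(1)
t_obs = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0])                  # years
s_obs = settlement(t_obs, 2.0, 0.45) + rng.normal(scale=0.005, size=6)

(cv_fit, s_ult_fit), _ = curve_fit(settlement, t_obs, s_obs, p0=[1.0, 0.5])
print(f"back-calculated cv = {cv_fit:.2f} m^2/yr, s_ult = {s_ult_fit:.2f} m")
```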
Procedia PDF Downloads 174
5842 Noise Barrier Technique as a Way to Improve the Sonic Urban Environment along Existing Roadways Assessment: El-Gish Road Street, Alexandria, Egypt
Authors: Nihal Atif Salim
Abstract:
To improve the quality of life in cities, a variety of interventions is used. Noise is a substantial and important form of pollution that has a negative impact on the urban environment and human health. It ranks second among environmental contamination complaints, according to a complaint survey conducted by EEAA in 2019. The most significant source of noise in the city is traffic noise. In order to improve the sonic urban environment, many physical techniques are applied. In the local area, noise barriers are considered one of the most appropriate physical techniques along existing traffic routes. Alexandria is Egypt's second-largest city after Cairo. It is located along the Mediterranean Sea, and El-Gish Road is one of the city's main arteries; it subjects the waterfront promenade that extends along the city to a high level of traffic noise. The purpose of this paper is to clarify the design considerations for the most appropriate noise barrier type along the promenade, with the goal of improving Quality of Life (QOL) and the sonic urban environment specifically. The proposed methodology focuses on how noise affects human perception and the environment. It then delves into the various physical noise control approaches. After that, the paper discusses sustainable design decision-making and, finally, looks into the importance of incorporating sustainability into design decisions. Three stages are followed in the case study. The first stage involves a site inspection and the use of sound measurement equipment (a noise level meter) to measure the noise level at many sites along the promenade, with the findings shown on a noise map. The second stage is to inquire about the site users' experience. The third stage is to investigate the various types of noise barriers and their effects on QOL along existing routes in order to select the most appropriate type. The goal of this research is to evaluate the suitable design of noise barriers that fulfils environmental and social perceptions while maintaining a balanced approach to the noise issue, in order to improve QOL along existing roadways in the local area.
Keywords: noise pollution, sonic urban environment, traffic noise, noise barrier, acoustic sustainability, noise reduction techniques
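For context on how barrier effectiveness is typically estimated, a common screening tool (not stated in the abstract) is Maekawa's chart approximation, which predicts insertion loss from the Fresnel number of the diffraction path. A sketch, with a hypothetical 0.5 m path difference over the barrier top:

```python
import numpy as np

def barrier_attenuation_db(delta_m, freq_hz, c=343.0):
    """Maekawa's approximation for a thin barrier:
    attenuation ~ 10*log10(3 + 20*N), with Fresnel number N = 2*delta/lambda."""
    lam = c / np.asarray(freq_hz, float)
    N = 2 * delta_m / lam
    return 10 * np.log10(3 + 20 * N)

for f in (125, 500, 2000):
    print(f, "Hz ->", round(float(barrier_attenuation_db(0.5, f)), 1), "dB")
```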
Procedia PDF Downloads 138
5841 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach
Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika
Abstract:
Both society and education teach good communication in order to build up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or poor understanding. Poor understanding produces language errors when people interact for the first time without prior knowledge of each other because of geographical distance. "The Space between Us" movie delivers a love-adventure story between a Mars boy and an Earth girl. There are many miscommunications because of the different climates and environments. The moviegoer must also focus on the subtitles in order to enjoy the movie fully. Furthermore, the Indonesian subtitles and the English conversation in the movie still have overlapping understanding in the translation. Translation here consists of the source language (SL: the English conversation) and the target language (TL: the Indonesian subtitles). The research gap above is formulated in the research question of how the language errors happen in the movie and what their effects on translation quality are, analyzed in depth through a translation study with a discourse analysis approach. The research goal is to examine the language errors and their translation qualities in order to create a good atmosphere in the movie medium. The research is conducted as an embedded study in a qualitative design. The research locations consist of setting, participant, and event as a focused, determined boundary. The sources of data are "The Space between Us" movie and informants (translation quality raters). The sampling is criterion-based sampling (purposive sampling). Data collection techniques use content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis delivers domain, taxonomy, componential, and cultural theme analysis. The findings on the language errors in the movie are referential, register, society, textual, receptive, expressive, individual, group, analogical, transfer, local, and global errors. The discussion of their effects on translation quality is focused through the translation techniques applied to the findings: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution, and transposition.
Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments
Procedia PDF Downloads 219
5840 Improvement of the Traditional Techniques of Artistic Casting through the Development of Open Source 3D Printing Technologies Based on Digital Ultraviolet Light Processing
Authors: Drago Diaz Aleman, Jose Luis Saorin Perez, Cecile Meier, Itahisa Perez Conesa, Jorge De La Torre Cantero
Abstract:
Traditional manufacturing techniques used in artistic contexts compete with highly productive and efficient industrial procedures. Craft techniques and their associated business models tend to disappear under the pressure of mass-produced products that compete in all niche markets, including those traditionally reserved for the work of art. The surplus value derived from the prestige of the author, the exclusivity of the product or the mastery of the artist does not seem to be a sufficient reason to preserve this productive model. In recent years, the adoption of open source digital manufacturing technologies in small art workshops can favor their permanence by offering great advantages such as easy accessibility, low cost, and free modification, adapting to the specific needs of each workshop. It is possible to use pieces modeled by computer and made with FDM (Fused Deposition Modeling) 3D printers that use PLA (polylactic acid) in artistic casting procedures. Models printed in PLA are limited to approximate minimum sizes of 3 cm, and the optimal layer height resolution is 0.1 mm. Due to these limitations, it is not the most suitable technology for artistic casting processes of smaller pieces. One alternative that overcomes the size limitation is printers of the SLS type (selective laser sintering). Another possibility is DMLS (Direct Metal Laser Sintering), in which a laser hardens metal powder layer by layer. However, due to its high cost, it is a technology that is difficult to introduce in small artistic foundries. Low-cost DLP (Digital Light Processing) printers can offer high resolutions for a reasonable cost (around 0.02 mm on the Z axis and 0.04 mm on the X and Y axes), and can print models with castable resins that allow subsequent direct artistic casting in precious metals or their adaptation to processes such as electroforming. In this work, the design of a DLP 3D printer is detailed, using backlit LCD screens with ultraviolet light. Its development is totally open source, and it is proposed as a kit made up of electronic components, based on Arduino, and easy-to-access mechanical components available in the market. The CAD files of its components can be manufactured on low-cost FDM 3D printers. The result costs less than 500 Euros, offers high resolution, and has an open design with free access that allows not only its manufacture but also its improvement. In future works, we intend to carry out different comparative analyses that allow us to accurately estimate the print quality, as well as the real cost of the artistic works made with it.
Keywords: traditional artistic techniques, DLP 3D printer, artistic casting, electroforming
Procedia PDF Downloads 142
5839 Performance Evaluation and Economic Analysis of Minimum Quantity Lubrication with Pressurized/Non-Pressurized Air and Nanofluid Mixture
Authors: M. Amrita, R. R. Srikant, A. V. Sita Rama Raju
Abstract:
Water-miscible cutting fluids are conventionally used to lubricate and cool the machining zone. However, issues related to health hazards, maintenance and disposal costs have limited their usage, leading to the application of Minimum Quantity Lubrication (MQL). To increase the effectiveness of MQL, nano cutting fluids are proposed. In the present work, water-miscible nanographite cutting fluids of varying concentration are applied to the cutting zone by two systems, A and B. System A utilizes high-pressure air and supplies cutting fluid at a flow rate of 1 ml/min. System B uses low-pressure air and supplies cutting fluid at a flow rate of 5 ml/min. Their performance in machining is evaluated by measuring cutting temperatures, tool wear, cutting forces and surface roughness, and compared with dry machining and flood machining. Application of nano cutting fluid using both systems showed better performance than dry machining. Cutting temperatures and cutting forces obtained by both techniques are higher than in flood machining, but tool wear and surface roughness showed improvement compared to flood machining. An economic analysis has been carried out in all cases to decide the applicability of the techniques.
Keywords: economic analysis, machining, minimum quantity lubrication, nanofluid
Procedia PDF Downloads 380
5838 Mapping of Alteration Zones in Mineral Rich Belt of South-East Rajasthan Using Remote Sensing Techniques
Authors: Mrinmoy Dhara, Vivek K. Sengar, Shovan L. Chattoraj, Soumiya Bhattacharjee
Abstract:
Remote sensing techniques have emerged as an asset for various geological studies. Satellite images obtained by different sensors contain plenty of information related to the terrain. Digital image processing further helps to customize workflows for mineral prospecting. In this study, an attempt has been made to map the hydrothermally altered zones using multispectral and hyperspectral datasets of South East Rajasthan. Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) and Hyperion (Level 1R) datasets have been processed to generate different Band Ratio Composites (BRCs). For this study, ASTER-derived BRCs were generated to delineate the alteration zones, gossans, abundant clays and host rocks. ASTER and Hyperion images were further processed to extract mineral end members, and classified mineral maps have been produced using the Spectral Angle Mapper (SAM) method. Results were validated with the geological map of the area, which shows positive agreement with the image processing outputs. Thus, this study concludes that band ratios and image processing in combination play a significant role in the demarcation of alteration zones, which may provide pathfinders for mineral prospecting studies.
Keywords: ASTER, hyperion, band ratios, alteration zones, SAM
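The Spectral Angle Mapper reduces to a closed-form computation: the angle between each pixel spectrum and a reference end member, with small angles meaning close matches. A sketch for a hyperspectral cube, with an assumed angle threshold:

```python
import numpy as np

def spectral_angle(cube, ref):
    """Spectral angle (radians) between each pixel spectrum and a reference.
    cube: (rows, cols, bands); ref: (bands,). Smaller angle = better match."""
    dots = np.tensordot(cube, ref, axes=([2], [0]))
    norms = np.linalg.norm(cube, axis=2) * np.linalg.norm(ref)
    return np.arccos(np.clip(dots / (norms + 1e-12), -1.0, 1.0))

def sam_classify(cube, endmembers, threshold=0.1):
    """Assign each pixel to the end member with the smallest angle,
    or -1 when no angle falls below the threshold (unclassified)."""
    angles = np.stack([spectral_angle(cube, e) for e in endmembers], axis=-1)
    labels = angles.argmin(axis=-1)
    labels[angles.min(axis=-1) > threshold] = -1
    return labels
```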
Procedia PDF Downloads 279
5837 Vehicular Speed Detection Camera System Using Video Stream
Authors: C. A. Anser Pasha
Abstract:
In this paper, a new Vehicular Speed Detection Camera System (SDCS), applicable as an alternative to traditional radar with the same accuracy or better, is presented. The real-time measurement and analysis of various traffic parameters such as speed and number of vehicles are increasingly required in traffic control and management. Image processing techniques are now considered an attractive and flexible method for automatic analysis and data collection in traffic engineering. Various algorithms based on image processing techniques have been applied to detect multiple vehicles and track them. The SDCS process can be divided into three successive phases. The first is the object detection phase, which uses a hybrid algorithm combining an adaptive background subtraction technique with a three-frame differencing algorithm; this rectifies the major drawback of using only adaptive background subtraction. The second phase is object tracking, which consists of three successive operations: object segmentation, object labeling, and object center extraction. The object tracking operation takes into consideration the different possible scenarios of a moving object, such as simple tracking, the object leaving the scene, the object entering the scene, the object being crossed by another object, and one object leaving while another enters the scene. The third phase is speed calculation, in which speed is computed from the number of frames the object takes to pass through the scene.
Keywords: radar, image processing, detection, tracking, segmentation
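A minimal sketch of the hybrid detection idea in the first phase: three-frame differencing combined with adaptive background subtraction (OpenCV's MOG2 is used here as the adaptive model; the video file, thresholds and the speed formula's scene length are illustrative assumptions):

```python
import cv2

cap = cv2.VideoCapture("traffic.mp4")                # hypothetical input video
bg = cv2.createBackgroundSubtractorMOG2(detectShadows=False)  # adaptive background

ok1, f1 = cap.read(); ok2, f2 = cap.read(); ok3, f3 = cap.read()
while ok3:
    g1, g2, g3 = (cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in (f1, f2, f3))
    # three-frame differencing: motion must appear in both consecutive differences
    d12 = cv2.threshold(cv2.absdiff(g2, g1), 25, 255, cv2.THRESH_BINARY)[1]
    d23 = cv2.threshold(cv2.absdiff(g3, g2), 25, 255, cv2.THRESH_BINARY)[1]
    motion = cv2.bitwise_and(d12, d23)
    # hybrid mask: union with the adaptive background subtraction result
    mask = cv2.bitwise_or(motion, bg.apply(f2))
    # ... segmentation / labeling / center extraction would run on `mask` here
    f1, f2 = f2, f3
    ok3, f3 = cap.read()

# Speed from frames needed to cross a scene of known length, e.g. 20 m at 30 fps:
# speed_kmh = scene_metres / (frames_to_cross / fps) * 3.6
```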
Procedia PDF Downloads 467