Search results for: automatic image colorization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3486

606 Expounding the Evolution of the Proto-Femme Fatale and Its Correlation with the New Woman: A Close Study of David Mamet's Oleanna

Authors: Silvia Elias

Abstract:

The 'Femme Fatale' figure has become synonymous with a mysterious and seductive woman whose charms captivate her lovers into bonds of irresistible desire, often leading them to compromise or downfall. Originally, a femme fatale typically used her beauty to lead men to their destruction, but in modern literature she represents a direct attack on traditional womanhood and the nuclear family: she refuses to abide by the pillars of mainstream society, creating the image of a strong, independent woman who defies the control of men and rejects the institution of the family. This research discusses the differences and similarities between the femme fatale and the New Woman and how they are perceived by the audience. There is often confusion between the characteristics that define a New Woman and a femme fatale, since both desire independence, challenge typical gender-role casting, push against the limits of patriarchal society, and take control of their sexuality. The study of the femme fatale remains appealing in modern times because the fear of gender equality gives life to modern femme fatale versions, and post-modern literary works introduce their readers to new versions of the deadly seductress, ones that do not fully depend on their looks to destroy men. The idea behind this paper was born from reading David Mamet's two-character play Oleanna (1992) and tracing the main female protagonist/antagonist's transformation from a helpless, inarticulate girl into a powerful, controlling negotiator who knows how to lead a bargain and maintain the upper hand.

Keywords: Circe, David, Eve, evolution, feminist, femme fatale, gender, Mamet, new, Odysseus, Oleanna, power, Salome, schema, seduction, temptress, woman

Procedia PDF Downloads 455
605 Experimental and Numerical Performance Analysis for Steam Jet Ejectors

Authors: Abdellah Hanafi, G. M. Mostafa, Mohamed Mortada, Ahmed Hamed

Abstract:

Steam ejectors are the heart of most desalination systems that employ vacuum. Systems driven by low-grade thermal energy sources, such as solar and geothermal energy, use the ejector to drive the system instead of high-grade electric energy. The jet ejector creates vacuum by employing the flow of steam or air and using the severe pressure drop at the outlet of the main nozzle. The present work involves developing a one-dimensional mathematical model for designing jet ejectors and transforming it into computer code using Engineering Equation Solver (EES) software. The model receives the required operating conditions at the inlets and outlet of the ejector as inputs and produces the corresponding dimensions required to reach these conditions. The one-dimensional model has been validated against an existing ejector operating at the Abu-Qir power station. A prototype has been designed according to the one-dimensional model and attached to a special test bench, to be tested before use in the solar desalination pilot plant. The tested ejector will be responsible for the start-up evacuation of the system and for adjusting the vacuum of the evaporating effects. The tested prototype has shown good agreement with the results of the code. In addition, a numerical analysis has been applied to one of the designed geometries to give an image of the pressure and velocity distributions inside the ejector on the one hand, and to show the difference in results between the two-dimensional ideal-gas model and the real prototype on the other. The commercial edition of ANSYS Fluent v.14 software is used to solve the two-dimensional axisymmetric case.
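
As a rough illustration of the kind of relation such a one-dimensional model encodes (not the authors' EES code), the sketch below sizes the motive-nozzle throat for a choked steam flow under ideal-gas assumptions; all input values are hypothetical.

```python
# Illustrative 1-D sizing step: choked mass flow through the motive nozzle
# of a steam ejector (ideal-gas approximation). All inputs are hypothetical.
import math

def throat_area(m_dot, p0, T0, gamma=1.3, R=461.5):
    """Throat area [m^2] for a choked nozzle given mass flow m_dot [kg/s],
    stagnation pressure p0 [Pa] and temperature T0 [K]; gamma and R are
    approximate values for steam."""
    term = math.sqrt(gamma / (R * T0)) * (2.0 / (gamma + 1.0)) ** (
        (gamma + 1.0) / (2.0 * (gamma - 1.0)))
    return m_dot / (p0 * term)

# Example: 0.05 kg/s of motive steam at 10 bar, 453 K
print(throat_area(0.05, 10e5, 453.0))
```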

Keywords: solar energy, jet ejector, vacuum, evaporating effects

Procedia PDF Downloads 620
604 Review of Capitalization of Construction Industry on Sustainable Risk Management in Nigeria

Authors: Nnadi Ezekiel Ejiofor

Abstract:

The construction industry plays a decisive role in the healthy development of any nation; not only large but even small construction projects contribute to a country's economic growth. Good management is needed to ensure successful delivery and sustainability because of the plethora of risks that have resulted in low profit margins for contractors, cost and schedule overruns, poor-quality delivery, and abandoned projects. This research reviewed the capitalization of the construction industry in relation to sustainable risk management. Questionnaires and oral interviews were used for data collection. One hundred and ninety-eight (198) large construction firms in Nigeria form the population of this study, and fifteen (15) companies that emerged from mergers and acquisitions were used for the study. The instruments used for data collection were a researcher-developed structured questionnaire based on a five-point rating scale, interviews, focus group discussions, and secondary sources (bills of quantities and Securities and Exchange Commission records). The instrument was validated by two experts in the field, and its reliability was established by applying the split-half method. Kendall's coefficient of concordance was used to test the data, and a degree of agreement was obtained. Data were subjected to descriptive statistics and analyzed using analysis of variance and t-tests in SPSS. The identified impacts of capitalization were an increase in turnover (24.5%), improvement in image (24.5%), risk reduction (20%), business expansion (17.3%), and geographical spread (13.6%). The study strongly advocates the inclusion of risk management evaluation as part of the construction procurement process.
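
For reference, Kendall's coefficient of concordance (W) used above can be computed as follows; the sketch uses hypothetical ratings, not the study's data.

```python
# Minimal sketch of Kendall's coefficient of concordance (W): rows are
# raters, columns are the items being ranked. Ratings are hypothetical.
import numpy as np
from scipy.stats import rankdata

ratings = np.array([[4, 1, 3, 2],     # rater 1
                    [3, 1, 4, 2],     # rater 2
                    [4, 2, 3, 1]])    # rater 3

ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank items per rater
m, n = ranks.shape                                 # m raters, n items
R = ranks.sum(axis=0)                              # total rank per item
S = ((R - R.mean()) ** 2).sum()
W = 12.0 * S / (m ** 2 * (n ** 3 - n))             # 0 = no agreement, 1 = full
print(W)
```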

Keywords: capitalization, project delivery, risks, risk management, sustainability

Procedia PDF Downloads 59
603 Technical Aspects of Closing the Loop in Depth-of-Anesthesia Control

Authors: Gorazd Karer

Abstract:

When performing a diagnostic procedure or surgery under general anesthesia (GA), proper introduction and dosing of anesthetic agents is one of the main tasks of the anesthesiologist. Depth of anesthesia (DoA), however, also seems to be a suitable process for closed-loop control implementation. To implement such a system, one must be able to acquire the relevant signals online and in real time, as well as stream the calculated control signal to the infusion pump. However, during a procedure, patient monitors and infusion pumps are purposely unable to connect to an external (possibly medically unapproved) device for safety reasons, thus preventing closed-loop control. The paper proposes a conceptual solution to this problem. First, it presents some important aspects of contemporary clinical practice. Next, it introduces the closed-loop control-system structure and the relevant information flow. Focusing on transferring the data from the patient to the computer, it presents a non-invasive image-based system for signal acquisition from a patient monitor for online depth-of-anesthesia assessment. Furthermore, it introduces a UDP-based communication method that can be used for transmitting the calculated anesthetic inflow to the infusion pump. The proposed system is independent of any medical-device manufacturer and is implemented in Matlab-Simulink, which can conveniently be used for DoA control implementation. The proposed scheme has been tested in a simulated GA setting and is ready to be evaluated in an operating theatre. However, the proposed system is only a step towards a proper closed-loop control system for DoA that could routinely be used in clinical practice.
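
A minimal sketch of the UDP transmission step described above, assuming a hypothetical address, port, and message format (the paper's Matlab-Simulink implementation is not shown):

```python
# Sketch of the UDP link: the controller streams the computed anesthetic
# infusion rate to a listener attached to the infusion pump.
# The address, port, and message format are hypothetical.
import socket

PUMP_ADDR = ("192.168.1.50", 5005)   # hypothetical pump-side listener
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_infusion_rate(rate_ml_h: float) -> None:
    # Plain-text datagram; a real system would add sequencing and checksums.
    sock.sendto(f"RATE {rate_ml_h:.2f}".encode("ascii"), PUMP_ADDR)

send_infusion_rate(12.5)  # e.g. one control-loop update
```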

Keywords: closed-loop control, depth of anesthesia (DoA), modeling, optical signal acquisition, patient state index (PSi), UDP communication protocol

Procedia PDF Downloads 217
602 Scalable Performance Testing: Facilitating the Assessment of Application Performance under Substantial Loads and Mitigating the Risk of System Failures

Authors: Solanki Ravirajsinh

Abstract:

In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance-testing framework. Leveraging cloud services to implement such a framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance-testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (for monitoring errors and logs), EFS (a shared file storage system), and security groups offers several key benefits. Building the framework this way optimizes resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance-testing efforts. In this framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts": the master directs each slave to execute a specified number of requests, and upon completion the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
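
As a sketch of how the master node might launch such a distributed run from a script: the -n, -t, -R, and -l flags are standard JMeter command-line options, while the hostnames and file names below are hypothetical.

```python
# Rough sketch of launching a distributed (master-slave) JMeter run from the
# master node in non-GUI mode. The -R flag remote-starts the listed slaves.
import subprocess

slaves = ["10.0.1.11", "10.0.1.12", "10.0.1.13"]   # slave EC2 private IPs
cmd = [
    "jmeter",
    "-n",                          # non-GUI mode
    "-t", "load_test_plan.jmx",    # test plan on the master
    "-R", ",".join(slaves),        # remote-start all listed slaves
    "-l", "results.jtl",           # consolidated results file
]
subprocess.run(cmd, check=True)
```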

Keywords: identifying application crashes under heavy load, JMeter with cloud services, scalable performance testing, JMeter master-slave using cloud services

Procedia PDF Downloads 27
601 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature-set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. It also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive-training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta). The user also feeds data-expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
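
As a generic sketch of the outlier-removal idea described above (not FreqAI's actual implementation), one can bound the training feature space and reject prediction points that fall outside it:

```python
# Illustrative outlier filter: define the training feature space, then
# reject incoming prediction points that fall too far outside it.
import numpy as np

def fit_feature_space(X_train):
    mean, std = X_train.mean(axis=0), X_train.std(axis=0) + 1e-9
    return mean, std

def is_outlier(x, mean, std, z_max=3.0):
    # A point is discarded if any feature lies beyond z_max deviations.
    return bool(np.any(np.abs((x - mean) / std) > z_max))

X_train = np.random.randn(500, 20)          # stand-in for engineered features
mean, std = fit_feature_space(X_train)
x_new = np.random.randn(20) * 5             # candidate prediction point
print(is_outlier(x_new, mean, std))         # True -> skip this prediction
```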

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 88
600 The Issue of Pedagogical Approaches in Higher Education: Public Universities as an Example

Authors: Majda El Moufarej

Abstract:

Higher education plays a central role in socio-economic development. However, with the wave of change driven mainly by the extensive use of technology in the workplace, the rate of unemployment among graduates rises because they lack the competencies and skills currently required in professional life. This situation has led higher education institutions worldwide to reconsider their missions, strategic planning, and curricula, among other elements, in order to restore the expected image of the university. In practice, many obstacles hinder the achievement of these objectives, especially in public universities with free access, as in the case of Morocco. Nevertheless, educational managers have made huge efforts to improve the quality of education by focusing on the issue of pedagogical approaches, where university teachers assume more responsibility for saving the situation. In this paper, the focus is placed on the pedagogical approaches to be adopted, depending on the nature of the subject, the size of the class, the available equipment, and the students' level and degree of motivation. Before elaborating on this idea, it may be more insightful to begin by addressing another variable: the new role of university teachers and their qualification in pedagogical competence. The discussion then revolves around five pedagogical approaches currently adopted in Western universities, focusing exclusively on the one called the "Systematic Approach to Course Design," due to its crucial relevance to the teaching of subjects in schools of humanities, as it can guide the teacher in developing an explicit program for purposeful teaching and learning. The study is based on a qualitative method, and the findings are analyzed and followed by recommendations on how to overcome difficulties in teaching large groups while transmitting the knowledge and skills demanded in the workplace.

Keywords: higher education, public universities, pedagogical approaches, pedagogical competence

Procedia PDF Downloads 297
599 Designing Agricultural Irrigation Systems Using Drone Technology and Geospatial Analysis

Authors: Yongqin Zhang, John Lett

Abstract:

Geospatial technologies have been increasingly used in agriculture for various applications in recent years. Unmanned aerial vehicles (drones) fit the needs of farmers in farming operations, from field spraying to grow cycles and crop health. In this research, we conducted a practical project that used drone technology to design and map optimal locations and layouts of irrigation systems for agricultural farms. We flew a DJI Mavic 2 Pro drone to acquire aerial remote sensing images over two agricultural fields in Forest, Mississippi, in 2022. Flight plans were first designed to capture multiple high-resolution images via a 20-megapixel RGB camera mounted on the drone. The DroneDeploy web application was then utilized to develop the flight plans and for subsequent image processing and measurements. The images were orthorectified and processed to estimate the area of the fields and measure the locations of the water lines and sprinkler heads. Field measurements were conducted to measure the ground targets and validate the aerial measurements. Geospatial analysis and photogrammetric measurements were performed for the study area to determine the optimal layout and quantitative estimates for the irrigation systems. We created maps and tabular estimates to demonstrate the locations, spacing, number, and layout of sprinkler heads and water lines needed to cover the agricultural fields. This project provides scientific guidance to Mississippi farmers for precision agricultural irrigation practice.
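
As a simple illustration of the kind of layout arithmetic behind such estimates (all dimensions here are hypothetical, not the study's measurements):

```python
# Back-of-the-envelope layout estimate: sprinkler heads and water lines
# needed to cover a rectangular field measured from the orthomosaic.
import math

field_w, field_l = 120.0, 300.0   # field width / length (m), hypothetical
line_spacing = 12.0               # distance between water lines (m)
head_spacing = 10.0               # distance between heads on a line (m)

n_lines = math.ceil(field_w / line_spacing) + 1
heads_per_line = math.ceil(field_l / head_spacing) + 1
print(n_lines, heads_per_line, n_lines * heads_per_line)  # lines, heads/line, total
```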

Keywords: drone images, agriculture, irrigation, geospatial analysis, photogrammetric measurements

Procedia PDF Downloads 75
597 Application of Remote Sensing for Monitoring the Impact of Lapindo Mud Sedimentation on the Mangrove Ecosystem: A Case Study in Sidoarjo, East Java

Authors: Akbar Cahyadhi Pratama Putra, Tantri Utami Widhaningtyas, M. Randy Aswin

Abstract:

Indonesia, an archipelagic nation, has a very long coastline with large potential marine resources, one of which is the mangrove ecosystem. The Lapindo mudflow disaster in Sidoarjo, East Java, required the mud to be channeled into the sea through the Brantas and Porong rivers. The mud transported by the river flow is feared to be dangerous because it may contain harmful substances such as heavy metals. This study aims to map the density of the mangrove ecosystem, assess the impact of the Lapindo mud disaster on it, and support efforts to maintain its continuity. The coastal mangrove conditions of Sidoarjo were mapped using remote sensing products, namely Landsat 7 ETM+ images recorded during dry months in 2002, 2006, 2009, and 2014. Mangrove density was detected using NDVI, computed from band 3 (the red channel) and band 4 (the near-infrared channel). Image processing to produce the NDVI was carried out in ENVI 5.1 software, and NDVI values between 0 and 1 were used to characterize mangrove density. Both the area and the density of the mangrove ecosystem increased significantly from year to year. Mangrove growth was affected by the deposition of Lapindo mud at the Porong and Brantas river estuaries, where the silt provides a growing medium suitable for the mangrove ecosystem, which is thus expanding. The increase in density was also supported by public awareness, with mangrove planting carried out around the farms to help contain heavy metals in the material.
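
For reference, the NDVI computation described above is a one-line band ratio; the sketch below uses small stand-in arrays in place of the ETM+ rasters.

```python
# NDVI from Landsat 7 ETM+ band 3 (red) and band 4 (near-infrared);
# the arrays are stand-ins for the real reflectance rasters.
import numpy as np

red = np.array([[0.08, 0.10], [0.09, 0.30]])   # band 3 reflectance
nir = np.array([[0.40, 0.45], [0.42, 0.35]])   # band 4 reflectance

ndvi = (nir - red) / (nir + red + 1e-9)        # dense green vegetation -> ~1
print(ndvi)
```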

Keywords: archipelagic nation, mangrove, Lapindo mudflow disaster, NDVI

Procedia PDF Downloads 438
597 The Image of Victim and Criminal in Love Crimes on Social Media in Egypt: Facebook Discourse Analysis

Authors: Sherehan Hamdalla

Abstract:

Egypt has experienced a series of terrifying love crimes in the last few months. This 'trend' started with a young man caught on video slaughtering his ex-girlfriend in the street in the city of El Mansoura. The crime shocked Egyptian citizens at all levels; unfortunately, no fewer than three similar crimes took place in other Egyptian cities with the same killing trigger. The accessibility and reach of social media make it one of the most crucial online communication channels: users utilize social media platforms to share and exchange ideas, news, and many other activities, and they can freely share posts that reflect their mindset or personal views on any issue. These posts go viral across social media through reposts and shares, whether to support the content or to attack it. The repetition of certain posts can mobilize supporters with the same point of view, especially when that crowd's online participation confronts the consequences of a public-opinion case. The death of the young woman was followed by similar crimes in other cities, such as El Sharkia and Port Said. These love crimes provoked a massive wave of contention among all social classes in Egypt. Strangely, some supported the criminal and defended his side for several reasons, which the study uncovers. Facebook, the most popular social media platform among Egyptians, reflects the debate between supporters of the victim and supporters of the criminal. Facebook pages were created specifically to disseminate certain viewpoints online, for example, asking for the maximum penalty for the criminals. These pages aimed to mobilize the maximum number of supporters and to affect the outcome of the trials.

Keywords: love crimes, victim, criminal, social media

Procedia PDF Downloads 76
596 Final Account Closing in Construction Projects: The Use of Supply Chain Management to Reduce Delays

Authors: Zarabizan Zakaria, Syuhaida Ismail, Aminah Md. Yusof

Abstract:

The project management process runs from the planning stage to completion (handover of buildings, preparation of the final accounts, and the closing balance), and it is not easy to implement efficiently and effectively. Delay is a major problem for construction projects, and these delays have been blamed mainly on the inefficient traditional construction practices that continue to dominate the industry. This is due to several factors, such as the construction technology environment, sophisticated designs, and constantly changing customer demands, all of which influence, directly or indirectly, the practice of management. Among the identified influences are the physical environment, the social environment, the information environment, and the political and moral atmosphere. This paper therefore sets out to determine the problems and issues in final account closing in construction projects; it establishes the need to embrace Supply Chain Management (SCM) and elucidates the need for, and strategies toward, the development of a delay-reduction framework. At the same time, the paper provides effective measures to avoid, or at least reduce to an optimum level, such delays. Allowing problems in the closure declaration to occur without proper monitoring and control can have a negative impact on the cost and time of delivery to the end user. It can also damage the reputation or image of the agency/department that manages the implementation of a contract and consequently reduce customers' trust in it. It is anticipated that the findings reported in this paper address the root delay contributors and apply SCM tools for their mitigation, for the better development of construction projects.

Keywords: final account closing, construction project, construction delay, supply chain management

Procedia PDF Downloads 366
595 Use of Socially Assistive Robots in Early Rehabilitation to Promote Mobility for Infants with Motor Delays

Authors: Elena Kokkoni, Prasanna Kannappan, Ashkan Zehfroosh, Effrosyni Mavroudi, Kristina Strother-Garcia, James C. Galloway, Jeffrey Heinz, Rene Vidal, Herbert G. Tanner

Abstract:

Early immobility affects motor, cognitive, and social development. Current pediatric rehabilitation lacks the technology that can provide the dosage needed to promote mobility for young children at risk, and the addition of socially assistive robots to early interventions may help increase that dosage. The aim of this study is to examine the feasibility of an early-intervention paradigm in which non-walking infants experience independent mobility while socially interacting with robots. A dynamic environment was developed in which both the child and the robot interact and learn from each other. The environment involves: 1) a range of goal-oriented, age-appropriate, and ability-matched physical activities for the child to perform, 2) automatic functions that perceive the child's actions through novel activity-recognition algorithms and decide appropriate actions for the robot, and 3) a networked visual data-acquisition system that enables real-time assessment and provides the means to connect child behavior with robot decision-making in real time. The environment was tested with a two-year-old boy with Down syndrome over eight sessions. The child presented delays throughout his motor development, most recently in the acquisition of walking. During the sessions, the child performed physical activities that required complex motor actions (e.g., climbing an inclined platform and/or staircase). During these activities, a (wheeled or humanoid) robot was either performing the action or was at its end point, 'signaling' for interaction. From these sessions, information was gathered to develop algorithms that automate the perception of the activities on which the robot bases its actions. A Markov Decision Process (MDP) is used to model the intentions of the child, and a 'smoothing' technique is used to help identify the model's parameters, a critical step when dealing with small data sets such as in this paradigm. The child engaged in all activities and socially interacted with the robot across sessions. Over time, the child's mobility increased, as did the frequency and duration of complex and independent motor actions (e.g., taking independent steps). Simulation results on the combination of the MDP and smoothing support the use of this model in human-robot interaction, as smoothing facilitates learning MDP parameters from small data sets. The paradigm is feasible and provides insight into how social interaction may elicit mobility actions, suggesting a new early-intervention paradigm for very young children with motor disabilities. Acknowledgment: This work has been supported by NIH under grant #5R01HD87133.
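
A minimal sketch of the smoothing idea, assuming simple additive (Laplace) smoothing of transition counts; the states, actions, and counts below are hypothetical, not the study's model.

```python
# Estimating MDP transition probabilities from a small number of observed
# (state, action, next-state) counts with additive (Laplace) smoothing.
import numpy as np

n_states, n_actions = 4, 2
counts = np.zeros((n_states, n_actions, n_states))
counts[0, 1, 2] = 3        # e.g. robot signal (action 1) in state 0
counts[0, 1, 0] = 1        # led to state 2 three times, state 0 once

alpha = 1.0                # smoothing strength
P = (counts + alpha) / (counts.sum(axis=2, keepdims=True) + alpha * n_states)
print(P[0, 1])             # smoothed P(next state | state 0, action 1)
```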

Keywords: activity recognition, human-robot interaction, machine learning, pediatric rehabilitation

Procedia PDF Downloads 292
594 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques

Authors: Gizem Eser Erdek

Abstract:

This study investigates the prediction of the remaining life of industrial cutting tools used in the production process with deep learning methods. As the life of a cutting tool decreases, it damages the raw material it is processing; the aim here is to predict the remaining life of the cutting tool from the damage it causes to the raw material. For this purpose, hole photos were collected from a hole-drilling machine for 8 months and labeled in 5 classes according to hole quality, transforming the problem into a classification problem. Using the prepared dataset, a model was created with convolutional neural networks, a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the dataset, and a hybrid model using convolutional neural networks and support vector machines was used for comparison. When all models are compared, the model using convolutional neural networks gives successful results, with a 74% accuracy rate. In preliminary studies, when the dataset was arranged to include only the best and worst classes, the binary classification model gave ~93% accuracy. The results showed that the remaining life of cutting tools can be predicted by deep learning methods from the damage to the raw material, and the experiments proved that deep learning methods can be used as an alternative for cutting-tool life estimation.
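
As an illustrative skeleton of such a CNN classifier (the layer sizes and input resolution are assumptions, not the architecture reported in the study):

```python
# Skeleton of a CNN classifier for the 5 hole-quality classes.
import torch
import torch.nn as nn

class HoleQualityCNN(nn.Module):
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):                      # x: (batch, 3, 64, 64) photos
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = HoleQualityCNN()
logits = model(torch.randn(8, 3, 64, 64))      # dummy batch of hole photos
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 5, (8,)))
```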

Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VGGNet

Procedia PDF Downloads 76
593 Effect of Underwater Antiquities as a Hidden Competitive Advantage of Hotels on Their Financial Performance: An Exploratory Study

Authors: Iman Shawky, Mohamed Elsayed

Abstract:

Every hotel operating in the hospitality market tends to have its own merit and character in marketing its products, in order to maintain both its brand identity and its image among guests. With the growth of global competition in the hospitality industry, the concept of competitive advantage is becoming increasingly important in hotel marketing, as it examines the reasons why some hotels outweigh others in the dimensions of their strategic and marketing plans. Egypt is a land of surfaced and submerged secrets, the result of the ongoing exploration of its ancient civilization. Although underwater antiquities represent ambiguous treasures, they hold an auspicious future, particularly in Alexandria. The study aims to examine to what extent underwater antiquities represent a competitive advantage for four- and five-star hotels in Alexandria. To achieve this aim, an exploratory study was conducted by investigating and comparing the closest and most popular landmarks mentioned on hotels' official websites and on commonly used reservation websites. In addition, two questionnaire forms were designed: one for hotels' revenue and sales-and-marketing managers, and the other for their guests. The results indicate that both official hotel websites and the most commonly used reservation websites totally ignore underwater antiquities as attractive landmarks surrounding Alexandria hotels. Furthermore, most managers expect that underwater antiquities could furnish a distinguishing competitive advantage to their hotels and help exceed guests' expectations during their stay, as long as they are included on both hotel and reservation websites among the most famous surrounding landmarks. Moreover, most managers foresee that high awareness of underwater antiquities could increase guests' accommodation frequency and improve the financial performance of their hotels.

Keywords: competitive advantage, financial performance, hotels' websites, underwater antiquities

Procedia PDF Downloads 166
592 Computer-Aided Diagnosis System Based on Multiple Quantitative Magnetic Resonance Imaging Features in the Classification of Brain Tumor

Authors: Chih Jou Hsiao, Chung Ming Lo, Li Chun Hsieh

Abstract:

Brain tumors do not have a high incidence rate, but their high mortality rate and poor prognosis still make them a major concern. On clinical examination, the grading of brain tumors depends on pathological features; however, histopathological analysis has weak points that can cause misgrading. For example, interpretations can vary in the absence of a well-established definition, and the heterogeneity of malignant tumors makes it challenging to extract meaningful tissue during surgical biopsy. With the development of magnetic resonance imaging (MRI), tumor grading can be accomplished by a noninvasive procedure. To improve diagnostic accuracy further, this study proposed a computer-aided diagnosis (CAD) system based on MRI features to provide suggestions for tumor grading. Gliomas are the most common type of malignant brain tumor (about 70%). This study collected 34 glioblastomas (GBMs) and 73 lower-grade gliomas (LGGs) from The Cancer Imaging Archive. After defining the regions of interest in the MRI images, multiple quantitative morphological features, such as region perimeter, region area, compactness, the mean and standard deviation of the normalized radial length, and moment features, were extracted from the tumors for classification. As a result, two of five morphological features and three of four image-moment features achieved p-values of <0.001, and the remaining moment feature had a p-value of <0.05. The CAD system using the combination of all features achieved an accuracy of 83.18% in classifying the gliomas into LGG and GBM, with a sensitivity of 70.59% and a specificity of 89.04%. The proposed system can become a second viewer for radiologists on clinical examinations.
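
A sketch of how the named morphological features can be computed from a binary tumor mask using scikit-image; the mask below is a toy stand-in for a real region of interest.

```python
# Morphological features from a binary tumor mask: area, perimeter,
# compactness, and the normalized radial length (NRL) statistics.
import numpy as np
from skimage import measure

mask = np.zeros((64, 64), dtype=int)
mask[20:44, 18:46] = 1                              # toy "tumor" region

props = measure.regionprops(mask)[0]
area, perimeter = props.area, props.perimeter
compactness = perimeter ** 2 / (4 * np.pi * area)   # 1.0 for a perfect circle

# NRL: centroid-to-boundary distances, normalized by the maximum
contour = measure.find_contours(mask, 0.5)[0]
cy, cx = props.centroid
radii = np.hypot(contour[:, 0] - cy, contour[:, 1] - cx)
nrl = radii / radii.max()
print(area, compactness, nrl.mean(), nrl.std())
```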

Keywords: brain tumor, computer-aided diagnosis, gliomas, magnetic resonance imaging

Procedia PDF Downloads 260
591 Design of a Small and Medium Enterprise Growth Prediction Model Based on Web Mining

Authors: Yiea Funk Te, Daniel Mueller, Irena Pletikosa Cvijikj

Abstract:

Small and medium enterprises (SMEs) play an important role in the economy of many countries. Considering the world economy overall, SMEs represent 95% of all businesses in the world, accounting for 66% of total employment. Existing studies show that the current business environment is highly turbulent and strongly influenced by modern information and communication technologies, forcing SMEs to face more severe challenges in maintaining their existence and expanding their business. To support SMEs in improving their competitiveness, researchers have recently turned their focus to applying data mining techniques to build risk and growth prediction models. However, the data used to assess risk and growth indicators is primarily obtained via questionnaires, which is laborious and time-consuming, or is provided by financial institutes and is thus highly sensitive to privacy issues. Recently, web mining (WM) has emerged as a new approach towards obtaining valuable insights into the business world. WM enables automatic, large-scale collection and analysis of potentially valuable data from various online platforms, including companies' websites. While WM methods have frequently been studied to anticipate the growth of sales volume for e-commerce platforms, their application to the assessment of SME risk and growth indicators is still scarce. Considering that a vast proportion of SMEs own a website, WM holds great potential for revealing valuable information hidden in SME websites, which can further be used to understand SME risk and growth indicators, as well as to enhance current SME risk and growth prediction models. This study aims to develop an automated system that collects business-relevant data from the Web and predicts future growth trends of SMEs by means of WM and data mining techniques; the envisioned system should serve as an 'early recognition system' for future growth opportunities. In an initial step, we examine how structured and semi-structured Web data in governmental or SME websites can be used to explain the success of SMEs. WM methods are applied to extract Web data in the form of additional input features for the growth prediction model. Data on SMEs provided by a large Swiss insurance company is used as ground truth (i.e., growth-labeled) data to train the growth prediction model. Different machine learning classification algorithms, such as the Support Vector Machine, Random Forest, and Artificial Neural Network, are applied and compared with the goal of optimizing prediction performance. The results are compared to those from previous studies in order to assess the contribution of growth indicators retrieved from the Web to increasing the predictive power of the model.
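
A hedged sketch of the planned model comparison, with synthetic stand-ins for the web-mined features and growth labels:

```python
# Comparing the three classifier families named above by cross-validation.
import numpy as np
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 25))          # web-mined + structured features
y = rng.integers(0, 2, size=300)        # 1 = grew, 0 = did not

for name, clf in [("SVM", SVC()),
                  ("Random Forest", RandomForestClassifier()),
                  ("Neural Network", MLPClassifier(max_iter=1000))]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(name, scores.mean())
```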

Keywords: data mining, SME growth, success factors, web mining

Procedia PDF Downloads 267
590 Drape Simulation by Commercial Software and Subjective Assessment of Virtual Drape

Authors: Evrim Buyukaslan, Simona Jevsnik, Fatma Kalaoglu

Abstract:

The simulation of fabrics is more difficult than that of most other materials owing to the complex mechanics of fabrics. Most virtual garment simulation software uses a mass-spring model and incorporates fabric mechanics into the simulation model, but the accuracy and fidelity of such software remain an open question. Drape is a subjective phenomenon, and its evaluation has been studied since the 1950s, whereas fabric and garment simulation is relatively new. Understanding subjects' perception of drape when looking at fabric simulations is critical as virtual try-on becomes more important with the growth of online apparel sales. The projected future of online apparel retailing is that users will view their avatars and try garments on them in a virtual environment, and it is well known that users will not be eager to accept this innovative technology unless it is realistic enough. It is therefore essential to understand what users see when fabrics are displayed in a virtual environment: are they able to distinguish the differences between various fabrics? The purpose of this study is to investigate human perception when looking at a virtual fabric and to determine the most visually noticeable drape parameter. To this end, five different fabrics were mechanically tested, and their drape simulations were generated by commercial garment simulation software (Optitex®). The simulation images were processed by image analysis software to calculate drape parameters, namely the drape coefficient, node severity, and peak angles. A questionnaire was developed to evaluate drape properties subjectively in a virtual environment: drape simulation images were shown to 27 subjects, who were asked to rank the samples according to the queried drape property. The answers were compared to the calculated drape parameters. The results show that subjects are quite sensitive to changes in the drape coefficient, while they are not very sensitive to changes in node dimensions and node distributions.
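
For reference, the drape coefficient follows the standard Cusick-style definition; the sketch below assumes hypothetical disk radii and a projected area measured from the processed simulation image.

```python
# Drape coefficient (%) from a top-view drape image:
# DC = (projected drape area - support disk area)
#      / (flat fabric area - support disk area) * 100
import numpy as np

def drape_coefficient(projected_area, r_support, r_fabric):
    a_support = np.pi * r_support ** 2
    a_flat = np.pi * r_fabric ** 2
    return 100.0 * (projected_area - a_support) / (a_flat - a_support)

# e.g. areas in cm^2 measured from the processed simulation image
print(drape_coefficient(projected_area=480.0, r_support=9.0, r_fabric=15.0))
```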

Keywords: drape simulation, drape evaluation, fabric mechanics, virtual fabric

Procedia PDF Downloads 338
589 Synthesis, Characterization of Organic and Inorganic Zn-Al Layered Double Hydroxides and Application for the Uptake of Methyl Orange from Aqueous Solution

Authors: Fatima Zahra Mahjoubi, Abderrahim Khalidi, Mohammed Abdennouri, Noureddine Barka

Abstract:

Zn-Al layered double hydroxides (LDHs) containing carbonate, nitrate, and dodecylsulfate as the interlamellar anions were prepared by a coprecipitation method. The resulting compounds were characterized using XRD, ICP, FTIR, TGA/DTA, TEM/EDX, and pHPZC analysis. The XRD patterns revealed that carbonate and nitrate could be intercalated into the interlayer structure, with basal spacings of 22.74 and 26.56 Å, respectively, while bilayer intercalation of dodecylsulfate molecules was achieved in the Zn-Al LDH with a basal spacing of 37.86 Å. TEM observation indicated that the materials synthesized via coprecipitation present nanoscale LDH particles: the average particle size of Zn-AlCO3 is 150 to 200 nm; irregular circular to hexagonal particles 30 to 40 nm in diameter were observed in the Zn-AlNO3 morphology; and the TEM image of Zn-AlDs displays nanostructured sheet-like particles 5 to 10 nm in size. The sorption characteristics and mechanisms of methyl orange (MO) dye on the organic LDH were investigated and subsequently compared with those on the inorganic Zn-Al LDHs. Adsorption experiments for MO were carried out as a function of solution pH, contact time, and initial dye concentration. The adsorption behavior of the inorganic LDHs was clearly influenced by initial pH, whereas the adsorption capacity of the organic LDH was scarcely affected by it, the removal percentage of MO remaining practically constant across pH values. As the MO concentration increased, the adsorption-capacity curves on the LDHs became L-type. The adsorption behavior of Zn-AlDs is attributed to the dissolution of the dye in the hydrophobic interlayer region (i.e., adsolubilization). The results suggest that Zn-AlDs could be applied as a potential adsorbent for MO removal over a wide pH range.
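
The quoted basal spacings follow from Bragg's law applied to the low-angle basal reflection; a quick check is sketched below, assuming Cu K-alpha radiation (an assumption, as the diffractometer source is not stated above).

```python
# Bragg's law check of a basal spacing: d = lambda / (2 sin(theta)).
import math

def basal_spacing(two_theta_deg, wavelength=1.5406):  # Cu K-alpha, angstroms
    return wavelength / (2.0 * math.sin(math.radians(two_theta_deg / 2.0)))

# A low-angle reflection near 2-theta = 3.9 deg gives d ~ 22.6 angstroms,
# of the order reported for the carbonate-intercalated phase.
print(basal_spacing(3.9))
```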

Keywords: adsorption, dodecylsulfate, kinetics, layered double hydroxides, methyl orange removal

Procedia PDF Downloads 293
588 The Quantum Theory of Music and Languages

Authors: Mballa Abanda Serge, Henda Gnakate Biba, Romaric Guemno Kuate, Akono Rufine Nicole, Petfiang Sidonie, Bella Sidonie

Abstract:

The main hypotheses proposed around the definition of the syllable and of music, and around the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and therein lies the interest of this inventive, original, and innovative research thesis: the debate raises questions that are at the heart of theories on language. It is a contribution to the theoretical, musicological, ethnomusicological, and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, automated translation, and artificial intelligence. When this theory is applied to any text of a folk song in a tone language of the world, one not only pieces together the exact melody, rhythm, and harmonies of that song, as if they were known in advance, but also the exact speech of the language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal, and random music. As experimentation confirming the theorization, the author designed a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and atonal music, and deterministic and random music. To test this application, the author uses music reading and writing software to collect data extracted from his mother tongue, already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). Translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: the user types a structured song text (chorus-verse) on the computer and requests a melody in blues, jazz, world music, variety, etc.; the software runs, offering a choice of harmonies, and the user then selects a melody.

Keywords: music, entanglement, language, science

Procedia PDF Downloads 80
587 A Review of Deep Learning Methods in Computer-Aided Detection and Diagnosis Systems Based on Whole Mammogram and Ultrasound Scan Classification

Authors: Ian Omung'a

Abstract:

Breast cancer remains one of the deadliest cancers for women worldwide, with the risk of developing tumors being as high as 50 percent in Sub-Saharan African countries like Kenya. With as many as 42 percent of these cases diagnosed late, when the cancer has metastasized and/or the prognosis has become terminal, Full Field Digital [FFD] Mammography remains an effective screening technique that leads to early detection, where in most cases successful interventions can be made to control or eliminate the tumors altogether. FFD mammograms have been proven to be considerably more effective when used together with Computer-Aided Detection and Diagnosis [CADe] systems, which rely on algorithmic implementations of deep learning techniques in computer vision to carry out pattern recognition comparable to the level of a human radiologist, deciphering whether specific areas of interest in the mammogram scan image portray abnormalities and whether any such abnormalities are indicative of a benign or malignant tumor. Within this paper, we review emergent deep learning techniques that will prove relevant to the development of state-of-the-art FFD mammogram CADe systems. These techniques span self-supervised learning for context-encoded occlusion, self-supervised learning for pre-processing and labeling automation, and the creation of a standardized large-scale mammography dataset as a benchmark for the evaluation of CADe systems. Finally, comparisons are drawn between existing practices that pre-date these techniques and how the development of CADe systems that incorporate them will differ.

Keywords: breast cancer diagnosis, computer-aided detection and diagnosis, deep learning, whole mammogram classification, ultrasound classification, computer vision

Procedia PDF Downloads 93
586 Automatic Identification and Classification of Contaminated Biodegradable Plastics Using Machine Learning Algorithms and Hyperspectral Imaging Technology

Authors: Nutcha Taneepanichskul, Helen C. Hailes, Mark Miodownik

Abstract:

Plastic waste has emerged as a critical global environmental challenge, primarily driven by the prevalent use of conventional plastics, derived from petrochemical refining and manufacturing, in modern packaging. While these plastics serve vital functions, their persistence in the environment after disposal poses significant threats to ecosystems. Addressing this issue necessitates new approaches, one of which involves the development of biodegradable plastics designed to degrade under controlled conditions, such as industrial composting facilities. It is imperative to note that compostable plastics are engineered for degradation within specific environments and are not suited to uncontrolled settings, including natural landscapes and aquatic ecosystems. The full benefits of compostable packaging are realized when it is subjected to industrial composting, preventing environmental contamination and waste-stream pollution. Effective sorting technologies are therefore essential to raise composting rates for these materials and diminish the risk of contaminating recycling streams. In this study, hyperspectral imaging technology (HSI) is coupled with machine learning algorithms to accurately identify various types of plastics, encompassing conventional variants such as polyethylene terephthalate (PET), polypropylene (PP), low-density polyethylene (LDPE), and high-density polyethylene (HDPE), and biodegradable alternatives such as polybutylene adipate terephthalate (PBAT), polylactic acid (PLA), and polyhydroxyalkanoates (PHA). The dataset is partitioned into three subsets: a training dataset comprising uncontaminated conventional and biodegradable plastics, a validation dataset encompassing contaminated plastics of both types, and a testing dataset featuring real-world packaging items in both pristine and contaminated states. Five distinct machine learning algorithms, namely Partial Least Squares Discriminant Analysis (PLS-DA), Support Vector Machine (SVM), Convolutional Neural Network (CNN), logistic regression, and a decision tree, were developed and evaluated for their classification performance. The logistic regression and CNN models exhibited the most promising outcomes, achieving perfect accuracy (100%) on the training and validation datasets, while the testing dataset yielded an accuracy exceeding 80%. The successful implementation of this sorting technology within recycling and composting facilities holds the potential to significantly raise recycling and composting rates, helping to establish a circular economy for plastics and offering a viable solution to mitigate plastic pollution.
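
As a sketch of the spectral-classification step with one of the named algorithms (logistic regression), using synthetic stand-ins for the hyperspectral data; the band count and labels are illustrative assumptions:

```python
# Each sample is the reflectance spectrum of one pixel/object from the
# hyperspectral camera; the label is its polymer type.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

n_bands = 224                                    # hypothetical HSI band count
X = np.random.rand(600, n_bands)                 # stand-in spectra
y = np.random.choice(["PET", "PP", "PLA", "PBAT"], size=600)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
print(clf.score(X_te, y_te))                     # held-out accuracy
```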

Keywords: biodegradable plastics, sorting technology, hyperspectral imaging technology, machine learning algorithms

Procedia PDF Downloads 79
585 Exploring the Spatial Relationship between Built Environment and Ride-hailing Demand: Applying Street-Level Images

Authors: Jingjue Bao, Ye Li, Yujie Qi

Abstract:

The explosive growth of ride-hailing has reshaped residents' travel behavior, and ride-hailing plays a crucial role in urban mobility within the built environment. Contributing to research on the spatial variation of ride-hailing demand and its relationship to the built environment and socioeconomic factors, this study utilizes multi-source data from Haikou, China, to construct a Multi-scale Geographically Weighted Regression (MGWR) model that accounts for spatial-scale heterogeneity. The MGWR model demonstrated superior interpretability and reliability, with a 3.4% improvement in R² and a reduction in AIC from 4853 to 4787, compared with the Geographically Weighted Regression (GWR) model. Furthermore, to precisely characterize the surroundings of each sampling point, the DeepLabv3+ model is employed to segment street-level images. Features extracted from these images are incorporated as variables in the regression model, further enhancing its rationality and accuracy: R² improves by 7.78% compared with the MGWR model that considered only region-level variables. By integrating multi-scale geospatial data and advanced computer vision techniques, this study provides a comprehensive understanding of the spatial dynamics between ride-hailing demand and the urban built environment. The insights gained are expected to contribute significantly to urban transportation planning and policy making, as well as to ride-hailing platforms, facilitating the development of more efficient and effective mobility solutions in modern cities.
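
A sketch of how a DeepLabv3+ label map can be turned into street-level regression variables; the class IDs and the pixel-share features below are illustrative assumptions, not the study's variable set.

```python
# Turning a semantic-segmentation label map into per-image features:
# the share of pixels in each semantic class.
import numpy as np

VEGETATION, SKY, BUILDING = 1, 2, 3              # hypothetical class ids
label_map = np.random.randint(0, 4, (512, 512))  # stand-in for model output

def class_share(labels, cls):
    return float((labels == cls).mean())

features = {
    "green_view": class_share(label_map, VEGETATION),
    "sky_view": class_share(label_map, SKY),
    "building_share": class_share(label_map, BUILDING),
}
print(features)   # appended to region-level variables in the regression
```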

Keywords: travel behavior, ride-hailing, spatial relationship, built environment, street-level image

Procedia PDF Downloads 81
584 Comparative Ethnography and Urban Health: A Multisite Study on Obesogenic Cities

Authors: Carlos Rios Llamas

Abstract:

Urban health challenges, like the obesity epidemic, need to be studied through a dialogue between different disciplines and geographical conditions. Public health relies on quantitative analysis and local samples, but qualitative data and multisite analysis would help to better understand how obesity has become a health problem. In recent decades, obesity rates have increased in most countries, especially in the Western world. Concerned about the problem, the American Medical Association recently voted to recognize obesity as a disease. Suddenly, a 'war on obesity' attracted scientists from different disciplines to explore various ways to control and even reverse the trends. The medical sciences have taken the lead with quantitative methodologies focused on individual behaviors; only a few scientists have extended their studies to the environment where obesity is produced as a social risk, and fewer still have taken the political and cultural aspects into consideration. This paper presents a multisite ethnography in the South Bronx, USA; La Courneuve, France; and Lomas del Sur, Mexico, where obesity rates are as salient as urban degradation. Comparative ethnography offers a possibility to unveil the mechanisms producing health risks from the urban tissue. The analysis considers three main categories: 1) the built environment and access to food and physical activity, 2) the biocultural construction of the healthy body, and 3) urban inequalities related to health and body size. Major findings on obesogenic environments concern the anthropological values related to food and body image, as well as the multidimensional oppression experienced by fat people who live in stigmatized urban zones. In the end, obesity, like many other diseases, is the result of political and cultural constructions structured in urbanization processes.

Keywords: comparative ethnography, urban health, obesogenic cities, biopolitics

Procedia PDF Downloads 246
583 Bismuth Telluride Topological Insulator: Physical Vapor Transport vs Molecular Beam Epitaxy

Authors: Omar Concepcion, Osvaldo De Melo, Arturo Escobosa

Abstract:

Topological insulator (TI) materials are insulating in the bulk and conducting at the surface. The unique electronic properties associated with these surface states make them strong candidates for exploring innovative quantum phenomena and for practical applications in quantum computing, spintronics, and nanodevices. Many materials, including Bi₂Te₃, have been proposed as TIs, and in some cases this has been demonstrated experimentally by angle-resolved photoemission spectroscopy (ARPES), scanning tunneling microscopy (STM), and/or magnetotransport measurements. A clean surface is necessary in order to make any of these measurements. Several techniques have been used to produce films and different kinds of nanostructures; in-situ growth and characterization is usually the best option, although cleaving the films can be an alternative way to obtain a suitable surface. In the present work, we report a comparison of Bi₂Te₃ grown by physical vapor transport (PVT) and by molecular beam epitaxy (MBE). The samples were characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), atomic force microscopy (AFM), X-ray photoelectron spectroscopy (XPS), and ARPES. The Bi₂Te₃ samples grown by PVT were cleaved in ultra-high vacuum in order to obtain a surface free of contaminants. In both cases, the XRD shows c-axis orientation, and the pole diagrams prove the epitaxial relationship between film and substrate. The ARPES images show the linear dispersion characteristic of the surface states of TI materials. The samples grown by PVT, a relatively simple and cost-effective technique, show the same high quality and TI properties as those grown by MBE.

Keywords: Bismuth telluride, molecular beam epitaxy, physical vapor transport, topological insulator

Procedia PDF Downloads 192
582 Remote Sensing and GIS-Based Environmental Monitoring by Extracting Land Surface Temperature of Abbottabad, Pakistan

Authors: Malik Abid Hussain Khokhar, Muhammad Adnan Tahir, Hisham Bin Hafeez Awan

Abstract:

Continuous environmental deterioration and global climate change due to increasing land surface temperature (LST) have become vital phenomena nowadays. LST is rising because of increasing greenhouse gases in the atmosphere, which results in the melting of ice caps, ice sheets, and glaciers. This not only has adverse effects on the vegetation and water bodies of a region but also severely impacts monsoon areas in the form of capricious rainfall, monsoon failure, and extensive precipitation. The environment can be monitored with the help of various geographic information system (GIS) based algorithms, i.e., the single-channel (SC), dual-angle (DA), Mao, Sobrino, and split-window (SW) algorithms, and estimation of LST is entirely feasible through digital image processing of satellite imagery. This paper encompasses the extraction of the LST of Abbottabad over the last ten years using the SW technique of GIS and remote sensing, by means of Landsat 7 ETM+ (Enhanced Thematic Mapper Plus) and Landsat 8 imagery via their Thermal Infrared (TIR) and Operational Land Imager (OLI; Landsat 8 only) sensors, having 100 m TIR resolution and 30 m optical resolution. These sensors provide two TIR bands each; their emissivity and spectral radiance will be used as input statistics in the SW algorithm for LST extraction. Emissivity will be derived from Normalized Difference Vegetation Index (NDVI) threshold methods using bands 2-5 of OLI with the help of eCognition software, and spectral radiance will be extracted from the TIR bands (bands 10-11 of Landsat 8 and band 6 of Landsat 7 ETM+). The accuracy of the results will be evaluated against weather data as well. The ensuing research will play a significant role for all tiers of governing bodies concerned with climate change.
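
For reference, a generic split-window structure is sketched below; the coefficients are placeholders standing in for published SW calibration values, not results from this study.

```python
# Generic split-window structure (Jimenez-Munoz & Sobrino style) for LST
# from the two TIR brightness temperatures. Coefficients c0..c6 must be
# taken from the published calibration; the defaults here are placeholders.
import numpy as np

def split_window_lst(t10, t11, emis_mean, emis_diff, w,
                     c=(0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0)):
    """t10/t11: band 10/11 brightness temperatures (K); emis_mean/emis_diff:
    mean and difference of the band emissivities; w: water vapour (g/cm^2)."""
    dt = t10 - t11
    return (t10 + c[1] * dt + c[2] * dt ** 2 + c[0]
            + (c[3] + c[4] * w) * (1.0 - emis_mean)
            + (c[5] + c[6] * w) * emis_diff)

t10, t11 = np.array([300.2]), np.array([298.9])
print(split_window_lst(t10, t11, emis_mean=0.975, emis_diff=0.005, w=2.0))
```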

Keywords: environment, Landsat 8, SW algorithm, TIR

Procedia PDF Downloads 355
581 3D Human Face Reconstruction in Unstable Conditions

Authors: Xiaoyuan Suo

Abstract:

3D object reconstruction is a broad research area within the computer vision field, involving many stages and still-open problems. One of the existing challenges in this field lies with micromotion, such as facial expressions on the appearance of a human or animal face. Similar literature in this field focuses on 3D reconstruction under stable conditions, such as an existing image or photos taken in a rather static environment, while the purpose of this work is to discuss a flexible scan system using multiple cameras that can correctly reconstruct stable and moving 3D objects, in particular the human face with expressions. Further, a mathematical model is proposed at the end of this paper to automate the 3D object reconstruction process. The reconstruction process takes several stages. First, a set of simple 2D lines is projected onto the object, and hence a set of uneven curvy lines is obtained, which represents the 3D numerical data of the surface. The lines and their shapes help to identify the object's 3D construction in pixels. With the two recorded angles and their distance from the camera, a simple mathematical calculation gives the resulting coordinate of each projected line in absolute 3D space. This proposed research will benefit many practical areas, including but not limited to biometric identification, authentication, cybersecurity, preservation of cultural heritage, drama acting (especially with rapid and complex facial gestures), and many others. Specifically, this work will (I) provide a brief survey of comparable techniques existing in this field; (II) discuss a set of specialized methodologies or algorithms for effective reconstruction of 3D objects; (III) implement and test the developed methodologies; (IV) verify findings with data collected from experiments; and (V) conclude with lessons learned and final thoughts.
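
The triangulation step can be made concrete with a short sketch. This is a hypothetical, simplified planar version of the calculation described above: two cameras a known baseline apart each record the angle to the same point on a projected line, and intersecting the two rays yields the point's coordinates. The function name and the coplanar-ray geometry are illustrative assumptions, not the authors' implementation.

```python
import math

def triangulate(baseline, angle_left, angle_right):
    """Intersect two coplanar rays: cameras at (0, 0) and (baseline, 0),
    each angle measured from the baseline toward the object point.
    Returns the (x, z) coordinates of the point."""
    tl, tr = math.tan(angle_left), math.tan(angle_right)
    z = baseline * tl * tr / (tl + tr)   # depth from the baseline
    x = z / tl                           # lateral position
    return x, z

# A point seen at 60 degrees from both ends of a 1 m baseline sits midway:
print(triangulate(1.0, math.radians(60), math.radians(60)))  # (0.5, ~0.866)
```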

Keywords: 3D photogrammetry, 3D object reconstruction, facial expression recognition, facial recognition

Procedia PDF Downloads 150
580 A Review of the Accuracy of Optical Surface Imaging Systems for Setup Verification during Breast Radiotherapy Treatment

Authors: Auwal Abubakar, Ahmed Ahidjo, Shazril Imran Shaukat, Noor Khairiah A. Karim, Gokula Kumar Appalanaido, Hafiz Mohd Zin

Abstract:

Background: The use of optical surface imaging systems (OSISs) is becoming increasingly popular in radiotherapy practice, especially during breast cancer treatment. This study reviews the accuracy of the available commercial OSISs for breast radiotherapy. Method: A literature search was conducted to identify the available commercial OSISs from different manufacturers that are integrated into radiotherapy practice for setup verification during breast radiotherapy. Studies that evaluated the accuracy of the OSISs during breast radiotherapy using cone beam computed tomography (CBCT) as a reference were retrieved and analyzed. The physics and working principles of the systems from each manufacturer are discussed, together with their respective strengths and limitations. Results: A total of five (5) different commercially available OSISs from four (4) manufacturers were identified, each with a different working principle. Six (6) studies were found to evaluate the accuracy of the systems during breast radiotherapy in conjunction with CBCT as a gold standard. The studies revealed that the accuracy of the systems, in terms of mean difference, ranges from 0.1 to 2.1 mm. The correlation between CBCT and OSIS ranges between 0.4 and 0.9. The limits of agreement obtained using Bland-Altman analysis in the studies were also within an acceptable range. Conclusion: The OSISs have an acceptable level of accuracy and can be used safely during breast radiotherapy. The systems are non-invasive, free of ionizing radiation, and provide real-time imaging of the target surface at no extra concomitant imaging dose. However, the systems should only be used to complement rather than replace x-ray-based image guidance techniques such as CBCT.
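
For concreteness, a minimal sketch of the agreement statistics reported in the reviewed studies (mean difference, Pearson correlation, and Bland-Altman limits of agreement between OSIS and CBCT setup shifts) is given below; the paired shifts in the example are made-up values, not data from the studies.

```python
import numpy as np

def agreement_stats(osis_mm, cbct_mm):
    """Mean difference, Bland-Altman 95% limits of agreement, and Pearson
    correlation between OSIS and CBCT setup shifts (mm)."""
    osis_mm = np.asarray(osis_mm, dtype=float)
    cbct_mm = np.asarray(cbct_mm, dtype=float)
    diff = osis_mm - cbct_mm
    bias = diff.mean()                        # systematic offset (mean difference)
    half = 1.96 * diff.std(ddof=1)            # half-width of the LoA band
    r = np.corrcoef(osis_mm, cbct_mm)[0, 1]   # Pearson correlation
    return bias, (bias - half, bias + half), r

# Made-up paired shifts (mm) for one translation axis:
bias, loa, r = agreement_stats([1.2, 0.4, -0.3, 0.9, 1.5],
                               [1.0, 0.1, -0.6, 1.1, 1.2])
print(f"bias={bias:.2f} mm, LoA={loa[0]:.2f}..{loa[1]:.2f} mm, r={r:.2f}")
```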

Keywords: optical surface imaging system, cone beam computed tomography (CBCT), surface-guided radiotherapy, breast radiotherapy

Procedia PDF Downloads 66
579 European Commission Radioactivity Environmental Monitoring Database REMdb: A Law (Art. 36 Euratom Treaty) Transformed into Environmental Science Opportunities

Authors: M. Marín-Ferrer, M. A. Hernández, T. Tollefsen, S. Vanzo, E. Nweke, P. V. Tognoli, M. De Cort

Abstract:

Under the terms of Article 36 of the Euratom Treaty, European Union Member States (MSs) shall periodically communicate to the European Commission (EC) information on environmental radioactivity levels. Compilations of the information received have been published by the EC as a series of reports beginning in the early 1960s. The environmental radioactivity results received from the MSs have been introduced into the Radioactivity Environmental Monitoring database (REMdb) of the Institute for Transuranium Elements of the EC Joint Research Centre (JRC), located in Ispra (Italy), as part of its Directorate-General for Energy (DG ENER) support programme. The REMdb offers the scientific community dealing with environmental radioactivity topics endless research opportunities to exploit the nearly 200 million records received from MSs, containing information on radioactivity levels in milk, water, air and mixed diet. The REM action was created shortly after the Chernobyl crisis to support the EC in its responsibility to provide qualified information to the European Parliament and the MSs on the levels of radioactive contamination of the various compartments of the environment (air, water, soil). Hence, the main line of REM's activities concerns the improvement of procedures for the collection of environmental radioactivity concentrations under routine and emergency conditions, as well as making this information available to the general public. In this way, REM ensures the availability of tools for inter-communication and access to this information by users from the Member States and the other European countries. Specific attention is given to further integrating the new MSs with the existing information exchange systems and to assisting Candidate Countries in fulfilling these obligations in view of their membership of the EU. Article 36 of the Euratom Treaty requires the competent authorities of each MS to regularly provide the environmental radioactivity monitoring data resulting from their Article 35 obligations to the EC, in order to keep the EC informed of the levels of radioactivity in the environment (air, water, milk and mixed diet) which could affect the population. The REMdb has two main objectives: to keep a historical record of radiological accidents for further scientific study, and to collect the environmental radioactivity data gathered through the national environmental monitoring programs of the MSs in order to prepare comprehensive annual monitoring reports (MRs). The JRC continues its activity of collecting, assembling, analyzing and providing this information to the public and the MSs, even during emergency situations. In addition, there is growing concern among the general public about radioactivity levels in the terrestrial and marine environment, as well as about the potential risk of future nuclear accidents. In this context, clear and transparent communication with the public is needed. EURDEP (European Radiological Data Exchange Platform) is both a standard format for radiological data and a network for the exchange of automatic monitoring data. The latest release of the format is version 2.0, which has been in use since the beginning of 2002.

Keywords: environmental radioactivity, Euratom, monitoring report, REMdb

Procedia PDF Downloads 443
578 Crop Leaf Area Index (LAI) Inversion and Scale Effect Analysis from Unmanned Aerial Vehicle (UAV)-Based Hyperspectral Data

Authors: Xiaohua Zhu, Lingling Ma, Yongguang Zhao

Abstract:

Leaf area index (LAI) is a key structural characteristic of crops and plays a significant role in precision agricultural management and farmland ecosystem modeling. However, LAI retrieved from data of different resolutions contains a scaling bias due to spatial heterogeneity and model non-linearity; that is, a scale effect arises during multi-scale LAI estimation. In this article, a typical farmland in a semi-arid region of Inner Mongolia, China, is taken as the study area. Based on the combination of the PROSPECT and SAIL models, a multi-dimensional look-up table (LUT) is generated for LAI estimation of multiple crops from unmanned aerial vehicle (UAV) hyperspectral data. Based on the Taylor expansion method and a computational geometry model, a scale transfer model considering both inter-class and intra-class differences is constructed for scale effect analysis of LAI inversion over inhomogeneous surfaces. The results indicate that: (1) the LUT method based on classification and parameter sensitivity analysis is useful for LAI retrieval of corn, potato, sunflower and melon on the typical farmland, with a correlation coefficient R² of 0.82 and a root mean square error (RMSE) of 0.43 m²/m². (2) The scale effect of LAI becomes more obvious as image resolution decreases, with a maximum scale bias of more than 45%. (3) The inter-class scale effect is higher than the intra-class one, and it can be corrected efficiently by the scale transfer model established based on Taylor expansion and computational geometry. After correction, the maximum scale bias can be reduced to 1.2%.
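
The core of the LUT inversion can be sketched as follows, assuming a pre-computed table of PROSPECT+SAIL simulated spectra with associated LAI values. The per-class, multi-dimensional structure of the paper's LUT is reduced here to its essential nearest-spectrum search, and the random table is a stand-in, not the actual simulations.

```python
import numpy as np

def invert_lai(pixel_spectrum, lut_spectra, lut_lai, k=10):
    """Return the mean LAI of the k LUT entries whose simulated spectra
    best match the observed spectrum (smallest spectral RMSE)."""
    rmse = np.sqrt(((lut_spectra - pixel_spectrum) ** 2).mean(axis=1))
    best = np.argsort(rmse)[:k]        # k best-matching candidate entries
    return lut_lai[best].mean()

# Made-up stand-in for a PROSPECT+SAIL table: 500 spectra x 50 bands.
rng = np.random.default_rng(0)
lut_spectra = rng.uniform(0.0, 0.6, size=(500, 50))
lut_lai = rng.uniform(0.0, 6.0, size=500)
print(invert_lai(lut_spectra[42] + 0.01, lut_spectra, lut_lai))
```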

Keywords: leaf area index (LAI), scale effect, UAV-based hyperspectral data, look-up-table (LUT), remote sensing

Procedia PDF Downloads 440
577 Experimental Study of Unconfined and Confined Isothermal Swirling Jets

Authors: Rohit Sharma, Fabio Cozzi

Abstract:

A 3C-2D PIV technique was applied to investigate the swirling flow generated by an axial-plus-tangential type swirl generator. This work focuses on the near-exit region of an isothermal swirling jet to characterize the effect of swirl on the flow field and to identify the large coherent structures, both in unconfined and confined conditions, for a geometrical swirl number Sg = 4.6. The effects of the Reynolds number on the flow structure were also studied. The experimental results show significant effects of the confinement on the mean velocity fields and their fluctuations. The size of the recirculation zone was significantly enlarged upon confinement compared to the free swirling jet. Increasing the Reynolds number further enlarged the recirculation zone. The frequency characteristics were measured with a capacitive microphone, which indicates the presence of a periodic oscillation related to the existence of a precessing vortex core (PVC). Proper orthogonal decomposition (POD) of the jet velocity field was carried out, enabling the identification of coherent structures. The time coefficients of the two most energetic POD modes were used to reconstruct the phase-averaged velocity field of the oscillatory motion in the swirling flow. The instantaneous minima of negative swirl strength values calculated from the instantaneous velocity field revealed the presence of two helical structures located in the inner and outer shear layers; these structures fade out at an axial location of approximately z/D = 1.5 for the unconfined case and z/D = 1.2 for the confined case. By phase averaging the instantaneous swirling strength maps, the 3D helical vortex structure was reconstructed.
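
A minimal sketch of the snapshot POD and the phase-angle construction described above is given below; the array shapes, the random stand-in data, and the SVD-based formulation are illustrative assumptions rather than the authors' processing code.

```python
import numpy as np

def snapshot_pod(snapshots):
    """POD via SVD of the mean-subtracted snapshot matrix
    (shape: n_snapshots x n_grid_points). Returns the time coefficients
    a_k(t), the spatial modes, and the fluctuation-energy fraction per mode."""
    fluct = snapshots - snapshots.mean(axis=0)       # subtract the mean field
    u, s, modes = np.linalg.svd(fluct, full_matrices=False)
    energy = s**2 / (s**2).sum()                     # modal energy fractions
    return u * s, modes, energy                      # a[:, k] = k-th coefficient

def pvc_phase(a1, a2):
    """Oscillation phase angle from the first two POD time coefficients,
    used to bin snapshots for phase averaging."""
    return np.arctan2(a2, a1)

# Made-up example: 200 snapshots of a 1000-point velocity field.
rng = np.random.default_rng(1)
snaps = rng.standard_normal((200, 1000))
a, modes, energy = snapshot_pod(snaps)
phase = pvc_phase(a[:, 0], a[:, 1])
print(energy[:2], phase.shape)
```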

Keywords: acoustic probes, 3C-2D particle image velocimetry (PIV), precessing vortex core (PVC), recirculation zone (RZ)

Procedia PDF Downloads 233