433 The Nimbārka School of Vedānta and the Indian Classical Dance: The Philosophical Relevance through Rasa Theory
Authors: Shubham Arora
Abstract:
This paper illustrates the relationship between the Dvaitādvaita (dualistic non-dualistic) doctrine of the Nimbārka school of Vedānta and the philosophy of Indian classical dance, through the Rasa theory. It focuses separately on the philosophies of both disciplines and then analyzes Rasa theory as a connection between them. The paper presents ideas regarding the similarity between the Brahman and the dancer, the manifestation of the enacted character and the Jīva (soul), the existence of the phenomenal world and the imaginary world, the classification of rasa on the basis of the three modes of nature, and the feelings and expressions depicting Dvaita and Advaita. The topic was chosen with the intention of exploring the practical relevance of the Vedantic philosophy of this school. Studying the practical implications and relevance of the doctrine in relation to other disciplines is important for perceiving it cogently. In our daily lives, we use various forms of facial expressions and bodily gestures to communicate, along with oral and written means of communication. What happens when gestures and expressions mingle with musical beats to present an idea? Indian classical dance is highly rich in expressing emotions through extraordinary expressions, unconventional bodily gestures, and mesmerizing musical beats. Ancient scriptures such as the Nāṭyaśāstra of Bharata Muni and the Abhinavabhāratī of Abhinavagupta recount aesthetics in a well-defined and structured way of acting and dancing and also reveal the grammar of Rasa theory. Indian classical dance is not only for entertainment but is deeply in contact with divinity. During the Bhakti movement in India, this art form was used to narrate vignettes from epics such as the Rāmāyana and Mahābhārata and from the Purānas. Even in the present era, this art has a deep-rooted philosophy within it.
Keywords: Advaita, Brahman, Dvaita, Jiva, Nimbarka, Rasa, Vedanta
432 Effectiveness of Cold Calling on Students’ Behavior and Participation during Class Discussions: Punishment or Opportunity to Shine
Authors: Maimuna Akram, Khadija Zia, Sohaib Naseer
Abstract:
Pedagogical objectives and the nature of the course content may lead instructors to take varied approaches to selecting a student for a cold call, particularly in a studio setup where students work independently on different projects and present work in progress at scheduled critiques. Cold calling often proves to be an effective tool for eliciting a response without imposing judgment on the recipients. While students who are cold-called exhibit a mixed range of behaviors, and their responses range from anxiety-ridden to inspired, there is a need for a greater understanding of how to use these exchanges to bring about fruitful and engaging studio discussions. This study aims to unravel the dimensions of the cold-call approach as a didactic exchange within studio pedagogy. A questionnaire survey was conducted in an undergraduate class at an arts and design school. The impact of cold calling on students’ participation was determined through various parameters, including course choice, participation frequency, students’ comfort, and teaching methodology. After the surveys were analyzed, selected classroom teachers were interviewed to provide a qualitative faculty perspective. It was concluded that cold calling increases both students’ participation frequency and their preparation for class. Around 67% of students responded that teaching methods play an important role in learning activities and in students’ participation during class discussions, and 84% of participants agreed that cold calling is an effective way of learning. The findings indicate that cold calling can be used frequently without making students uncomfortable. As a result, this study supports the use of this instructional method to encourage more students to participate in class discussions.
Keywords: active learning, class discussion, class participation, cold calling, pedagogical methods, student engagement
431 Lung Cancer Detection and Multi Level Classification Using Discrete Wavelet Transform Approach
Authors: V. Veeraprathap, G. S. Harish, G. Narendra Kumar
Abstract:
Uncontrolled growth of abnormal cells in the lung in the form of a tumor can be either benign (non-cancerous) or malignant (cancerous). Patients with Lung Cancer (LC) have an average life expectancy of five years; early diagnosis, detection, and prediction reduce the need for risky invasive surgery and increase the survival rate. Computed Tomography (CT), Positron Emission Tomography (PET), and Magnetic Resonance Imaging (MRI) are commonly used for early detection of cancer. A Gaussian filter along with a median filter is used for smoothing and noise removal, and Histogram Equalization (HE) is used for image enhancement. The lung cavities are extracted, the background other than the two lung cavities is completely removed, and the right and left lungs are segmented separately. Region property measurements (area, perimeter, diameter, centroid, and eccentricity) are taken from the segmented tumor image, while texture is characterized by Gray-Level Co-occurrence Matrix (GLCM) functions; the extracted features define the Region of Interest (ROI) given as input to the classifiers. Two levels of classification are employed: K-Nearest Neighbor (KNN) determines whether the patient’s condition is normal or abnormal, while an Artificial Neural Network (ANN) identifies the cancer stage. The Discrete Wavelet Transform (DWT) algorithm is used for the main feature extraction, leading to the best efficiency. The developed technology shows encouraging results for real-time information and online detection in future research.
Keywords: artificial neural networks, ANN, discrete wavelet transform, DWT, gray-level co-occurrence matrix, GLCM, k-nearest neighbor, KNN, region of interest, ROI
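The GLCM texture features and KNN decision described above can be sketched in a few lines of NumPy. The toy textures, the single horizontal co-occurrence offset, and the small feature set (contrast, energy, homogeneity) below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def glcm(img, levels=4):
    # symmetric, normalized co-occurrence matrix for the horizontal offset (0, 1)
    m = np.zeros((levels, levels))
    for a, b in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        m[a, b] += 1
        m[b, a] += 1
    return m / m.sum()

def glcm_features(m):
    # three classic Haralick-style texture descriptors
    i, j = np.indices(m.shape)
    contrast = ((i - j) ** 2 * m).sum()
    energy = (m ** 2).sum()
    homogeneity = (m / (1 + np.abs(i - j))).sum()
    return np.array([contrast, energy, homogeneity])

def knn_predict(x, X_train, y_train, k=3):
    # majority vote among the k nearest training feature vectors
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    return np.bincount(nearest).argmax()
```

A uniform patch and a checkerboard patch yield clearly separated feature vectors, which is all the KNN stage needs for a normal/abnormal decision.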
430 Impacts of Urbanization on Forest and Agriculture Areas in Savannakhet Province, Lao People's Democratic Republic
Authors: Chittana Phompila
Abstract:
The current population increase pushes up demands for natural resources and living space. In Laos, urban areas have been expanding rapidly in recent years, and this rapid urbanization can have negative impacts on landscapes, including forest and agricultural land. The primary objectives of this research were 1) to map current urban areas in a large city in Savannakhet province, Laos, 2) to compare changes in urbanization between 1990 and 2018, and 3) to estimate the forest and agricultural areas lost to urban expansion over this period of nearly three decades within the study area. Landsat 8 data were used, and existing GIS data were collected, including spatial data on rivers, lakes, roads, vegetated areas, and other land use/land cover classes; the GIS data were obtained from government sectors. An object-based classification (OBC) approach was applied in eCognition for image processing and analysis of the urban area. Historical data from other Landsat instruments (Landsat 5 and 7) allowed us to compare urbanization in 1990, 2000, 2010, and 2018 in the study area. Only three main land cover classes were classified: forest, agriculture, and urban areas. A change detection approach was applied to illustrate changes in built-up areas over these periods. Our study shows an overall map accuracy of 95% (kappa ≈ 0.8). It is found that there is ineffective control over land-use conversion from forest and agriculture to urban areas in many main cities across the province, and a large area of agriculture and forest has been lost to this conversion. Uncontrolled urban expansion and inappropriate land use planning can put pressure on resource utilization and, in the long term, lead to food insecurity and national economic downturn.
Keywords: urbanisation, forest cover, agriculture areas, Landsat 8 imagery
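The reported overall accuracy and kappa can be derived directly from a classification error (confusion) matrix; the matrix values below are made up for illustration and are not the study's actual error matrix:

```python
import numpy as np

def accuracy_and_kappa(cm):
    # cm: confusion matrix, rows = reference classes, columns = predicted classes
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                # observed agreement (overall accuracy)
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # agreement expected by chance
    return po, (po - pe) / (1 - pe)                      # Cohen's kappa
```

For a three-class matrix (forest, agriculture, urban), an overall accuracy of 0.90 can correspond to a kappa near 0.85, since kappa discounts chance agreement.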
429 Disclosure Extension of Oil and Gas Reserve Quantum
Authors: Ali Alsawayeh, Ibrahim Eldanfour
Abstract:
This paper examines the extent of disclosure of oil and gas reserve quantum in the annual reports of international oil and gas exploration and production companies, particularly companies in untested international markets, such as Canada, the UK, and the US, and seeks to determine the underlying factors that affect the level of disclosure of oil reserve quantum. The study is concerned with the usefulness of disclosure of oil and gas reserve quantum to investors and other users. Given the primacy of the annual report (10-K) as a source of supplemental reserves data about the company and as the channel through which companies disseminate information about their performance, the annual reports for one year (2009) were the central focus of the study. This comparative study seeks to establish whether differences exist between the sample companies, based on the new disclosure requirements of the Securities and Exchange Commission (SEC) in respect of reserves classification and definition. The extent of reserve disclosure is presented and compared among the selected companies, and statistical analysis is performed to determine whether any differences exist in the extent of disclosure under the determinant variables. The study shows that several factors affect the extent of disclosure of reserve quantum in the above-mentioned countries, namely company size, leverage, and auditor quality: companies that disclose reserve quantum in detail tend to be larger, the level of leverage affects companies’ reserve quantum disclosure, and companies that provide detailed reserve quantum disclosure tend to employ a high-quality auditor. In addition, the study found a further significant independent variable, Profit Sharing Contracts (PSC); this factor could explain variations in the level of disclosure of oil reserve quantum between contractors and host governments.
The implementation of the SEC oil and gas reporting requirements does not enhance companies’ valuation because the new rules are based only on past and present reserves information (proven reserves); hence, the market lacks a basis for the future valuation of oil and gas companies.
Keywords: comparison, company characteristics, disclosure, reserve quantum, regulation
428 Balanced Scorecard (BSC) Project: A Methodological Proposal for Decision Support in a Corporate Scenario
Authors: David de Oliveira Costa, Miguel Ângelo Lellis Moreira, Carlos Francisco Simões Gomes, Daniel Augusto de Moura Pereira, Marcos dos Santos
Abstract:
Strategic management is a fundamental process for global companies that intend to remain competitive in an increasingly dynamic and complex market. To do so, it is necessary to maintain alignment with their principles and values. The Balanced Scorecard (BSC) proposes to ensure that overall business performance is assessed from different perspectives (financial, customer, internal processes, and learning and growth). However, relying solely on the BSC may not be enough to ensure the success of strategic management: companies must also evaluate and prioritize the strategic projects to be implemented, ensuring that these are aligned with the business vision and contribute to achieving the established goals and objectives. In this context, the proposition involves incorporating the SAPEVO-M multicriteria method to indicate the degree of relevance of the different perspectives, so that the strategic objectives linked to these perspectives carry greater weight in the classification of structural projects. Additionally, it is proposed to apply the Impact & Probability Matrix (I&PM) concept to structure the evaluation and ensure that strategic projects are assessed according to their relevance and impact on the business. Structuring the business's strategic management in this way ensures the alignment and prioritization of projects and actions related to strategic planning, so that resources are directed towards the most relevant and impactful initiatives. Therefore, the objective of this article is to present a proposal integrating the BSC methodology, the SAPEVO-M multicriteria method, and the prioritization matrix to establish a concrete weighting of strategic planning and achieve coherence in defining strategic projects aligned with the business vision, thereby supporting a robust decision-making process.
Keywords: MCDA process, prioritization problematic, corporate strategy, multicriteria method
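As a rough illustration of the final prioritization step, a weighted aggregation of project scores under perspective weights might look like the sketch below. Note that SAPEVO-M itself derives weights from ordinal pairwise comparisons rather than taking them as given; the weights, perspectives, and projects here are invented placeholders:

```python
def prioritize(projects, weights):
    # projects: {name: {perspective: score}}; weights: {perspective: weight}
    # returns project names ordered by weighted score, highest priority first
    scored = {name: sum(weights[p] * s for p, s in scores.items())
              for name, scores in projects.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

In a full pipeline, the weights would come from the multicriteria method and each project's scores from the impact/probability assessment.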
427 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data
Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos
Abstract:
Introduction: Although low birth weight (LBW) accounts for the largest share of neonatal mortality and morbidity, predicting births with LBW so that interventions can be prepared is challenging. This study aims to predict LBW using a dataset of 53,637 birth cohorts collected from 36 primary hospitals across seven regions in Ethiopia from February 2022 to June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics (maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of live births, number of stillbirths, antenatal care frequency, and sex of the fetus) to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, outlier detection, and ensuring data integrity in the birth weight records. Model performance was evaluated with 10-fold cross-validation through metrics such as accuracy, precision, recall, F1-score, and the area under the Receiver Operating Characteristic curve (ROC AUC). Results: The decision tree, J48, logistic regression, and gradient boosted trees models achieved the highest accuracy (94.5% to 94.6%), with a precision of 93.1% to 93.3%, an F1-score of 92.7% to 93.1%, and a ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall rates achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions to achieve the Sustainable Development Goal (SDG) targets related to neonatal mortality.
Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia
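As a minimal sketch of one of the compared model families, a logistic regression classifier can be fitted by plain gradient descent. This NumPy version on synthetic two-feature data is illustrative only and does not reproduce the WEKA pipeline or the Ethiopian dataset:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=500):
    # gradient descent on the mean cross-entropy loss; X should be standardized
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probability of the positive class
        g = p - y                                # gradient of the loss w.r.t. the logits
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def predict(X, w, b, thresh=0.5):
    return (1.0 / (1.0 + np.exp(-(X @ w + b))) >= thresh).astype(int)
```

In practice the threshold would be tuned for recall, since missing an at-risk newborn is costlier than a false alarm.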
426 Automatic Lexicon Generation for Domain Specific Dataset for Mining Public Opinion on China Pakistan Economic Corridor
Authors: Tayyaba Azim, Bibi Amina
Abstract:
The increasing popularity of opinion mining, together with the rapid growth of social networks, has created many research opportunities across the domains of Sentiment Analysis and Natural Language Processing (NLP) using Artificial Intelligence approaches. The latest trend allows the public to actively use the internet to analyze individuals’ opinions and explore the effectiveness of published facts. The main theme of this research is to assess public opinion on one of the most crucial and extensively discussed development projects, the China-Pakistan Economic Corridor (CPEC), considered a game changer due to its promise of bringing economic prosperity to the region. To the best of our knowledge, the theme of CPEC has not so far been analyzed for sentiment determination through machine learning (ML) approaches. This research demonstrates the use of ML approaches to automatically analyze public sentiment in Twitter tweets about CPEC. A Support Vector Machine (SVM) is used for the classification task, classifying tweets into positive, negative, and neutral classes. Word2vec and TF-IDF features are used with the SVM model, and the model trained on manually labelled tweets is compared with one trained on an automatically generated lexicon. The contributions of this work are: the development of a sentiment analysis system for public tweets on the CPEC subject, the automatic generation of a lexicon from public tweets on CPEC, and the identification of different themes among the tweets with sentiments assigned to each theme. It is worth noting that applications of web mining that empower e-democracy by improving political transparency and public participation in decision making via social media have not yet been explored and practised in the Pakistan region with respect to CPEC.
Keywords: machine learning, natural language processing, sentiment analysis, support vector machine, Word2vec
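The TF-IDF weighting used as an input feature can be sketched without any external library; the two-tweet corpus below is a made-up example, and a real pipeline would feed these vectors to the SVM:

```python
import math
from collections import Counter

def tfidf(docs):
    # docs: list of token lists -> list of {term: tf-idf weight} dictionaries
    n = len(docs)
    df = Counter(term for doc in docs for term in set(doc))  # document frequency per term
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        # term frequency scaled by inverse document frequency
        vectors.append({t: (c / len(doc)) * math.log(n / df[t]) for t, c in tf.items()})
    return vectors
```

A term that appears in every tweet (such as the query term itself) gets weight zero, which is exactly why TF-IDF highlights the discriminative sentiment-bearing words.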
425 The Types of Annuities with Flexible Premium
Authors: Deniz Ünal Özpalamutcu, Burcu Altman
Abstract:
Actuarial science uses mathematics, statistics, and financial theory to analyze the financial impacts of uncertainty, risk, insurance, and pension-related issues. In other words, it deals with the likelihood of potential risks, their financial impacts, and especially the corresponding financial measures. Handling these measures requires long-term payments and investments, so it is inevitable to plan periodic payments at equal time intervals while also considering the changing value of money over time. A series of payments made at specific intervals of time is called an annuity (or rant). In the literature, annuities are classified based on start and end dates, start times, payment times, and payment amounts or frequency. The classification based on payment amounts distinguishes constant, decreasing, and increasing payment methods. The literature on handling such annuities is very limited, yet in daily life, and especially in today's world where economic issues have gained prominence, it is crucial to use variable annuity methods in line with customers' demands. In this study, the types of annuities with flexible payments are discussed. Specifically, we focus on annuities in which each period's payment is obtained by increasing the previous period's payment by a certain percentage. In studying this problem, formulas were derived for the cash (present) value and the accumulated value, considering payments at both the start and the end of each period. The problem of annuities (rants) whose payments grow each period at rate r relative to the previous period was analyzed: cash value and accumulated value calculations were studied separately for period-start and period-end payments, and their relations were expressed by formulas.
Keywords: actuaria, annuity, flexible payment, rant
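For the period-end (annuity-immediate) case, the cash value of an annuity whose payments grow at rate g per period under interest rate i has a closed form; the sketch below checks it against a brute-force sum of discounted payments. The variable names are ours, not the paper's notation:

```python
def growing_annuity_pv(P, i, g, n):
    # present value of n period-end payments P, P(1+g), ..., P(1+g)^(n-1)
    # discounted at interest rate i per period
    if abs(i - g) < 1e-12:
        return n * P / (1 + i)  # limiting case when the growth rate equals the interest rate
    return P * (1 - ((1 + g) / (1 + i)) ** n) / (i - g)

def growing_annuity_av(P, i, g, n):
    # accumulated value at time n is the present value carried forward n periods
    return growing_annuity_pv(P, i, g, n) * (1 + i) ** n
```

The period-start (annuity-due) variants simply multiply both values by (1 + i), since every payment is shifted one period earlier.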
424 A Study on Exploring Employees’ Well-Being in Gaming Workplaces Prior to and after the Chinese Government Crackdowns on Corruption
Authors: Ying Chuan Wang, Zhang Tao
Abstract:
This article explores differences in the well-being of casino hotel employees before and after the Chinese government began to fight corruption. The researcher also attempted to find out the relationship between the work pressure and the well-being of employees in gambling workplaces before and after the government's corruption crackdown. Well-being was analyzed in three categories: life well-being, workplace well-being, and psychological well-being. In addition, a psychological pressure classification was applied, and the Job Content Questionnaire (JCQ) was adopted to investigate employees’ work pressure in terms of decision latitude, psychological demands, and workplace support. This quantitative study was conducted in March 2017 using purposive sampling. A total of 339 valid responses were collected from casino hotel employees. The findings showed that decision latitude differed significantly before and after the crackdown on corruption. Moreover, workplace support was strongly related to employees’ well-being before the crackdown, whereas decision latitude was strongly related to employees’ well-being after the crackdown. The findings suggest that employees’ work pressure affects their well-being. In particular, workplace support may alleviate employees’ work pressure and shape their perceptions of well-being, but only prior to the crackdown; decision latitude became the essential factor affecting well-being after the crackdown. It is hoped that these findings provide suggestions to the managerial levels of hospitality industries: it is important to enhance employees’ decision latitude.
Offering training courses to strengthen employees’ skills could be a possible way to reduce work pressure. In addition, establishing a career path for employees to pursue is essential for their self-development and for the improvement of well-being. This would be crucial for casino hotels’ sustainable development and for strengthening their competitiveness.
Keywords: well-being, work pressure, casino hotel employees, gaming workplace
423 Analysis of Buddhist Rock Carvings in Diamer Basha Dam Reservoir Area, Gilgit-Baltistan, Pakistan
Authors: Abdul Ghani Khan
Abstract:
This paper focuses on the Buddhist rock carvings in the Diamer-Basha reservoir area of Gilgit-Baltistan, perhaps the largest rock art province in the world. The study region has thousands of rock carvings, particularly stupa carvings, engraved by artists, devotees, pilgrims, or merchants who left their marks on the landscape or sought to propagate Buddhism. The Pak-German Archaeological Mission prepared, documented, and published extensive catalogues of these carvings, though to date very little systematic or statistically driven analysis has been undertaken for an in-depth understanding of the Buddhist rock carving tradition of the study region. This paper examines stupa carvings and their constituent parts from five selected sites, namely Oshibat, Shing Nala, Gichi Nala, Dadam Das, and Chilas Bridge. The statistical analyses and classification of the stupa carvings and their chronological contexts were carried out with the help of modern software tools such as STATA, FileMaker Pro, and MapSource. The study found that the tradition of stupa carving on the rock surfaces at the five selected sites continued for around 900 years, from the 1st century BCE to the 8th century CE. There is variation within the chronological settings of each selected site, possibly shaped by their use within particular political landscapes (for example, changes in political administration or warfare) and geographical landscapes (for example, shifting routes). The long persistence of the stupa carving tradition at these specific locations also indicates their central position on trade and communication routes, possibly linked with the religious ideologies of their particular times.
The analyses of the different architectural elements of the stupa carvings in the study area show that this tradition had structural similarities and differences in temporal and spatial contexts.
Keywords: rock carvings, stupa, stupa carvings, Buddhism, Pak-German archaeological mission
422 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia
Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman
Abstract:
Advances in pavement design have produced the Mechanistic-Empirical Pavement Design Guide (MEPDG). The road and highway network in Saudi Arabia has been evolving as a result of increasing traffic volume, and the MEPDG is currently implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementation of the MEPDG for local pavement design requires calibration of its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare the data for calibration of the MEPDG in central Saudi Arabia. The first goal is therefore the collection of flexible pavement design data from the local conditions of the Riyadh region; since the collected data must be converted into design inputs, the main goal of this paper is the analysis of the collected data. The data analysis includes processing of truck classification, the traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), the Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axles of each type (single, tandem, and tridem) per truck class, the cloud cover percentage, and the road sections selected for local calibration. Detailed descriptions of the input parameters are given, providing an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh
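Two of the traffic inputs above, the growth-factor projection of AADTT and the vehicle class distribution, reduce to simple arithmetic. The sketch below uses invented figures, not Riyadh data, and the class labels are placeholders:

```python
def project_aadtt(base_aadtt, growth_rate, years, compound=True):
    # design-year AADTT from a base-year truck count and an annual growth rate
    if compound:
        return base_aadtt * (1 + growth_rate) ** years
    return base_aadtt * (1 + growth_rate * years)  # linear growth alternative

def vehicle_class_distribution(truck_counts):
    # VCD: each truck class's share of total truck traffic, in percent
    total = sum(truck_counts.values())
    return {cls: 100.0 * n / total for cls, n in truck_counts.items()}
```

The choice between compound and linear growth is itself a design input, since it materially changes the design-year traffic over a 20-year analysis period.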
421 Land Use Land Cover Changes in Response to Urban Sprawl within North-West Anatolia, Turkey
Authors: Melis Inalpulat, Levent Genc
Abstract:
In the present study, an attempt was made to assess the Land Use Land Cover (LULC) transformation over three decades around the urban regions of the Balıkesir, Bursa, and Çanakkale provincial centers (PCs) in Turkey. Landsat imagery acquired in 1984, 1999, and 2014 was used to determine the LULC change. Images were classified using the supervised classification technique, and five main LULC classes were considered: forest (F), agricultural land (A), residential area (urban) - bare soil (R-B), water surface (W), and other (O). Change detection analyses were conducted for 1984-1999 and 1999-2014, and the results were evaluated, with particular attention to conversions of LULC types to the R-B class. In addition, population changes (1985-2014) were assessed using census data, the relations between population and urban area were established, and future populations and urban area needs were forecasted for 2030. The results of the LULC analysis indicated that urban areas, which fall under the R-B class, expanded in all PCs. During 1984-1999, the R-B class within the Balıkesir, Bursa, and Çanakkale PCs increased by 7.1%, 8.4%, and 2.9%, respectively. The trend continued in the 1999-2014 term, and the increases reached 15.7%, 15.5%, and 10.2% at the end of the 30-year period (1984-2014). Furthermore, since the A class in all provinces was found to be the principal contributor to the R-B class, urban sprawl led to the loss of agricultural land. Moreover, the areas of the R-B classes were highly correlated with population in all PCs (R² > 0.992). On this basis, both future populations and R-B class areas were forecasted: the estimated increases in the R-B class areas for the Balıkesir, Bursa, and Çanakkale PCs were 1,586 ha, 7,999 ha, and 854 ha, respectively.
Accordingly, the forecasted values for 2030 are 7,838 ha, 27,866 ha, and 2,486 ha for Balıkesir, Bursa, and Çanakkale, respectively; that is, 7.7%, 8.2%, and 9.7% more R-B class area is expected in the PCs, in the same order.
Keywords: landsat, LULC change, population, urban sprawl
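The forecasting step, a least-squares line relating population to R-B area extrapolated to a projected 2030 population, can be sketched as below; the data points are synthetic, not the study's census figures:

```python
import numpy as np

def forecast_urban_area(pop, area, pop_future):
    # fit area = a * pop + b by least squares, then extrapolate to a future population
    a, b = np.polyfit(pop, area, 1)
    return a * pop_future + b
```

With R² above 0.992, as reported, such a linear extrapolation is defensible over a 15-year horizon, though it assumes the historical population-area relationship continues unchanged.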
420 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors that influence crash injury risk (CIR); however, the uncertainties inherent in the selected variables have been neglected. A review of the existing literature is required not only to obtain an overview of the variables and measures but also to ascertain the implications of comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of the broad variation in variable selection in CIR analysis. The objective is to demonstrate the variance in variable selection and classification when modeling the injury risk of occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications over the past 21 years, the review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries sustained by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half of the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, the vehicle age/model year was the most frequently selected explanatory variable, used by 41% of the studies. For studies that included a speed risk factor, the majority (64%) used the legal speed limit as a proxy for vehicle speed at the moment of the crash, imposing limitations on CIR analysis and modeling. Despite the proven effectiveness of airbags in mitigating injury following a crash, only 22% of studies included airbag deployment data.
A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and to identify opportunities for improvement in future studies in the field of road injuries.
Keywords: crash, exploratory, injury, risk, variables, vehicle
419 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models
Authors: R. Hellmuth
Abstract:
The method of factory planning has changed considerably, especially with regard to planning the factory building itself. Factory planning has the task of designing the products, plants, processes, organization, areas, and building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory: restrictions on new areas, shorter life cycles of products and production technology, and a VUCA world (Volatility, Uncertainty, Complexity, and Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool; furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is therefore: what kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for various use cases are analysed. The investigation covers point cloud models, building information models, and photogrammetry models, as well as versions of these enriched with sensor data; it examines which digital models allow simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to these application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model, showing how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and the display of sensor data.
In summary, the paper shows a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated. Keywords: building information modeling, digital factory model, factory planning, maintenance
Procedia PDF Downloads 110418 Vibro-Tactile Equalizer for Musical Energy-Valence Categorization
Authors: Dhanya Nair, Nicholas Mirchandani
Abstract:
Musical haptic systems can enhance a listener’s musical experience while providing an alternative platform for the hearing impaired to experience music. Current music tactile technologies focus on representing tactile metronomes to synchronize performers or encoding musical notes into distinguishable (albeit distracting) tactile patterns. There is growing interest in the development of musical haptic systems to augment the auditory experience, although the haptic-music relationship is still not well understood. This paper presents a tactile music interface that provides vibrations to multiple fingertips in synchrony with auditory music. Like an audio equalizer, different frequency bands are filtered out, and the power in each frequency band is computed and converted to a corresponding vibrational strength. These vibrations are felt on different fingertips, each corresponding to a different frequency band. Songs with music from different spectrums, as classified by their energy and valence, were used to test the effectiveness of the system and to understand the relationship between music and tactile sensations. Three participants were trained on one song categorized as sad (low energy and low valence score) and one song categorized as happy (high energy and high valence score). They were trained both with and without auditory feedback (listening to the song while experiencing the tactile music on their fingertips and then experiencing the vibrations alone without the music). The participants were then tested on three songs from both categories, without any auditory feedback, and were asked to classify the tactile vibrations they felt into either category. The participants were blinded to the songs being tested and were not provided any feedback on the accuracy of their classification. These participants were able to classify the music with 100% accuracy.
Although the songs tested were at two opposite ends of the spectrum (sad/happy), the preliminary results show the potential of utilizing a vibrotactile equalizer, like the one presented, for augmenting the musical experience while furthering the current understanding of the music-tactile relationship. Keywords: haptic music relationship, tactile equalizer, tactile music, vibrations and mood
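The band-power-to-vibration mapping the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the per-band signal powers have already been computed upstream (e.g. by band-pass filtering), and the logarithmic mapping and the -40 dB floor are assumptions chosen to roughly match perceived loudness.

```python
import math

def band_powers_to_vibration(powers, max_amplitude=1.0):
    """Map per-band signal powers to vibration amplitudes in [0, max_amplitude].

    A log (dB) scale roughly matches perceived loudness; each band's power is
    normalized against the strongest band, and bands more than 40 dB below
    the peak produce no vibration.
    """
    floor_db = -40.0
    db = [10.0 * math.log10(p) if p > 0 else floor_db for p in powers]
    peak = max(db)
    amps = []
    for d in db:
        rel = max(d - peak, floor_db)          # dB relative to strongest band
        amps.append(max_amplitude * (rel - floor_db) / -floor_db)
    return amps

# Four bands (e.g. bass, low-mid, high-mid, treble), one fingertip each.
amplitudes = band_powers_to_vibration([4.0, 1.0, 0.25, 0.004])
```

In a real-time system this mapping would run once per analysis frame, driving one vibration actuator per fingertip.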
Procedia PDF Downloads 181417 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection
Authors: Hamidullah Binol, Abdullah Bal
Abstract:
Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for detecting the safety status of goods. Hyperspectral Imaging (HSI) is an attractive technique for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection of apples, quality analysis and grading of citrus fruits, bruise detection of strawberries, visualization of the sugar distribution of melons, measuring the ripening of tomatoes, defect detection of pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. This technique yields exceptional detection capabilities, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for detection of fat content in ground meat using HSI. The KFKT, which is the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by regarding fat as the target class, which is to be separated from the remaining classes (treated as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for fat ratio in ground meat. Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods
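The two-pattern property of the Fukunaga-Koontz transform can be illustrated with a small numpy sketch of the linear FKT (the paper's kernel version replaces the class covariances with kernel-matrix counterparts). The data below are synthetic; only the whitening-plus-eigendecomposition structure follows the standard FKT derivation.

```python
import numpy as np

def fukunaga_koontz(X1, X2):
    """Linear Fukunaga-Koontz transform for a two-class (target/clutter) problem.

    X1, X2: (n_samples, n_features) arrays for the two classes.
    Returns the FKT basis and the class-1 eigenvalues; along the basis, the
    class-1 and class-2 eigenvalues sum to 1, so directions best for one
    class are worst for the other.
    """
    C1 = np.cov(X1, rowvar=False)
    C2 = np.cov(X2, rowvar=False)
    # Whiten the summed covariance: P^T (C1 + C2) P = I
    d, Phi = np.linalg.eigh(C1 + C2)
    P = Phi @ np.diag(d ** -0.5)
    C1t = P.T @ C1 @ P
    lam1, V = np.linalg.eigh(C1t)      # class-2 eigenvalues are 1 - lam1
    return P @ V, lam1

rng = np.random.default_rng(0)
X1 = rng.normal(size=(200, 3)) * [3.0, 1.0, 1.0]   # "target" class
X2 = rng.normal(size=(200, 3)) * [1.0, 1.0, 3.0]   # "clutter" class
basis, lam1 = fukunaga_koontz(X1, X2)
```

For detection, pixels are projected onto the few basis vectors with eigenvalues closest to 1 (target-dominant directions).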
Procedia PDF Downloads 431416 Value Index, a Novel Decision Making Approach for Waste Load Allocation
Authors: E. Feizi Ashtiani, S. Jamshidi, M.H Niksokhan, A. Feizi Ashtiani
Abstract:
Waste load allocation (WLA) policies may use multi-objective optimization methods to find the most appropriate and sustainable solutions. These usually aim to simultaneously minimize two criteria, total abatement costs (TC) and environmental violations (EV). If other criteria, such as inequity, need to be minimized as well, further two-objective optimizations over different scenarios are required. In order to reduce the calculation steps, this study presents the value index as an innovative decision making approach. Since the value index contains both the environmental violations and the treatment costs, it can be maximized simultaneously with the equity index. This implies that the definition of different scenarios for environmental violations is no longer required. Furthermore, the solution is not necessarily the point with minimized total costs or environmental violations. This idea is tested on the Haraz River, in the north of Iran. Here, the dissolved oxygen (DO) level of the river is simulated by the Streeter-Phelps equation in MATLAB software. The WLA is determined for fish farms using multi-objective particle swarm optimization (MOPSO) in two scenarios. In the first, the trade-off curves of TC-EV and TC-Inequity are plotted separately, as in the conventional approach. In the second, the Value-Equity curve is derived. The comparative results show that the solutions are in a similar range of inequity with lower total costs. This is due to the flexibility in environmental violations afforded by the value index. As a result, the conventional approach can well be replaced by the value index, particularly for problems optimizing these objectives. This shortens the process of reaching the best solutions and may yield a better classification for scenario definition.
It is also concluded that decision makers would do better to focus on the value index, weighting its components, to find the most sustainable alternatives based on their requirements. Keywords: waste load allocation (WLA), value index, multi objective particle swarm optimization (MOPSO), Haraz River, equity
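The Streeter-Phelps DO simulation mentioned above has a closed-form solution that is easy to reproduce. The sketch below is illustrative only: the rate constants, initial BOD, and the 9 mg/L saturation value are assumed numbers, not parameters of the Haraz River study.

```python
import math

def streeter_phelps(t, L0, D0, kd, ka, do_sat=9.0):
    """Dissolved-oxygen sag downstream of a waste discharge (Streeter-Phelps).

    t: travel time (days); L0: initial BOD (mg/L); D0: initial DO deficit (mg/L);
    kd, ka: deoxygenation and reaeration rates (1/day). Returns DO (mg/L).
    """
    deficit = (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
              + D0 * math.exp(-ka * t)
    return do_sat - deficit

# Critical (minimum-DO) travel time for D0 = 0: tc = ln(ka/kd) / (ka - kd)
kd, ka, L0 = 0.3, 0.7, 20.0
tc = math.log(ka / kd) / (ka - kd)
do_min = streeter_phelps(tc, L0, 0.0, kd, ka)
```

In a WLA setting, each candidate allocation changes L0 at the discharge points, and the resulting DO profile is checked against the water-quality standard.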
Procedia PDF Downloads 422415 Improvement of the Reliability and the Availability of a Production System
Authors: Lakhoua Najeh
Abstract:
Aims of the work: The aim of this paper is to improve the reliability and the availability of a Packer production line of cigarettes based on two methods: the SADT method (Structured Analysis Design Technique) and the FMECA approach (Failure Mode, Effects and Criticality Analysis). The first method enables us to describe the functionality of the Packer production line of cigarettes, and the second method enables us to establish an FMECA analysis. Methods: The methodology adopted in order to contribute to the improvement of the reliability and the availability of a Packer production line of cigarettes is proposed in this paper, and it is based on the use of the Structured Analysis Design Technique (SADT) and Failure Mode, Effects, and Criticality Analysis (FMECA) methods. This methodology consists of a diagnosis of the existing state of all the equipment of a production line of a factory in order to determine the most critical machine. In fact, we use, on the one hand, a functional analysis of the production line based on the SADT method and, on the other hand, a diagnosis and classification of the mechanical and electrical failures of the production line according to their criticality, based on the FMECA approach. Results: Based on the methodology adopted in this paper, the results are the creation and the launch of a preventive maintenance plan. They contain the different elements of a Packer production line of cigarettes, the list of preventive maintenance activities, and their periods of realization. Conclusion: The diagnosis of the existing state helped us find that the cigarette-making machine used in the Packer production line is the most critical machine in the factory.
This enables us, on the one hand, to describe the functionality of the production line of cigarettes by the SADT method and, on the other hand, to study the machine with FMECA in order to improve its availability and performance. Keywords: production system, diagnosis, SADT method, FMECA method
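FMECA criticality ranking is commonly computed as a Risk Priority Number, RPN = Severity × Occurrence × Detection. A minimal sketch, with hypothetical machines and scores (not data from the paper):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number used in FMECA to rank failure modes (1-10 scales)."""
    return severity * occurrence * detection

# Hypothetical failure modes for machines on a packer line (illustrative scores).
failure_modes = [
    {"machine": "cigarette maker", "mode": "drum jam",       "S": 8, "O": 6, "D": 5},
    {"machine": "packer",          "mode": "foil misfeed",   "S": 5, "O": 7, "D": 3},
    {"machine": "wrapper",         "mode": "heater failure", "S": 6, "O": 3, "D": 4},
]
for fm in failure_modes:
    fm["RPN"] = rpn(fm["S"], fm["O"], fm["D"])

# Highest-RPN modes are addressed first in the preventive maintenance plan.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
```

The ranked list is what feeds the preventive maintenance plan: the most critical machine is the one whose failure modes dominate the top of the ranking.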
Procedia PDF Downloads 143414 Quality Assessment of New Zealand Mānuka Honeys Using Hyperspectral Imaging Combined with Deep 1D-Convolutional Neural Networks
Authors: Hien Thi Dieu Truong, Mahmoud Al-Sarayreh, Pullanagari Reddy, Marlon M. Reis, Richard Archer
Abstract:
New Zealand mānuka honey is a honeybee product derived mainly from Leptospermum scoparium nectar. The potent antibacterial activity of mānuka honey derives principally from methylglyoxal (MGO), in addition to the hydrogen peroxide and other lesser activities present in all honey. MGO is formed from dihydroxyacetone (DHA), unique to L. scoparium nectar. Mānuka honey also has an idiosyncratic phenolic profile that is useful as a chemical marker. Authentic mānuka honey is highly valuable, but almost all honey is formed from natural mixtures of nectars harvested by a hive over a time period. Once diluted by other nectars, mānuka honey irrevocably loses value. We aimed to apply hyperspectral imaging to honey frames before bulk extraction to minimise the dilution of genuine mānuka by other honey and ensure authenticity at the source. This technology is non-destructive and suitable for an industrial setting. Chemometrics using linear Partial Least Squares (PLS) and Support Vector Machines (SVM) showed limited efficacy in interpreting chemical footprints due to large non-linear relationships between predictor and predictand in a large sample set, likely due to honey quality variability across geographic regions. Therefore, an advanced modelling approach, one-dimensional convolutional neural networks (1D-CNN), was investigated for analysing hyperspectral data and extracting biochemical information from honey. The 1D-CNN model showed superior prediction of honey quality (R² = 0.73, RMSE = 2.346, RPD = 2.56) to PLS (R² = 0.66, RMSE = 2.607, RPD = 1.91) and SVM (R² = 0.67, RMSE = 2.559, RPD = 1.98). Classification of mono-floral mānuka honey from multi-floral and non-mānuka honey exceeded 90% accuracy for all models tried. Overall, this study reveals the potential of HSI and deep learning modelling for automating the evaluation of honey quality in frames. Keywords: mānuka honey, quality, purity, potency, deep learning, 1D-CNN, chemometrics
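The model-comparison statistics quoted above (R², RMSE, RPD) are straightforward to compute from reference and predicted values; RPD is taken here as the standard deviation of the reference values divided by the RMSE. A minimal sketch with made-up numbers:

```python
import math
from statistics import mean, pstdev

def regression_metrics(y_true, y_pred):
    """R^2, RMSE, and RPD (SD of reference values / RMSE) for model comparison."""
    n = len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    rmse = math.sqrt(ss_res / n)
    ybar = mean(y_true)
    ss_tot = sum((t - ybar) ** 2 for t in y_true)
    r2 = 1.0 - ss_res / ss_tot
    rpd = pstdev(y_true) / rmse
    return r2, rmse, rpd

# Made-up reference and predicted quality values, for illustration only.
y_true = [10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
y_pred = [11.0, 12.0, 13.0, 17.0, 17.0, 20.0]
r2, rmse, rpd = regression_metrics(y_true, y_pred)
```

An RPD above roughly 2, as reported for the 1D-CNN, is conventionally read as a model usable for quantitative prediction.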
Procedia PDF Downloads 139413 Assessment of Waste Management Practices in Bahrain
Authors: T. Radu, R. Sreenivas, H. Albuflasa, A. Mustafa Khan, W. Aloqab
Abstract:
The Kingdom of Bahrain, a small island country in the Gulf region, is experiencing fast economic growth, resulting in a sharp increase in population and ever greater amounts of waste being produced. However, waste management in the country is still very basic, with landfilling being the most popular option. Recycling is still a scarce practice, with small recycling businesses and initiatives emerging in recent years. This scenario is typical for other countries in the region, where similar amounts of per capita waste are produced. In this paper, we review current waste management practices in Bahrain by collecting data published by the Government and various authors, and by visiting the country’s only landfill site, Askar. In addition, we have performed a survey of residents to learn more about awareness of and attitudes towards sustainable waste management strategies. A review of the available data on waste management indicates that the Askar landfill site is nearing its capacity. The site uses open tipping as the method of disposal. The highest percentage of disposed waste comes from the building sector (38.4%), followed by domestic (27.5%) and commercial waste (17.9%). Disposal monitoring and recording are often based on estimates of weight and lack proper characterization/classification of the received waste. There is also a need for an assessment of the environmental impact of the site, with systematic monitoring of pollutants in the area and their potential spreading to the surrounding land, groundwater, and air. The results of the survey indicate low awareness of what happens to the collected waste in the country. However, the respondents have shown support for future waste reduction and recycling initiatives. This implies that the education of local communities would be very beneficial for such governmental initiatives, securing greater participation.
Raising awareness of issues surrounding recycling and waste management, together with a systematic effort to divert waste from landfills, are the first steps towards securing sustainable waste management in the Kingdom of Bahrain. Keywords: landfill, municipal solid waste, survey, waste management
Procedia PDF Downloads 158412 Innovative Screening Tool Based on Physical Properties of Blood
Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan
Abstract:
This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid-mechanics insight that such stain formation depends heavily on physical properties. Moreover, biomedical research shows that different patterns in blood stains are robust indicators of the blood donor’s health or lack thereof. Based on these valuable insights, an innovative screening tool is proposed which can act as an aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, malaria, etc., with enhanced confidence in the proposed analysis. To realize this technique, simple, robust, and low-cost micro-fluidic devices, a micro-capillary viscometer, and a pendant drop tensiometer are designed and proposed for fabrication to measure the viscosity, surface tension, and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for automated reasoning and presentation of results. A support vector machine (SVM) classifies data in a linear fashion. Discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, via linear and non-linear classification techniques, for the screening of various diseases in humans and cattle. Experiments are carried out to validate the physical property measurement devices. This framework can be further developed into a real-life portable disease screening and diagnostics tool. Small-scale production of screening and diagnostic devices is proposed to carry out independent tests. Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability
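For the micro-capillary viscometer, viscosity follows from the Hagen-Poiseuille relation μ = πΔP·r⁴·t / (8LV). The sketch below uses this textbook formula with illustrative numbers; the geometry and pressure values are assumptions, not specifications of the proposed device.

```python
import math

def capillary_viscosity(delta_p, radius, length, volume, time_s):
    """Dynamic viscosity from a micro-capillary viscometer via Hagen-Poiseuille.

    delta_p: pressure drop (Pa); radius, length: capillary geometry (m);
    volume: liquid volume (m^3) collected in time_s (s). Returns Pa*s.
    Assumes steady, laminar, Newtonian flow.
    """
    return math.pi * delta_p * radius ** 4 * time_s / (8.0 * length * volume)

# Illustrative numbers only: a 100 um radius, 5 cm long capillary,
# 4 uL collected in 10 s under a 2 kPa pressure drop.
mu = capillary_viscosity(delta_p=2.0e3, radius=1.0e-4, length=0.05,
                         volume=4.0e-9, time_s=10.0)
# mu is in Pa*s; whole blood is typically a few mPa*s at body temperature.
```

Note that blood is shear-thinning, so a single Hagen-Poiseuille value is an effective viscosity at the capillary's shear rate rather than a full rheological description.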
Procedia PDF Downloads 376411 Examining Relationship between Resource-Curse and Under-Five Mortality in Resource-Rich Countries
Authors: Aytakin Huseynli
Abstract:
The paper reports findings of a study which examined the under-five mortality rate among resource-rich countries. Typically, when countries obtain wealth, citizens gain increased wellbeing. Societies with new wealth create equal opportunities for everyone, including vulnerable groups. But scholars claim that this is not the case for developing resource-rich countries, where natural resources become a curse rather than a blessing. Spillovers from the natural resource curse negatively affect the social wellbeing of vulnerable people. They become excluded from mainstream society, and their situation becomes untenable. In order to test this hypothesis, the study compared the under-five mortality rate among resource-rich countries using independent-sample one-way ANOVA. The data on under-five mortality rate came from the World Bank. The natural resources considered in this study are oil, gas, and minerals. The list of 67 resource-rich countries was taken from the Natural Resource Governance Institute. The sample was categorized into four groups, low, lower-middle, upper-middle, and high income, based on the income classification of the World Bank. Results revealed a significant difference in under-five mortality rate among low, lower-middle, upper-middle, and high-income countries (F(3, 29.01) = 33.70, p < .001). To locate the differences among income groups, the Games-Howell test was performed, and it was found that under-five mortality is an issue for low, middle, and upper-middle income countries but not for high-income countries. The results of this study are in agreement with previous research on the resource curse and the negative effects of resource-based development.
The policy implications of the study for social workers, policy makers, academics, and social development specialists are to raise and discuss issues of marginalization and exclusion of vulnerable groups in developing resource-rich countries and to suggest interventions for avoiding them. Keywords: children, natural resource, extractive industries, resource-based development, vulnerable groups
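The group comparison above rests on the one-way ANOVA F statistic (the paper's F(3, 29.01), with fractional degrees of freedom, indicates a Welch-type ANOVA followed by Games-Howell). A minimal sketch of the classic one-way F statistic on hypothetical mortality data:

```python
from statistics import mean

def one_way_anova_f(groups):
    """Classic one-way ANOVA F statistic (between-group vs within-group variance).

    groups: list of lists of observations, one list per group.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    return (ss_between / df_b) / (ss_within / df_w)

# Hypothetical under-five mortality rates (per 1,000 live births) for
# low-, lower-middle-, upper-middle-, and high-income groups.
groups = [[80, 85, 90], [50, 55, 60], [20, 25, 30], [4, 5, 6]]
f_stat = one_way_anova_f(groups)
```

A large F relative to the F(df_b, df_w) distribution rejects the hypothesis of equal group means; post hoc tests such as Games-Howell then locate which pairs of groups differ.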
Procedia PDF Downloads 254410 Selection of Optimal Reduced Feature Sets of Brain Signal Analysis Using Heuristically Optimized Deep Autoencoder
Authors: Souvik Phadikar, Nidul Sinha, Rajdeep Ghosh
Abstract:
In brainwave research using electroencephalogram (EEG) signals, finding the most relevant and effective feature set for identification of activities in the human brain remains a big challenge because of the random nature of the signals. The feature extraction method is a key issue in solving this problem. Finding features that give distinctive pictures for different activities and similar pictures for the same activity is very difficult, especially as the number of activities grows. Classifier accuracy depends on the quality of this feature set. Further, more features result in high computational complexity, while fewer features compromise performance. In this paper, a novel idea for the selection of an optimal feature set using a heuristically optimized deep autoencoder is presented. Using various feature extraction methods, a vast number of features are extracted from the EEG signals and fed to the autoencoder deep neural network. The autoencoder encodes the input features into a small set of codes. To avoid the vanishing-gradient problem and normalization of the dataset, a meta-heuristic search algorithm is used to minimize the mean square error (MSE) between encoder input and decoder output. To reduce the feature set into a smaller one, 4 hidden layers are considered in the autoencoder network; hence it is called the Heuristically Optimized Deep Autoencoder (HO-DAE). In this method, no features are rejected; all the features are combined into the responses of the hidden layers. The results reveal that higher accuracy can be achieved using the optimal reduced features. The proposed HO-DAE is also compared with a regular autoencoder to test the performance of both.
The performance of the proposed method is validated and compared with two other methods recently reported in the literature, revealing that the proposed method is far better in terms of classification accuracy. Keywords: autoencoder, brainwave signal analysis, electroencephalogram, feature extraction, feature selection, optimization
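The core idea, a meta-heuristic minimizing reconstruction MSE instead of backpropagation, can be sketched at toy scale. The example below is not the paper's HO-DAE: it uses plain random search (as a stand-in for the unnamed meta-heuristic) on a one-code linear autoencoder over 2D data, purely to illustrate the MSE objective being searched.

```python
import random

def mse(data, w):
    """Reconstruction MSE of a tiny linear autoencoder.

    Encoder: code = w . x (a single scalar code); decoder: x_hat = code * w.
    With |w| = 1 this is projection onto a line through the origin.
    """
    total = 0.0
    for x, y in data:
        code = w[0] * x + w[1] * y
        rx, ry = code * w[0], code * w[1]
        total += (x - rx) ** 2 + (y - ry) ** 2
    return total / len(data)

def random_search(data, iters=2000, seed=1):
    """Meta-heuristic stand-in: random search over unit-norm weights minimizing MSE."""
    rng = random.Random(seed)
    best_w, best_e = (1.0, 0.0), mse(data, (1.0, 0.0))
    for _ in range(iters):
        a, b = rng.gauss(0, 1), rng.gauss(0, 1)
        norm = (a * a + b * b) ** 0.5
        w = (a / norm, b / norm)
        e = mse(data, w)
        if e < best_e:
            best_w, best_e = w, e
    return best_w, best_e

# Correlated 2D data: the best one-dimensional code lies along the y = x direction.
data = [(t, t + 0.1 * ((i % 5) - 2)) for i, t in enumerate(range(-10, 11))]
w, err = random_search(data)
```

The HO-DAE applies the same principle at scale: the search operates over the weights of a four-hidden-layer network, and the minimized MSE is between encoder input and decoder output.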
Procedia PDF Downloads 114409 Predictive Spectral Lithological Mapping, Geomorphology and Geospatial Correlation of Structural Lineaments in Bornu Basin, Northeast Nigeria
Authors: Aminu Abdullahi Isyaku
Abstract:
The semi-arid Bornu basin in northeast Nigeria is characterised by flat topography, thick cover sediments, and a lack of continuous bedrock outcrops discernible for field geology. This paper presents a methodology for the characterisation of neotectonic surface structures and surface lithology in the north-eastern Bornu basin as an alternative approach to field geological mapping, using free multispectral Landsat 7 ETM+, SRTM DEM and ASAR Earth Observation datasets. The spectral lithological mapping developed herein utilised spectral discrimination of the surface features identified on Landsat 7 ETM+ images to infer the lithology, in four steps: computation of band-combination images; computation of band-ratio images; supervised image classification; and inference of the lithological compositions. Two complementary approaches to lineament mapping are carried out in this study, manual digitization and automatic lineament extraction, to validate the structural lineaments extracted from the Landsat 7 ETM+ image mosaic covering the study area. A comparison between the mapped surface lineaments and lineament zones shows good geospatial correlation and identifies the predominant NE-SW and NW-SE structural trends in the basin. Topographic profiles across different parts of the Bama Beach Ridge palaeoshorelines in the basin appear to show different elevations across the feature. It is determined that most of the drainage systems in the northeastern Bornu basin are structurally controlled, with drainage lines terminating against the palaeo-lake border and emptying into Lake Chad, mainly arising from the extensive topographic high-stand Bama Beach Ridge palaeoshoreline. Keywords: Bornu Basin, lineaments, spectral lithology, tectonics
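The band-ratio step listed above divides one spectral band by another, pixel by pixel. A minimal sketch with toy reflectance values; real work would use calibrated Landsat 7 ETM+ bands chosen for the lithology of interest.

```python
def band_ratio(band_a, band_b, eps=1e-6):
    """Pixel-wise ratio of two co-registered image bands (2D lists of reflectance).

    Band ratios suppress illumination differences (which scale both bands
    similarly) and highlight the spectral contrasts used to discriminate
    surface lithology.
    """
    return [[a / (b + eps) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(band_a, band_b)]

# Illustrative 2x2 reflectance patches for two bands.
ratio = band_ratio([[0.30, 0.42], [0.36, 0.28]],
                   [[0.15, 0.21], [0.12, 0.14]])
```

Ratio images from several band pairs are then stacked with band combinations as input layers to the supervised classification.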
Procedia PDF Downloads 139408 Monitoring Deforestation Using Remote Sensing And GIS
Authors: Tejaswi Agarwal, Amritansh Agarwal
Abstract:
The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in the monitoring of forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, as many of them are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (NIR − Red)/(NIR + Red). After calculating the NDVI variations and the associated mean, we have analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories. Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection
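The NDVI computation used above is a one-liner; the reflectance values in the example are illustrative, not taken from the study's imagery.

```python
def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense, healthy vegetation; values near 0 or
    below indicate bare soil, built-up areas, or water.
    """
    return (nir - red) / (nir + red + eps)

dense_forest = ndvi(0.50, 0.08)   # high ground biomass
cleared_land = ndvi(0.25, 0.20)   # sparse vegetation
# A drop in mean NDVI between two acquisition dates signals loss of biomass.
change = cleared_land - dense_forest
```

In practice the same expression is applied per pixel across the NIR and Red bands, and the mean NDVI of the scene (or of the forest mask) is compared across dates.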
Procedia PDF Downloads 1204407 Extraction of Forest Plantation Resources in Selected Forest of San Manuel, Pangasinan, Philippines Using LiDAR Data for Forest Status Assessment
Authors: Mark Joseph Quinto, Roan Beronilla, Guiller Damian, Eliza Camaso, Ronaldo Alberto
Abstract:
Forest inventories are essential to assess the composition, structure and distribution of forest vegetation, which can be used as baseline information for management decisions. Classical forest inventory is labor intensive, time-consuming, and sometimes even dangerous. The use of Light Detection and Ranging (LiDAR) in forest inventory can improve on and overcome these restrictions. This study was conducted to determine the possibility of using LiDAR-derived data to extract high-accuracy forest biophysical parameters and as a non-destructive method for forest status analysis of San Manuel, Pangasinan. Forest resource extraction was carried out using LAStools, GIS, ENVI, and .bat scripts with the available LiDAR data. The process includes the generation of derivatives such as the Digital Terrain Model (DTM), Canopy Height Model (CHM) and Canopy Cover Model (CCM) in .bat scripts, followed by the generation of 17 composite bands to be used in the extraction of forest classification covers using ENVI 4.8 and GIS software. The Diameter at Breast Height (DBH), Above Ground Biomass (AGB) and Carbon Stock (CS) were estimated for each classified forest cover, and tree count extraction was carried out using GIS. Subsequently, field validation was conducted for accuracy assessment. Results showed that the forest of San Manuel has 73% forest cover, which is much higher than the 10% canopy cover requirement. Of the extracted canopy heights, 80% of the trees range from 12 m to 17 m. The CS of the three forest covers, based on the AGB, were: 20819.59 kg per 20 m × 20 m plot for closed broadleaf, 8609.82 kg per 20 m × 20 m plot for broadleaf plantation, and 15545.57 kg per 20 m × 20 m plot for open broadleaf. The average tree count for the forest plantation was 413 trees/ha. As such, the forest of San Manuel has high percent forest cover and high CS. Keywords: carbon stock, forest inventory, LiDAR, tree count
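One of the LiDAR derivatives named above, the Canopy Height Model, is simply the difference between the first-return surface and the bare-earth terrain. A minimal sketch with toy elevation grids; the 2 m tree threshold is an assumption for illustration, not the study's parameter.

```python
def canopy_height_model(dsm, dtm):
    """Canopy Height Model from LiDAR-derived rasters: CHM = DSM - DTM.

    dsm: first-return surface elevations; dtm: bare-earth elevations
    (both in metres, on the same grid). Negative differences are clamped to 0.
    """
    return [[max(s - t, 0.0) for s, t in zip(rs, rt)]
            for rs, rt in zip(dsm, dtm)]

# Illustrative 2x3 elevation grids (metres).
dsm = [[115.0, 117.5, 104.0], [112.0, 118.0, 103.5]]
dtm = [[100.0, 101.0, 103.8], [100.5, 101.0, 103.5]]
chm = canopy_height_model(dsm, dtm)
# Count cells tall enough to plausibly be tree canopy (assumed 2 m cutoff).
tree_cells = sum(h >= 2.0 for row in chm for h in row)
```

Per-cover height statistics from the CHM are what feed the DBH, AGB, and carbon stock estimates via allometric equations.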
Procedia PDF Downloads 389406 Analysis of Human Toxicity Potential of Major Building Material Production Stage Using Life Cycle Assessment
Authors: Rakhyun Kim, Sungho Tae
Abstract:
Global environmental issues, such as abnormal weather due to global warming, resource depletion, and ecosystem distortions, have been escalating due to rapid population growth and the expansion of industrial and economic development. Accordingly, initiatives have been implemented by many countries to protect the environment through indirect regulation methods, such as Environmental Product Declarations (EPD), in addition to direct regulations such as various emission standards. Following this trend, life cycle assessment (LCA) techniques that provide quantitative environmental information, such as Human Toxicity Potential (HTP), for buildings are being developed in the construction industry. However, at present, studies on environmental databases of building materials are not sufficient to provide this support adequately. The purpose of this study is to analyse the human toxicity potential of the production stage of major building materials using life cycle assessment. For this purpose, a theoretical consideration of life cycle assessment and environmental impact categories was performed, and the direction of the study was set. That is, the major materials from a global warming potential perspective were identified for the building, and a life cycle inventory database was selected. Classification was performed for 17 kinds of substances and impact indices, such as human toxicity potential, as specified in CML2001. The environmental impact, in terms of human toxicity potential, of the building material production stage was calculated through characterization. Meanwhile, the environmental impacts of building materials in the same category were analysed based on the characterization impacts calculated in this study. In this study, environmental impact coefficients of major building materials were established in compliance with ISO 14040.
Through this, it is believed the coefficients will effectively support the decisions of stakeholders to improve the environmental performance of buildings and provide a basis for the voluntary participation of architects in environmental consideration activities. Keywords: human toxicity potential, major building material, life cycle assessment, production stage
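The characterization step described above multiplies each inventoried emission by a substance-specific characterization factor and sums within the impact category. A minimal sketch; the substances and factor values are illustrative placeholders, not actual CML2001 data.

```python
def characterized_impact(inventory, factors):
    """LCA characterization: impact = sum(emission_i * characterization_factor_i).

    inventory: emitted substance masses (kg) per functional unit;
    factors: characterization factors for one impact category (for human
    toxicity, expressed in kg 1,4-DB equivalents per kg of substance).
    """
    return sum(mass * factors[sub] for sub, mass in inventory.items())

# Hypothetical inventory for one functional unit of a building material;
# the factor values below are illustrative placeholders, not CML2001 values.
inventory = {"benzene": 0.004, "arsenic": 0.00002, "nickel": 0.0005}
htp_factors = {"benzene": 1900.0, "arsenic": 950000.0, "nickel": 35000.0}
htp = characterized_impact(inventory, htp_factors)  # kg 1,4-DB eq
```

Running this per material yields the environmental impact coefficients the study establishes, one value per material per impact category.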
Procedia PDF Downloads 139405 Recycling of End of Life Concrete Based on C2CA Method
Authors: Somayeh Lotfi, Manuel Eggimann, Eckhard Wagner, Radosław Mróz, Jan Deja
Abstract:
One of the main environmental challenges in the construction industry is a strong social push to decrease the bulk transport of building materials in urban environments. Considering this fact, applying more in-situ recycling technologies for Construction and Demolition Waste (CDW) is an urgent need. The European C2CA project develops a novel concrete recycling technology that can be performed purely mechanically and in situ. The technology consists of a combination of smart demolition, gentle grinding of the crushed concrete in an autogenous mill, and a novel dry classification technology called ADR to remove the fines. The feasibility of this recycling process was examined in demonstration projects involving a total of 20,000 tons of End of Life (EOL) concrete from two office towers in Groningen, The Netherlands. This paper concentrates on the second demonstration project of C2CA, where EOL concrete was recycled on an industrial site. After recycling, the properties of the produced Recycled Aggregate (RA) were investigated, and the results are presented. An experimental study was carried out on the mechanical and durability properties of the produced Recycled Aggregate Concrete (RAC) compared to those of Natural Aggregate Concrete (NAC). The aim was to understand the influence of RA substitution, w/c ratio, and type of cement on the properties of RAC. In this regard, two series of reference concrete with strength classes of C25/30 and C45/55 were produced using natural coarse aggregates (rounded and crushed) and natural sand. The RAC series were created by replacing parts of the natural aggregate, resulting in series of concrete with 0%, 20%, 50% and 100% RA. Results show that the concrete mix design and type of cement have a decisive effect on the properties of RAC. On the other hand, the substitution of RA, even at a high percentage replacement level, has a minor and manageable impact on the performance of RAC.
This result is a good indication of the feasibility of using RA in structural concrete by modifying the mix design and using a proper type of cement. Keywords: C2CA, ADR, concrete recycling, recycled aggregate, durability
Procedia PDF Downloads 391404 NDVI as a Measure of Change in Forest Biomass
Authors: Amritansh Agarwal, Tejaswi Agarwal
Abstract:
The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in the monitoring of forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, as many of them are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud and aerosol free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (NIR − Red)/(NIR + Red). After calculating the NDVI variations and the associated mean, we have analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals.
With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories. Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection
Procedia PDF Downloads 402