Search results for: spatial data mining
24887 Hydrologic Balance and Surface Water Resources of the Cheliff-Zahrez Basin
Authors: Mehaiguene Madjid, Touhari Fadhila, Meddi Mohamed
Abstract:
The Cheliff basin offers a good hydrological case study, as several aspects of its water resources and hydraulic installations remain poorly understood. Our study of the basin is divided into two principal parts. The first is the spatial evaluation of precipitation: understanding how the water resource is replenished requires a good knowledge of the structure of the precipitation fields across the study area. With a view to integrated water resource management, we judged it necessary to establish a precipitation map of the Cheliff basin, both to clarify the evolution of the water resource in the basin and to serve as a basis for future hydraulic planning studies. The map also answers a direct need of researchers in the region for a reference document that can subsequently be completed and updated. The second part is the hydrological study: based on statistical processing of hydrometric data, it specifies the terms of the hydrological balance and clarifies the fundamental aspects of annual, seasonal and extreme flows, their variability, and the surface water resources.
Keywords: hydrological assessment, surface water resources, Cheliff, Algeria
Procedia PDF Downloads 302
24886 The Hubs of Transformation Dictated by the Innovation Wave: Boston as a Case Study. Exploring How Design is Emerging as an Essential Feature in the Process of Laboratorisation of Cities
Authors: Luana Parisi, Sohrab Donyavi
Abstract:
Cities have become the nodes of global networks, standing at the intersection points of the flows of capital, goods, workers, businesses and travellers, making them the spots where innovation, progress and economic development occur. The primary challenge for them is to create the most fertile ecosystems for triggering innovation activities. Design emerges as an essential feature in this process of laboratorisation of cities. This paper aims at exploring the spatial hubs of transformation within the knowledge economy, providing an overview of the current models of innovation spaces, before focusing on the innovation district of one of the cities that are riding the innovation wave, namely Boston, USA. Useful lessons are drawn from the case study of the innovation district in Boston, allowing the definition of practical tools for policymakers in the form of a range of factors that define a broad strategy able to implement the model successfully. A mixed methodology is implemented, including information from observations, exploratory interviews with key stakeholders, and desk-based data.
Keywords: Innovation District, innovation ecosystem, economic development, urban regeneration
Procedia PDF Downloads 123
24885 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network
Authors: Abdulaziz Alsadhan, Naveed Khan
Abstract:
In recent years, intrusions on computer networks have been a major security threat, so it is important to impede them. Impeding such intrusions entirely relies on their detection, which is the primary concern of any security tool like an Intrusion Detection System (IDS). It is therefore imperative to detect network attacks accurately. Numerous intrusion detection techniques are available, but the main issue is their performance, which can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw data set for classification: the classifier may get confused due to redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform the raw features into a principal feature space and select features based on their sensitivity, with eigenvalues used to determine that sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power, and this optimal feature subset is then used to perform classification. For classification, a Support Vector Machine (SVM) and a Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data mining (KDD’99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rates.
Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)
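A minimal sketch of the select-then-classify pipeline described above: features are scored by a between-class/within-class variance ratio (standing in for the eigenvalue-based sensitivity analysis), a top-k subset is kept, and a nearest-centroid rule stands in for the SVM/MLP classifiers. The toy connection records and both stand-ins are illustrative assumptions, not the authors' implementation.

```python
# Sketch: score raw features by a "sensitivity" criterion, keep a small
# subset, then classify. The variance-ratio score stands in for the
# PCA/LDA eigenvalue analysis; nearest-centroid stands in for SVM/MLP.
from statistics import mean, pvariance

def sensitivity(values, labels):
    """Variance of class means over pooled within-class variance."""
    classes = sorted(set(labels))
    groups = {c: [v for v, l in zip(values, labels) if l == c] for c in classes}
    between = pvariance([mean(g) for g in groups.values()])
    within = mean(pvariance(g) for g in groups.values()) or 1e-9
    return between / within

def select_features(X, y, k):
    """Rank feature columns by sensitivity and keep the top-k indices."""
    n_feat = len(X[0])
    scores = [(sensitivity([row[j] for row in X], y), j) for j in range(n_feat)]
    return sorted(j for _, j in sorted(scores, reverse=True)[:k])

def nearest_centroid(X, y, query, feats):
    """Classify by distance to per-class centroids over the chosen features."""
    classes = sorted(set(y))
    def centroid(c):
        rows = [row for row, l in zip(X, y) if l == c]
        return [mean(r[j] for r in rows) for j in feats]
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(classes, key=lambda c: dist([query[j] for j in feats], centroid(c)))

# Toy "connection records": feature 0 separates the classes, feature 1 is noise.
X = [[0.1, 5.0], [0.2, 4.0], [0.1, 6.0], [9.0, 5.0], [9.1, 4.5], [8.9, 5.5]]
y = ["normal", "normal", "normal", "attack", "attack", "attack"]
feats = select_features(X, y, k=1)                  # keeps the discriminative feature
label = nearest_centroid(X, y, [8.5, 5.0], feats)   # classifies the unseen record
```

On real data such as KDD'99, the same two stages would simply run over 41 features and five classes instead of this toy setup.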
Procedia PDF Downloads 364
24884 Analytical Model of Locomotion of a Thin-Film Piezoelectric 2D Soft Robot Including Gravity Effects
Authors: Zhiwu Zheng, Prakhar Kumar, Sigurd Wagner, Naveen Verma, James C. Sturm
Abstract:
Soft robots have drawn great interest recently due to the rich range of possible shapes and motions they can take on to address new applications, compared to traditional rigid robots. Large-area electronics (LAE) provides a unique platform for creating soft robots by leveraging thin-film technology to enable the integration of a large number of actuators, sensors, and control circuits on flexible sheets. However, the rich shapes and motions possible, especially when interacting with complex environments, pose significant challenges to forming well-generalized and robust models necessary for robot design and control. In this work, we describe an analytical model for predicting the shape and locomotion of a flexible (steel-foil-based) piezoelectric-actuated 2D robot based on Euler-Bernoulli beam theory. Nominally (unpowered), the robot lies flat on the ground; when powered, its shape is controlled by an array of piezoelectric thin-film actuators. Key features of the model are its ability to incorporate the significant effects of gravity on the shape and to precisely predict the spatial distribution of friction against the contacting surfaces, which is necessary for determining inchworm-type motion. We verified the model by developing a distributed discrete-element representation of a continuous piezoelectric actuator and by comparing its analytical predictions to discrete-element robot simulations using PyBullet. Without gravity, predicting the shape of a sheet with a linear array of piezoelectric actuators at arbitrary voltages is straightforward. However, gravity significantly distorts the shape of the sheet, causing some segments to flatten against the ground. Our work includes the following contributions: (i) A self-consistent approach was developed to determine exactly which parts of the soft robot are lifted off the ground, and the exact shape of these sections, for an arbitrary array of piezoelectric voltages and configurations.
(ii) Inchworm-type motion relies on controlling the relative friction with the ground surface in different sections of the robot. By adding torque balance to our model and analyzing shear forces, the model can determine the exact spatial distribution of the vertical force that the ground exerts on the soft robot. Through this, the spatial distribution of friction forces between ground and robot can be determined. (iii) By combining this spatial friction distribution with the shape of the soft robot as a function of time, as the piezoelectric actuator voltages are changed, the inchworm-type locomotion of the robot can be determined. As a practical example, we calculated the performance of a 5-actuator system on a 50-µm thick steel foil. Piezoelectric properties of commercially available thin-film piezoelectric actuators were assumed. The model predicted inchworm motion of up to 200 µm per step. For independent verification, we also modelled the system using PyBullet, a discrete-element robot simulator. To model a continuous thin-film piezoelectric actuator, we broke each actuator into multiple segments, each of which consisted of two rigid arms with appropriate mass connected by a 'motor' whose torque was set by the applied actuator voltage. Excellent agreement between our analytical model and the discrete-element simulator was shown for both the full deformation shape and the motion of the robot.
Keywords: analytical modeling, piezoelectric actuators, soft robot locomotion, thin-film technology
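The gravity-free shape prediction described above can be sketched by integrating actuator-induced curvatures along the arc length, as in small-deflection beam theory: each segment's curvature is taken proportional to its actuator voltage, and the (x, y) shape follows from the accumulated tangent angle. The curvature-per-volt constant and the segmentation are illustrative assumptions, not the paper's parameters, and gravity and ground contact are ignored here.

```python
# Sketch: reconstruct a piezo-actuated sheet's 2D shape (gravity ignored)
# from per-segment curvatures, each proportional to the applied voltage.
import math

def sheet_shape(voltages, seg_len=0.01, kappa_per_volt=0.5):
    """Return (x, y) coordinates of segment endpoints, starting at the origin."""
    x, y, theta = 0.0, 0.0, 0.0
    points = [(x, y)]
    for v in voltages:
        theta += kappa_per_volt * v * seg_len  # curvature * arc length = turn angle
        x += seg_len * math.cos(theta)
        y += seg_len * math.sin(theta)
        points.append((x, y))
    return points

flat = sheet_shape([0.0] * 5)    # unpowered: straight, lying along the x-axis
bent = sheet_shape([10.0] * 5)   # uniform voltage: arc-like curl away from the ground
```

The self-consistent treatment in the paper would additionally decide which of these points are pinned to the ground by gravity.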
Procedia PDF Downloads 176
24883 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes
Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar
Abstract:
Although most existing skyline query algorithms have focused on querying static points in static databases, the growing number of sensors, wireless communications and mobile applications has increased the demand for continuous skyline queries. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. However, as skyline query computation is based on checking the domination of skyline points over all dimensions, both the static and dynamic attributes must be considered together, without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows. First, it excludes the points that cannot be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance and efficiency of our algorithm over other existing approaches.
Keywords: continuous query processing, dynamic database, moving object, skyline queries
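The domination check at the heart of skyline computation can be sketched as follows. This block-nested-loop style is a generic baseline, not the pruning-and-incremental-update algorithm the paper proposes; as the abstract advocates, it treats static and dynamic attributes uniformly as plain dimensions.

```python
# Generic baseline skyline computation (every dimension minimized).
# A point dominates another if it is no worse in all dimensions and strictly
# better in at least one; the skyline is the set of non-dominated points.

def dominates(p, q):
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    result = []
    for p in points:
        if any(dominates(q, p) for q in points):
            continue  # p is dominated by some point, so it is not in the skyline
        result.append(p)
    return result

# E.g. hotels as (price, distance-to-target), where the distance attribute
# would be the dynamic one if the target is moving:
points = [(50, 8), (60, 3), (40, 9), (70, 2), (55, 8)]
print(skyline(points))  # -> [(50, 8), (60, 3), (40, 9), (70, 2)]
```

Each time a dynamic attribute changes, this baseline recomputes everything in O(n²); the paper's contribution is precisely to avoid that full recomputation.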
Procedia PDF Downloads 208
24882 Destruction of Coastal Wetlands in Harper City, Liberia: Setting Nature against the Future Society
Authors: Richard Adu Antwako
Abstract:
Coastal wetland destruction and its consequences have recently taken the center stage of global discussions. The phenomenon is nothing new to humanity, as coastal wetland-human interaction has been ingrained in civilizations since their beginnings, amidst the demanding use of wetland resources to meet human necessities. The severity of coastal wetland destruction has grown in parallel with civilization, and it is against this backdrop that this paper interrogates the causes of coastal wetland destruction in Harper City in Liberia, compares the degree of coastal wetland stressors to the non-equilibrium thermodynamic scale, and suggests an integrated coastal zone management approach to address the problems. Literature complemented the primary data gleaned via global positioning system devices, field observation, a questionnaire, and interviews. Multi-sampling techniques were used to generate data from the sand miners, institutional heads, fisherfolk, community-based groups, and other stakeholders. Non-equilibrium thermodynamic theory remains a vibrant framework for discerning ecological stability, and it is employed here to further understand the coastal wetland destruction in Harper City, Liberia, and to measure coastal wetland stresses in terms of amplitude and elasticity. Non-equilibrium thermodynamics postulates that coastal wetlands are capable of assimilating resources (inputs) as well as discharging products (outputs). In Harper City, however, the input-output relationship stretches far beyond the thresholds of the coastal wetlands, leading to coastal wetland disequilibrium. Findings revealed that sand mining, mangrove removal, and crude dumping have transformed the coastal wetlands, resulting in water pollution, flooding, habitat loss and disfigured beaches. This paper demonstrates that the coastal wetlands are being converted into development projects and agricultural fields, thus setting nature against the future society.
Keywords: amplitude, crude dumping, elasticity, non-equilibrium thermodynamics, wetland destruction
Procedia PDF Downloads 140
24881 CompPSA: A Component-Based Pairwise RNA Secondary Structure Alignment Algorithm
Authors: Ghada Badr, Arwa Alturki
Abstract:
The biological function of an RNA molecule depends on its structure. The objective of alignment is to find the homology between two or more RNA secondary structures. Knowing the common functionalities between two RNA structures allows a better understanding of them and the discovery of other relationships between them. Besides, identifying non-coding RNAs (RNAs that are not translated into a protein) is a popular application in which RNA structural alignment is the first step. A few methods for RNA structure-to-structure alignment have been developed, but most of them perform partial structure-to-structure, sequence-to-structure, or structure-to-sequence alignment. Less attention is given in the literature to the use of efficient RNA structure representations, and full structure-to-structure alignment methods are lacking. In this paper, we introduce an O(N²) Component-based Pairwise RNA Structure Alignment (CompPSA) algorithm, where structures are given in a component-based representation and where N is the maximum number of components in the two structures. The proposed algorithm compares the two RNA secondary structures based on their weighted component features rather than on their base-pair details. Extensive experiments are conducted illustrating the efficiency of the CompPSA algorithm compared to other approaches on different real and simulated datasets. The CompPSA algorithm shows an accurate similarity measure between components. The algorithm gives the user the flexibility to align the two RNA structures based on their weighted features (position, full length, and/or stem length). Moreover, the algorithm proves scalable and efficient in time and memory performance.
Keywords: alignment, RNA secondary structure, pairwise, component-based, data mining
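The idea of comparing structures by weighted component features rather than base-pair details can be sketched as below, with each component reduced to a (position, full length, stem length) tuple as the abstract describes. The per-feature distance, the greedy O(N²) matching, and the toy components are illustrative assumptions; CompPSA's actual scoring may differ.

```python
# Sketch: compare two RNA secondary structures via weighted component
# features (position, full_length, stem_length) instead of base pairs.

def component_distance(c1, c2, weights=(1.0, 1.0, 1.0)):
    """Weighted absolute difference over (position, full_length, stem_length)."""
    return sum(w * abs(a - b) for w, a, b in zip(weights, c1, c2))

def structure_similarity(s1, s2, weights=(1.0, 1.0, 1.0)):
    """Greedily match each component of s1 to its nearest unmatched component
    of s2 and return the total distance (lower = more similar); O(N^2)."""
    remaining = list(s2)
    total = 0.0
    for c in s1:
        best = min(remaining, key=lambda r: component_distance(c, r, weights))
        total += component_distance(c, best, weights)
        remaining.remove(best)
    return total

# Two toy structures, each with two components:
s1 = [(5, 20, 6), (40, 15, 4)]
s2 = [(6, 20, 6), (42, 15, 5)]
# Weighting only position, as a user who cares about placement might:
print(structure_similarity(s1, s2, weights=(1.0, 0.0, 0.0)))  # -> 3.0
```

The weights expose the same user flexibility the abstract mentions: setting a weight to zero ignores that feature in the comparison.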
Procedia PDF Downloads 456
24880 Identifying Critical Success Factors for Data Quality Management through a Delphi Study
Authors: Maria Paula Santos, Ana Lucas
Abstract:
Organizations support their operations and decision making with the data they have at their disposal, so the quality of these data is remarkably important. Data Quality (DQ) is currently a relevant issue, and the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM), which were presented to a panel of experts who ordered them according to their degree of importance, using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSF for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on the quality of data, and obtaining top management commitment and support.
Keywords: critical success factors, data quality, data quality management, Delphi, Q-Sort
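The ordering step of such a Delphi/Q-sort round can be sketched as a mean-rank aggregation across experts. The factor labels below paraphrase the study's top five CSFs, but the expert rankings and the aggregation rule are invented for illustration; the actual Delphi procedure iterates over multiple rounds with feedback.

```python
# Sketch: aggregate expert orderings into one importance ranking,
# as in a single Delphi round with Q-sort. Rankings are invented.
from statistics import mean

csfs = ["policies and standards", "control of inputs",
        "strategic plan for DQ", "quality-focused culture",
        "top management support"]

# Each expert orders the factors from most (rank 1) to least important.
expert_rankings = [
    ["policies and standards", "strategic plan for DQ", "control of inputs",
     "top management support", "quality-focused culture"],
    ["control of inputs", "policies and standards", "strategic plan for DQ",
     "quality-focused culture", "top management support"],
    ["policies and standards", "control of inputs", "quality-focused culture",
     "strategic plan for DQ", "top management support"],
]

def aggregate(rankings, items):
    """Order items by mean rank position across experts (lower = more important)."""
    mean_rank = {i: mean(r.index(i) + 1 for r in rankings) for i in items}
    return sorted(items, key=lambda i: mean_rank[i])

final = aggregate(expert_rankings, csfs)
print(final[0])  # the factor the panel, on average, ranked highest
```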
Procedia PDF Downloads 216
24879 Application of Latent Class Analysis and Self-Organizing Maps for the Prediction of Treatment Outcomes for Chronic Fatigue Syndrome
Authors: Ben Clapperton, Daniel Stahl, Kimberley Goldsmith, Trudie Chalder
Abstract:
Chronic fatigue syndrome (CFS) is a condition characterised by chronic disabling fatigue and other symptoms that currently cannot be explained by any underlying medical condition. Although clinical trials support the effectiveness of cognitive behaviour therapy (CBT), the success rate for individual patients is modest. Patients vary in their response, and little is known about which factors predict or moderate treatment outcomes. The aim of the project is to develop a prediction model from baseline characteristics of patients, such as demographic, clinical and psychological variables, which may predict the likely treatment outcome, provide guidance for clinical decision making, and help clinicians recommend the best treatment. The project is aimed at identifying subgroups of patients with similar baseline characteristics that are predictive of treatment effects, using modern cluster analyses and data mining machine learning algorithms. The characteristics of these groups will then be used to inform the types of individuals who benefit from a specific treatment. In addition, the results will provide a better understanding of for whom the treatment works. The suitability of different clustering methods for identifying subgroups of CFS patients and their responses to different treatments is compared.
Keywords: chronic fatigue syndrome, latent class analysis, prediction modelling, self-organizing maps
Procedia PDF Downloads 224
24878 Geographic Information System Based Multi-Criteria Subsea Pipeline Route Optimisation
Authors: James Brown, Stella Kortekaas, Ian Finnie, George Zhang, Christine Devine, Neil Healy
Abstract:
The use of GIS as an analysis tool for engineering decision making is now best practice in the offshore industry. GIS enables multidisciplinary data integration, analysis and visualisation, which allows the presentation of large and intricate datasets in a simple map interface accessible to all project stakeholders. Presenting integrated geoscience and geotechnical data in GIS enables decision makers to be well-informed. This paper is a case study of how GIS spatial analysis techniques were successfully applied to select the most favourable pipeline route. Routing a pipeline through any natural environment faces numerous obstacles, whether topographical, geological, engineering or financial. Where the pipeline is subjected to external hydrostatic water pressure and is carrying pressurised hydrocarbons, the requirement to safely route the pipeline through hazardous terrain becomes absolutely paramount. This study illustrates how the application of modern, GIS-based pipeline routing techniques enabled the identification of a single most favourable pipeline route crossing a challenging seabed terrain. Conventional approaches to pipeline route determination focus on manual avoidance of primary constraints whilst endeavouring to minimise route length. Such an approach is qualitative and subjective, and is liable to bias towards the disciplines and expertise involved in the routing process. For very short routes traversing benign seabed topography in shallow water this approach may be sufficient, but for deepwater geohazardous sites an automated, multi-criteria, quantitative approach is essential. This study combined multiple routing constraints using modern least-cost-routing algorithms deployed in GIS, hitherto unachievable with conventional approaches. The least-cost-routing procedure begins with the assignment of geocost across the study area.
Geocost is defined as a numerical penalty score representing the hazard posed to the pipeline by each routing constraint (e.g. slope angle, rugosity, vulnerability to debris flows). All geocosted routing constraints are combined to generate a composite geocost map that is used to compute the least-geocost route between two defined terminals. The analyses were applied to select the most favourable pipeline route for a potential gas development in deep water. The study area is geologically complex, with a series of incised, potentially active canyons carved into a steep escarpment and evidence of extensive debris flows. A similar debris flow in the future could cause significant damage to a poorly placed pipeline. Protruding inter-canyon spurs offer lower-gradient options for ascending the escarpment, but their vulnerability to periodic failure is not well understood. Close collaboration between geoscientists, pipeline engineers, geotechnical engineers and, of course, the gas export pipeline operator guided the analyses and the assignment of geocosts. Shorter route length, less severe slope angles, and geohazard avoidance were the primary drivers in identifying the most favourable route.
Keywords: geocost, geohazard, pipeline route determination, pipeline route optimisation, spatial analysis
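The geocost-then-route procedure described above can be sketched in miniature: per-cell penalties from each constraint layer are summed into a composite raster, and Dijkstra's algorithm finds the minimum-geocost path between two terminals. The 5x5 grid, the two constraint layers (a "slope" ridge and a "debris" patch), and the 4-connectivity are illustrative assumptions standing in for the study's real rasters and GIS tooling.

```python
# Sketch: composite geocost surface + Dijkstra least-cost path.
import heapq

slope_cost =  [[1, 1, 9, 1, 1],
               [1, 1, 9, 1, 1],
               [1, 1, 9, 1, 1],
               [1, 1, 1, 1, 1],
               [1, 1, 1, 1, 1]]
debris_cost = [[0, 0, 0, 0, 0],
               [0, 0, 0, 9, 9],
               [0, 0, 0, 9, 9],
               [0, 0, 0, 0, 0],
               [0, 0, 0, 0, 0]]

# Composite geocost: sum of the individual constraint layers.
geocost = [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(slope_cost, debris_cost)]

def least_cost_path(cost, start, goal):
    """Dijkstra over the grid; a step costs the geocost of the entered cell."""
    rows, cols = len(cost), len(cost[0])
    best = {start: 0}
    queue = [(0, start)]
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            return d
        if d > best.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    heapq.heappush(queue, (nd, (nr, nc)))
    return None

total = least_cost_path(geocost, (0, 0), (0, 4))  # cheapest crossing of the ridge
```

Here the route accepts one high-penalty ridge cell rather than detouring through the debris patch; with real layers, the same trade-off between route length and hazard falls out of the geocost weighting.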
Procedia PDF Downloads 405
24877 Hydro-Meteorological Vulnerability and Planning in Urban Area: The Case of Yaoundé City in Cameroon
Authors: Ouabo Emmanuel Romaric, Amougou Armathe
Abstract:
Background and aim: Studying the impacts of floods and landslides at a small scale, specifically in the urban areas of developing countries, provides tools for the actors responsible for better risk management in such areas, which are now being affected by climate change. The main objective of this study is to assess the hydrometeorological vulnerabilities associated with flooding and urban landslides and to propose adaptation measures. Methods: Climatic data analyses were carried out by calculating indices of climate change over the period 1960-2012. Analysis of field data to determine the causes, the level of risk and its consequences in the study area was carried out using SPSS 18 software. Cartographic analysis and GIS were used to refine the work spatially. Spatial and terrain analyses were then carried out to determine the morphology of the terrain in relation to floods and landslides, and their diffusion across the area. Results: The interannual analysis of precipitation highlighted surplus years (21), deficit years (24) and normal years (7). The Barakat method brings out an evolution of precipitation through abrupt shifts and jumps. Floods and landslides are correlated with high precipitation during surplus and normal years. Field data analyses show that the population is conscious (78%) of the risks, with 74% of them exposed, but their capacity for adaptation is very low (51%). Floods are the main risk. The soils are classified as ferralitic (80%), hydromorphic (15%) and raw mineral (5%). The slope variation (5% to 15%) of small hills and deep valleys, together with unregulated construction, favours floods and landslides during heavy precipitation. Mismanagement of waste blocks the free circulation of rivers and accentuates floods. Conclusion: The vulnerability of the population to hydrometeorological risks in Yaoundé VI results from the combination of changes in parameters such as precipitation and temperature due to climate change and the poor planning of construction in urban areas. Because water lacks channels in which to circulate owing to soil saturation, and given the increase in heavy precipitation and the mismanagement of waste, the result is floods and landslides that cause great damage to goods and people.
Keywords: climate change, floods, hydrometeorological, vulnerability
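The surplus/deficit/normal classification of years used above can be sketched by standardizing each year's precipitation against the series mean. The ±0.25 standard-deviation threshold and the rainfall figures below are illustrative assumptions, not the study's data or criterion.

```python
# Sketch: label years as surplus, deficit or normal from annual precipitation
# using the standardized anomaly against the long-term mean.
from statistics import mean, pstdev

def classify_years(annual_mm, threshold=0.25):
    """Label each year by its standardized anomaly (z) against the series mean."""
    m, s = mean(annual_mm.values()), pstdev(annual_mm.values())
    labels = {}
    for year, p in annual_mm.items():
        z = (p - m) / s
        labels[year] = ("surplus" if z > threshold
                        else "deficit" if z < -threshold
                        else "normal")
    return labels

# Invented annual totals (mm) for a handful of years:
rain = {1960: 1750, 1961: 1420, 1962: 1600, 1963: 1900, 1964: 1380}
labels = classify_years(rain)
```

Over the full 1960-2012 series, the same rule would yield counts of surplus, deficit and normal years comparable to the 21/24/7 split reported above.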
Procedia PDF Downloads 465
24876 Geographic Mapping of Tourism in Rural Areas: A Case Study of Cumbria, United Kingdom
Authors: Emma Pope, Demos Parapanos
Abstract:
Rural tourism has become more visible and prevalent, with tourists increasingly seeking authentic experiences. This movement accelerated post-Covid, putting destinations in danger of reaching levels of saturation known as 'overtourism'. Whereas the phenomenon of overtourism has been frequently discussed in the urban context by academics and practitioners over recent years, it has hardly been referred to in the context of rural tourism, where it is perhaps even more difficult to manage. Rural tourism was historically considered small-scale, marked by its traditional character and by having little impact on nature and rural society. The increasing number of rural areas experiencing overtourism, however, demonstrates the need for new approaches, especially as the impacts and enablers of overtourism are context specific. Cumbria, with approximately 47 million visitors each year and 23,000 operational enterprises, is one of these rural areas experiencing overtourism in the UK. Using the county of Cumbria as an example, this paper aims to explore better planning and management in rural destinations by clustering the area into rural and 'urban-rural' tourism zones. To achieve this aim, the study uses secondary data from a variety of sources to identify variables relating to visitor economy development and demand. These data include census data relating to population and employment; tourism industry-specific data including tourism revenue, visitor activities, and accommodation stock; and big data sources such as TripAdvisor and AllTrails. The combination of these sources provides a breadth of tourism-related variables. The subsequent analysis draws upon various validated models, for example, tourism and hospitality employment density, territorial tourism pressure, and accommodation density. In addition to these statistical calculations, other data are utilized to further understand the context of these zones, for example, tourist services, attractions, and activities. The data were imported into ArcGIS, where the density of the different variables is visualized on maps. This study aims to provide an understanding of the geographical context of visitor economy development and tourist behavior in rural areas. The findings contribute to an understanding of the spatial dynamics of tourism within the region of Cumbria through the creation of thematized maps. Different zones of tourism industry clusters are identified, which include elements relating to attractions, enterprises, infrastructure, tourism employment and economic impact. These maps visualize hot and cold spots relating to a variety of tourism contexts. It is believed that the strategy used to provide a visual overview of tourism development and demand in Cumbria could serve as a strategic tool for rural areas to better plan marketing opportunities and avoid overtourism. These findings can inform future sustainability policy and destination management strategies within the areas through an understanding of the processes behind the emergence of both hot and cold spots. In particular, the 'attract and disperse' approach may need to be reviewed as a strategic option; in other words, sector or zonal policies can be applied to individual hot or cold areas, with transitional zones dependent upon local economic, social and environmental factors.
Keywords: overtourism, rural tourism, sustainable tourism, tourism planning, tourism zones
Procedia PDF Downloads 73
24875 Examining Litter Distributions in Lethbridge, Alberta, Canada, Using Citizen Science and GIS Methods: OpenLitterMap App and Story Maps
Authors: Tali Neta
Abstract:
Humans’ impact on the environment has been incredibly harsh, with enormous amounts of plastic and other pollutants (e.g., cigarette butts, paper cups, tires) worldwide. On land, litter costs taxpayers a fortune. Most litter pollution originates on land, yet it is one of the greatest hazards to marine environments. Due to spatial and temporal limitations, previous litter data covered very small areas. Currently, smartphones can be used to obtain information on various pollutants (through citizen science), and they can greatly assist in acknowledging and mitigating the environmental impact of litter. Litter app data, such as Litterati data, are available only through a global map; they are not available for download, and it is not clear whether irrelevant hashtags have been eliminated. Instagram and Twitter open-source geospatial data are available for download; however, these are considered inaccurate, computationally challenging, and difficult to quantify, so the resulting data are of poor quality. Other downloadable geospatial data (e.g., Marine Debris Tracker8 and Clean Swell10) are focused on marine rather than terrestrial litter. Therefore, accurate geospatial documentation of terrestrial litter distribution is needed to improve environmental awareness. The current research employed citizen science to examine litter distribution in Lethbridge, Alberta, Canada, using the OpenLitterMap (OLM) app. The OLM app is an application used to track litter worldwide; it can mark litter locations through photo georeferencing, which can be presented through GIS-designed maps. The OLM app provides open-source data that can be downloaded. It also offers information on various litter types and "hot spots" where litter accumulates. In this study, Lethbridge College students collected litter data with the OLM app. The students produced GIS Story Maps (interactive web GIS illustrations) and presented these to school children to improve awareness of litter's impact on environmental health. Preliminary results indicate that towards the eastern edges of the Lethbridge coulees (valleys), the amount of litter increases significantly due to the presence of shrubs, which act as litter catches. As wind generally travels from west to east in Lethbridge, litter from West Lethbridge often finds its way down into the east part of the coulees. The students documented various litter types, the majority (75%) being plastic and paper food packaging. They also found metal wires, broken glass, plastic bottles, golf balls, and tires. Presentations of the Story Maps to school children had a significant impact: the children voluntarily collected litter during school recess and looked into solutions to reduce litter. Further documentation of litter distribution through citizen science is needed to improve public awareness. Future research will focus on drone imagery of highly concentrated litter areas. Finally, a time series analysis of litter distribution will help determine whether public education through citizen science and Story Maps can assist in reducing litter and reaching a cleaner and healthier environment.
Keywords: citizen science, litter pollution, Open Litter Map, GIS Story Map
Procedia PDF Downloads 79
24874 Implementation of Successive Interference Cancellation Algorithms in the 5G Downlink
Authors: Mokrani Mohamed Amine
Abstract:
In this paper, we have implemented successive interference cancellation (SIC) algorithms in the 5G downlink. We calculated the maximum throughput in Frequency Division Duplex (FDD) mode in the downlink, obtaining a value of 836932 b/ms. The transmitter is a Multiple Input Multiple Output (MIMO) system with eight transmitting and eight receiving antennas. Each of the eight antennas simultaneously transmits a data rate of 104616 b/ms containing the binary messages of the three users; in this case, the Cyclic Redundancy Check (CRC) is negligible, and the MIMO category is spatial diversity. The technology used is Non-Orthogonal Multiple Access (NOMA) with Quadrature Phase Shift Keying (QPSK) modulation. Transmission takes place over a Rayleigh fading channel in the presence of obstacles. By applying the steps involved in SIC, the MIMO SIC receiver with two transmitting and receiving antennas recovers its binary message without errors for certain transmission powers such as 50 dBm; for user 1, the error rate is 0.054485% at 20 dBm and 0.00286763% at 32 dBm, while for user 2 it is 0.0114705% at 20 dBm and 0.00286763% at 24 dBm.
Keywords: 5G, NOMA, QPSK, TBS, LDPC, SIC, capacity
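The SIC steps mentioned above can be sketched for the simplest case: two power-domain NOMA users sharing one antenna, where the receiver first decodes the far user's higher-power QPSK symbol (treating its own as noise), subtracts it, and then decodes its own symbol. The 0.8/0.2 power split and the noiseless channel are illustrative assumptions, not the paper's eight-antenna Rayleigh-fading simulation settings.

```python
# Sketch: power-domain NOMA superposition and SIC decoding for two users.
import math

QPSK = [(1+1j), (1-1j), (-1+1j), (-1-1j)]  # constellation, up to scaling

def nearest(symbol, constellation):
    """Hard decision: closest constellation point."""
    return min(constellation, key=lambda s: abs(symbol - s))

def sic_receive(y, p_far=0.8, p_near=0.2):
    """Decode the far user's symbol, cancel it, then decode the near user's."""
    far_hat = nearest(y / math.sqrt(p_far), QPSK)   # near signal treated as noise
    residual = y - math.sqrt(p_far) * far_hat       # cancel the far component
    near_hat = nearest(residual / math.sqrt(p_near), QPSK)
    return far_hat, near_hat

far_sym, near_sym = (1-1j), (-1+1j)
y = math.sqrt(0.8) * far_sym + math.sqrt(0.2) * near_sym  # superposed downlink signal
decoded = sic_receive(y)  # both messages recovered error-free without noise
```

With additive noise and fading, the residual after cancellation is imperfect, which is what produces the small non-zero error rates quoted in the abstract at lower transmit powers.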
Procedia PDF Downloads 102
24873 Hyperspectral Mapping Methods for Differentiating Mangrove Species along Karachi Coast
Authors: Sher Muhammad, Mirza Muhammad Waqar
Abstract:
It is necessary to monitor and identify mangroves types and spatial extent near coastal areas because it plays an important role in coastal ecosystem and environmental protection. This research aims at identifying and mapping mangroves types along Karachi coast ranging from 24.79 to 24.85 degree in latitude and 66.91 to 66.97 degree in longitude using hyperspectral remote sensing data and techniques. Image acquired during February, 2012 through Hyperion sensor have been used for this research. Image preprocessing includes geometric and radiometric correction followed by Minimum Noise Fraction (MNF) and Pixel Purity Index (PPI). The output of MNF and PPI has been analyzed by visualizing it in n-dimensions for end-member extraction. Well-distributed clusters on the n-dimensional scatter plot have been selected with the region of interest (ROI) tool as end members. These end members have been used as an input for classification techniques applied to identify and map mangroves species including Spectral Angle Mapper (SAM), Spectral Feature Fitting (SFF), and Spectral Information Diversion (SID). Only two types of mangroves namely Avicennia Marina (white mangroves) and Avicennia Germinans (black mangroves) have been observed throughout the study area.Keywords: mangrove, hyperspectral, hyperion, SAM, SFF, SID
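Of the three classifiers named, the Spectral Angle Mapper has the simplest rule: each pixel is assigned to the endmember whose spectrum subtends the smallest angle with it, which makes the match insensitive to illumination (scaling) differences. A minimal sketch — the endmember spectra, cube, and angle threshold are invented for illustration:

```python
import numpy as np

def spectral_angle(pixel, endmember):
    """Angle (radians) between a pixel spectrum and a reference endmember."""
    cos = np.dot(pixel, endmember) / (np.linalg.norm(pixel) * np.linalg.norm(endmember))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def sam_classify(cube, endmembers, max_angle=0.1):
    """Label each pixel with the closest endmember, or -1 if none is within max_angle."""
    h, w, bands = cube.shape
    pixels = cube.reshape(-1, bands)
    angles = np.array([[spectral_angle(p, e) for e in endmembers] for p in pixels])
    labels = angles.argmin(axis=1)
    labels[angles.min(axis=1) > max_angle] = -1  # unclassified
    return labels.reshape(h, w)

# Invented three-band endmember spectra for the two species:
white_mangrove = np.array([0.2, 0.5, 0.9])
black_mangrove = np.array([0.9, 0.5, 0.2])

# A 2x2 toy "image": scaled copies of each endmember plus one flat spectrum.
cube = np.array([[2.0 * white_mangrove, 0.5 * black_mangrove],
                 [1.0 * white_mangrove, np.array([1.0, 1.0, 1.0])]])
labels = sam_classify(cube, [white_mangrove, black_mangrove])
```

The brighter (scaled) pixels still classify correctly, while the flat spectrum falls outside the angle threshold and is left unclassified.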
Procedia PDF Downloads 361
24872 Spatial Distribution of Land Use in the North Canal of Beijing Subsidiary Center and Its Impact on the Water Quality
Authors: Alisa Salimova, Jiane Zuo, Christopher Homer
Abstract:
The objective of this study is to analyse land use in the North Canal riparian zone through remote sensing analysis in ArcGIS, using 30 cloudless Landsat 8 open-source satellite images from May to August of 2013 and 2017. Land cover, urban construction, heat island effect, vegetation cover, and water system change were chosen as the main parameters and further analysed to evaluate their impact on North Canal water quality. The methodology involved the following steps. First, 30 cloudless satellite images were collected from the Landsat TM open-source image database. The visual interpretation method was used to determine the different land types in the catchment area; after primary and secondary classification, 28 land cover types in total were identified. Visual interpretation with the help of ArcGIS was used for grassland monitoring, and Landsat TM remote sensing imagery with a resolution of 30 meters was used to analyse the vegetation cover. The water system was analysed using visual interpretation on the GIS software platform to decode the target area, water use, and coverage. Monthly measurements of water temperature, pH, BOD, COD, ammonia nitrogen, total nitrogen, and total phosphorus in 2013 and 2017 were taken at three locations on the North Canal in Tongzhou district. These parameters were used for water quality index calculation and compared to land-use changes. The results of this research were promising. The vegetation coverage of the North Canal riparian zone in 2017 was higher than in 2013. The surface brightness temperature was positively correlated with vegetation coverage density and with distance from the surface of the water bodies, indicating that vegetation coverage and the water system have a great effect on temperature regulation and the urban heat island effect. Surface temperature in 2017 was higher than in 2013, indicating a global warming effect. The water volume in the river area has been partially reduced, indicating a potential water scarcity risk in the North Canal watershed. Between 2013 and 2017, urban residential, industrial, and mining-storage land areas increased significantly compared to other land use types; however, water quality improved significantly in 2017 compared to 2013. This observation indicates that the Tongzhou Water Restoration Plan showed positive results and that water management in Tongzhou district had improved. Keywords: North Canal, land use, riparian vegetation, river ecology, remote sensing
Procedia PDF Downloads 109
24871 Incorporation of Growth Factors onto Hydrogels via Peptide Mediated Binding for Development of Vascular Networks
Authors: Katie Kilgour, Brendan Turner, Carly Catella, Michael Daniele, Stefano Menegatti
Abstract:
In vivo, the extracellular matrix (ECM) provides biochemical and mechanical properties that instruct resident cells to form complex tissues with characteristics to develop and support vascular networks. In vitro, the development of vascular networks can be guided by biochemical patterning of substrates, via the spatial distribution and display of peptides and growth factors, to prompt cell adhesion, differentiation, and proliferation. We have developed a technique utilizing peptide ligands that specifically bind vascular endothelial growth factor (VEGF), erythropoietin (EPO), or angiopoietin-1 (ANG1) to spatiotemporally distribute growth factors to cells. This allows for the controlled release of each growth factor, ultimately enhancing the formation of a vascular network. Our engineered tissue constructs (ETCs) are fabricated from gelatin methacryloyl (GelMA), an ideal substrate for tailored stiffness and bio-functionality, and covalently patterned with growth-factor-specific peptides. These peptides mimic growth factor receptors, facilitating non-covalent binding of the growth factors to the ETC and allowing facile uptake by the cells. In the absence of cells, we have demonstrated the binding affinity of VEGF, EPO, and ANG1 for their respective peptides and the ability of each to be patterned onto a GelMA substrate. The ability to organize growth factors on an ETC provides different functionality for developing organized vascular networks. Our results demonstrate a method to incorporate biochemical cues into ETCs that enables spatial and temporal control of growth factors. Future efforts will investigate the cellular response by evaluating gene expression, quantifying angiogenic activity, and measuring the speed of growth factor consumption. Keywords: growth factor, hydrogel, peptide, angiogenesis, vascular, patterning
Procedia PDF Downloads 162
24870 Assessment of Risk Factors in Residential Areas of Bosso in Minna, Nigeria
Authors: Junaid Asimiyu Mohammed, Olakunle Docas Tosin
Abstract:
The housing environment in many developing countries is fraught with risks that have potential negative impacts on the lives of residents. The study examined the risk factors in residential areas of two neighborhoods in the Bosso Local Government Area of Minna, Nigeria, with a view to determining the level of their potential impacts. A sample of 378 households was drawn from an estimated population of 22,751 household heads. A questionnaire and direct observation were used as instruments for data collection. The data collected were analyzed using the Relative Importance Index (RII) rule to determine the level of potential impact of the risk factors, while ArcGIS was used to map the spatial distribution of the risks. The study established that the housing environment of the Angwan Biri and El-Waziri areas of Bosso is poor and vulnerable, as 26% of the houses were not habitable and 57% were only fairly habitable. The risks of epidemics, building collapse, and rainstorms were evident in the area: 53% of the houses had poor ventilation, 20% of residents had no access to toilets, 47% practiced open waste dumping, 46% of the houses had cracked walls, and 52% of the roofs were weak and sagging. The analysis of the potential impact of the risk factors yielded an RII score of 0.528 for building collapse, 0.758 for rainstorms, and 0.830 for epidemics, indicating a moderate to very high level of potential impacts. The mean RII score of 0.639 shows a significant potential impact of the risk factors. The study recommends the implementation of sanitation measures, provision of basic urban facilities, and neighborhood revitalization through housing infrastructure retrofitting as measures to mitigate the risks of disasters and improve the living conditions of residents of the study area. Keywords: assessment, risk, residential, Nigeria
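The RII rule used here is conventionally computed as RII = ΣW / (A × N), where W are the weights respondents assign on the rating scale, A is the highest possible weight, and N is the number of respondents. A minimal sketch with made-up ratings (not the study's survey data):

```python
def relative_importance_index(ratings, max_weight=5):
    """RII = sum(W) / (A * N); 0 < RII <= 1, higher means greater potential impact."""
    return sum(ratings) / (max_weight * len(ratings))

# Hypothetical 1-5 ratings from five household heads for one risk factor:
epidemic_ratings = [5, 4, 5, 3, 4]
rii = relative_importance_index(epidemic_ratings)  # 21 / (5 * 5) = 0.84
```

An RII near 1.0 corresponds to the "very high" potential-impact band the abstract reports for epidemics.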
Procedia PDF Downloads 56
24869 Analysis of Weather Variability Impact on Yields of Some Crops in Southwest, Nigeria
Authors: Olumuyiwa Idowu Ojo, Oluwatobi Peter Olowo
Abstract:
The study developed a Geographical Information Systems (GIS) database and mapped inter-annual changes in the yields of cassava, cowpea, maize, rice, melon, and yam as a response to inter-annual rainfall and temperature variability in Southwest Nigeria. The aim of this project is a comparative analysis of the impact of weather variability on the yields of six crops (rice, melon, yam, cassava, maize, and cowpea) in the southwestern states of Nigeria (Oyo, Osun, Ekiti, Ondo, Ogun, and Lagos) from 1991 to 2007. The data were imported and analysed in the ArcGIS 9.3 software environment. The various parameters (temperature, rainfall, crop yields) were interpolated using the kriging method, and the results generated through interpolation were clipped to the study area. Geographically weighted regression was chosen from the spatial statistics toolbox in ArcGIS 9.3 to analyse and predict the relationship between temperature, rainfall, and the different crops (cowpea, maize, rice, melon, yam, and cassava). Keywords: GIS, crop yields, comparative analysis, temperature, rainfall, weather variability
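The kriging step was run inside ArcGIS; as a stand-in outside that environment, a simple inverse-distance-weighting interpolator captures the same idea of estimating rainfall or temperature at unsampled locations from nearby stations. The station coordinates and readings below are invented for illustration:

```python
import numpy as np

def idw_interpolate(stations_xy, values, query_xy, power=2):
    """Inverse-distance-weighted estimate of a field at query points."""
    stations_xy = np.asarray(stations_xy, float)
    values = np.asarray(values, float)
    estimates = []
    for q in np.asarray(query_xy, float):
        d = np.linalg.norm(stations_xy - q, axis=1)
        if d.min() == 0.0:                      # query coincides with a station
            estimates.append(values[d.argmin()])
            continue
        w = 1.0 / d ** power                    # closer stations weigh more
        estimates.append(np.sum(w * values) / np.sum(w))
    return np.array(estimates)

# Invented rain-gauge readings (mm) at two station coordinates:
rainfall = idw_interpolate([(0.0, 0.0), (1.0, 0.0)], [10.0, 20.0],
                           [(0.5, 0.0), (0.0, 0.0)])
```

Unlike kriging, IDW has no variogram model, so it gives no error estimate; it is only a quick approximation of the interpolated surface.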
Procedia PDF Downloads 322
24868 Privacy Concerns and Law Enforcement Data Collection to Tackle Domestic and Sexual Violence
Authors: Francesca Radice
Abstract:
In Australia, domestic and sexual violence causes, on average, one female death per week due to intimate partner violence. 83% of couples meet online, and intercepting domestic and sexual violence at this level would be beneficial. Violent or coercive behaviour has been observed in initial conversations on dating apps like Tinder. Child pornography, stalking, and coercive control are among the criminal offences arising from dating apps, and women have been murdered after finding partners through Tinder. Police databases and predictive policing are novel approaches taken to prevent crime before harm is done. This research will investigate how police databases can be used in a privacy-preserving way to characterise users in terms of their potential for violent crime. Using the COPS database of the NSW Police, we will explore how a past criminal record can be interpreted to yield a category of potential danger for each dating app user. It is then up to each subscriber's judgement what degree of potential danger they are prepared to accept. Sentiment analysis is an area where research into natural language processing has made great progress over the last decade. This research will investigate how sentiment analysis can be used to interpret exchanges between dating app users to detect manipulative or coercive sentiments, which can be used to alert law enforcement if they continue for a defined number of communications. One potential problem of this approach is the prejudice a categorisation can cause; another drawback is the possibility of misinterpreting communications and involving law enforcement without reason. The approach will be thoroughly tested with cross-checks by human readers, who will verify both the level of danger predicted from the criminal record and the sentiment detected from personal messages. Even if only a few violent crimes can be prevented, the approach will have a tangible value for real people. Keywords: sentiment analysis, data mining, predictive policing, virtual manipulation
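The proposed trigger, alerting only after manipulative sentiment persists for a defined number of communications, can be sketched with a toy lexicon scorer. The term list and both thresholds below are invented placeholders for a real trained sentiment model:

```python
import re

# Tiny illustrative lexicon -- a deployed system would use a trained NLP model.
COERCIVE_TERMS = {"must", "obey", "forbid", "never", "punish", "own"}

def coercion_score(message):
    """Fraction of tokens in a message that match the coercive lexicon."""
    tokens = re.findall(r"[a-z']+", message.lower())
    if not tokens:
        return 0.0
    hits = sum(t in COERCIVE_TERMS for t in tokens)
    return hits / len(tokens)

def should_alert(messages, score_threshold=0.15, repeat_threshold=3):
    """Alert only after a defined number of high-scoring messages,
    reducing the risk of flagging a single misread remark."""
    flagged = sum(coercion_score(m) >= score_threshold for m in messages)
    return flagged >= repeat_threshold

alert = should_alert(["You must obey me",
                      "Never leave without asking",
                      "I forbid you to go out",
                      "Nice weather today"])
no_alert = should_alert(["See you at 7", "That movie was great"])
```

The repeat threshold is the design lever for the false-positive concern the abstract raises: one flagged message never triggers law enforcement on its own.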
Procedia PDF Downloads 77
24867 Virtual Metering and Prediction of Heating, Ventilation, and Air Conditioning Systems Energy Consumption by Using Artificial Intelligence
Authors: Pooria Norouzi, Nicholas Tsang, Adam van der Goes, Joseph Yu, Douglas Zheng, Sirine Maleej
Abstract:
In this study, virtual meters are designed and used for energy balance measurements of an air handling unit (AHU). The method aims to replace traditional physical sensors in heating, ventilation, and air conditioning (HVAC) systems with simulated virtual meters. Because they are difficult to manage and monitor, many HVAC systems suffer a high level of inefficiency and energy wastage. Virtual meters are implemented and applied in an actual HVAC system, and the results confirm the practicality of mathematical sensors as an alternative means of energy measurement. Most residential buildings and offices are not equipped with advanced sensors, and adding, exploiting, and monitoring sensors and measurement devices in existing systems can cost thousands of dollars. The first purpose of this study is therefore to provide an energy consumption rate based on available sensors, without any physical energy meters, demonstrating that virtual meters can serve as reliable measurement devices in HVAC systems. To demonstrate this concept, mathematical models are created for AHU-07, located in building NE01 of the British Columbia Institute of Technology (BCIT) Burnaby campus. The models are integrated with the system's historical data and physical spot measurements, and the actual measurements are examined to establish the models' accuracy. Based on preliminary analysis, the resulting mathematical models successfully reproduce energy consumption patterns, and we conclude that the results of the virtual meters will be close to those that physical meters could achieve. In the second part of this study, the use of virtual meters is further assisted by artificial intelligence (AI) in the HVAC systems of buildings to improve energy management and efficiency. Through a data mining approach, virtual meter data are recorded as historical data, and HVAC system energy consumption prediction is implemented in order to harness large energy savings and manage the demand and supply chain effectively. Energy prediction can lead to energy-saving strategies and opens a window to predictive control aimed at lower energy consumption; it can optimize the HVAC system and automate the capture of savings. This study also investigates the possibility of AI solutions for autonomous HVAC efficiency that would allow a quick and efficient response to energy consumption and cost spikes in the energy market. Keywords: virtual meters, HVAC, artificial intelligence, energy consumption prediction
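A virtual meter of this kind replaces a physical energy meter with an energy-balance calculation over sensors the AHU already has. A minimal sketch of a sensible-heat coil meter; the constants and readings are illustrative, whereas the paper's actual models are fit to AHU-07's historical data:

```python
AIR_DENSITY = 1.2   # kg/m^3, nominal value for air near room conditions
CP_AIR = 1.005      # kJ/(kg*K), specific heat of air

def virtual_coil_power_kw(flow_m3_s, t_in_c, t_out_c):
    """Sensible coil load inferred from existing airflow and temperature
    sensors: Q = rho * V * cp * (T_in - T_out), in kW."""
    mass_flow = AIR_DENSITY * flow_m3_s          # kg/s
    return mass_flow * CP_AIR * (t_in_c - t_out_c)

# Illustrative readings: 2 m^3/s of air cooled from 25 C to 15 C.
cooling_kw = virtual_coil_power_kw(2.0, 25.0, 15.0)
```

Integrating this rate over the sampling interval yields the consumption history that the prediction models in the second part of the study would be trained on.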
Procedia PDF Downloads 104
24866 Structured-Ness and Contextual Retrieval Underlie Language Comprehension
Authors: Yao-Ying Lai, Maria Pinango, Ashwini Deo
Abstract:
While grammatical devices are essential to language processing, how comprehension utilizes cognitive mechanisms is less emphasized. This study addresses this issue by probing the complement coercion phenomenon: an entity-denoting complement following verbs like begin and finish receives an eventive interpretation. For example, (1) “The queen began the book” receives an agentive reading like (2) “The queen began [reading/writing/etc.…] the book.” Such sentences engender additional processing cost in real-time comprehension. The traditional account attributes this cost to an operation that coerces the entity-denoting complement to an event, assuming that these verbs require eventive complements. However, in closer examination, examples like “Chapter 1 began the book” undermine this assumption. An alternative, Structured Individual (SI) hypothesis, proposes that the complement following aspectual verbs (AspV; e.g. begin, finish) is conceptualized as a structured individual, construed as an axis along various dimensions (e.g. spatial, eventive, temporal, informational). The composition of an animate subject and an AspV such as (1) engenders an ambiguity between an agentive reading along the eventive dimension like (2), and a constitutive reading along the informational/spatial dimension like (3) “[The story of the queen] began the book,” in which the subject is interpreted as a subpart of the complement denotation. Comprehenders need to resolve the ambiguity by searching contextual information, resulting in additional cost. To evaluate the SI hypothesis, a questionnaire was employed. Method: Target AspV sentences such as “Shakespeare began the volume.” were preceded by one of the following types of context sentence: (A) Agentive-biasing, in which an event was mentioned (…writers often read…), (C) Constitutive-biasing, in which a constitutive meaning was hinted (Larry owns collections of Renaissance literature.), (N) Neutral context, which allowed both interpretations. 
Thirty-nine native speakers of English were asked to (i) rate each context-target sentence pair on a 1-5 scale (5 = fully understandable), and (ii) choose possible interpretations for the target sentence given the context. The SI hypothesis predicts that comprehension is harder for the Neutral condition than for the biasing conditions, because no contextual information is provided to resolve the ambiguity; comprehenders should also obtain the specific interpretation corresponding to the context type. Results: the (A) Agentive-biasing and (C) Constitutive-biasing conditions were rated higher than the (N) Neutral condition (p < .001), while all conditions were within the acceptable range (> 3.5 on the 1-5 scale). This suggests that when relevant contextual information is lacking, semantic ambiguity decreases comprehensibility. The interpretation task shows that participants selected the biased agentive/constitutive reading for conditions (A) and (C), respectively; for the Neutral condition, the agentive and constitutive readings were chosen equally often. Conclusion: these findings support the SI hypothesis: the meaning of AspV sentences is conceptualized as a parthood relation involving structured individuals. We argue that semantic representation makes reference to spatial structured-ness (an abstracted axis). To obtain an appropriate interpretation, comprehenders use contextual information to enrich the conceptual representation of the sentence in question. This study connects semantic structure to human conceptual structure and provides a processing model that incorporates contextual retrieval. Keywords: ambiguity resolution, contextual retrieval, spatial structured-ness, structured individual
Procedia PDF Downloads 331
24865 Coastal Flood Mapping of Vulnerability Due to Sea Level Rise and Extreme Weather Events: A Case Study of St. Ives, UK
Authors: S. Vavias, T. R. Brewer, T. S. Farewell
Abstract:
Coastal floods have been identified as an important natural hazard that can cause significant damage to populated built-up areas and related infrastructure, as well as to ecosystems and habitats. This study attempts to fill the gap associated with the development of preliminary assessments of coastal flood vulnerability for compliance with the EU Directive on the Assessment and Management of Flood Risks (2007/60/EC). In this context, a methodology has been created that takes into account three major parameters: the maximum wave run-up modelled from historical weather observations, the highest tide according to historic time series, and sea level rise projections due to climate change. A high-resolution digital terrain model (DTM) derived from LiDAR data has been used to integrate the estimated flood events in a GIS environment. The flood vulnerability map created shows potential risk areas and can play a crucial role in the coastal zone planning process. The proposed method has the potential to be a powerful tool for policy and decision makers in spatial planning and strategic management. Keywords: coastal floods, vulnerability mapping, climate change, extreme weather events
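The core overlay, combining the highest tide, modelled wave run-up, and projected sea level rise into one flood level and flagging DTM cells beneath it, can be sketched as follows. The elevations and levels are invented, not St. Ives data:

```python
import numpy as np

def flood_mask(dtm, highest_tide, wave_runup, sea_level_rise):
    """Flag DTM cells whose elevation lies at or below the combined flood level."""
    flood_level = highest_tide + wave_runup + sea_level_rise
    return dtm <= flood_level

# Toy 2x3 DTM of elevations in metres above datum:
dtm = np.array([[0.5, 1.2, 3.0],
                [0.8, 2.5, 4.1]])
mask = flood_mask(dtm, highest_tide=1.0, wave_runup=0.6, sea_level_rise=0.3)
```

In a GIS workflow the boolean mask becomes the vulnerability layer that is then intersected with buildings and infrastructure; a fuller model would also test hydraulic connectivity to the sea rather than elevation alone.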
Procedia PDF Downloads 394
24864 Comprehensive Study of Data Science
Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly
Abstract:
Today's generation is totally dependent on technology that uses data as its fuel. The present study covers innovations and developments in data science and gives an idea of how to use the data provided efficiently; it will help readers understand the core concepts of data science. The concept of artificial intelligence was introduced by Alan Turing, whose main principle was to create an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of its users. Data science comprises business understanding, data analysis, ethical concerns, programming languages, the various fields and sources of data, skills, and more, and its usage has evolved over the years. In this review article, we cover one part of data science, namely machine learning. Machine learning builds on data science: machines learn through experience, which helps them do their work more efficiently. This article includes a comparative study of human understanding and machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science is an important game changer in the lives of human beings. Since its advent, we have seen its benefits: it leads to a better understanding of people and of individual needs, and it has improved business strategies, the services businesses provide, forecasting, the ability to attain sustainable development, and more. This study also focuses on a better understanding of data science, which will help us create a better world. Keywords: data science, machine learning, data analytics, artificial intelligence
Procedia PDF Downloads 80
24863 Bioengineering of a Plant System to Sustainably Remove Heavy Metals and to Harvest Rare Earth Elements (REEs) from Industrial Wastes
Authors: Edmaritz Hernandez-Pagan, Kanjana Laosuntisuk, Alex Harris, Allison Haynes, David Buitrago, Michael Kudenov, Colleen Doherty
Abstract:
Rare earth elements (REEs) are critical metals for modern electronics, green technologies, and defense systems. However, due to their dispersed nature in the Earth's crust, frequent co-occurrence with radioactive materials, and similar chemical properties, acquiring and purifying REEs is costly and environmentally damaging, restricting access to these metals. Plants could serve as resources for bioengineering REE mining systems. Although there is limited information on how REEs affect plants at the cellular and molecular level, plants with high REE tolerance and hyperaccumulation have been identified. This dissertation aims to develop a plant-based system for harvesting REEs from industrial waste material, with a focus on acid mine drainage (AMD), a toxic coal mining product. The objectives are 1) to develop a non-destructive, in vivo method for REE detection in Phytolacca (an REE hyperaccumulator) utilizing fluorescence spectroscopy, with a primary focus on dysprosium; 2) to characterize the uptake of REEs and heavy metals in Phytolacca americana and Phytolacca acinosa (an REE hyperaccumulator) grown in AMD, for potential implementation in the plant-based system; and 3) to implement the REE detection method to identify REE-binding proteins and peptides that could enhance the uptake of, and selectivity for, targeted REEs in the plants used in the system. The candidates are known REE-binding peptides or proteins, orthologs of known metal-binding proteins from REE hyperaccumulator plants, and novel proteins and peptides identified by comparative plant transcriptomics. Lanmodulin, a high-affinity REE-binding protein from methylotrophic bacteria, is used as a benchmark for the REE-protein binding fluorescence assays and is expressed in A. thaliana to test for changes in REE tolerance and uptake. Keywords: phytomining, agromining, rare earth elements, pokeweed, phytolacca
Procedia PDF Downloads 13
24862 Geomechanical Technologies for Assessing Three-Dimensional Stability of Underground Excavations Utilizing Remote-Sensing, Finite Element Analysis, and Scientific Visualization
Authors: Kwang Chun, John Kemeny
Abstract:
Light detection and ranging (LiDAR) has become a prevalent remote-sensing technology in the geological fields due to its high precision and ease of use. One major application is to use the detailed geometrical information of underground structures as the basis for generating a three-dimensional numerical model that can be used in a geotechnical stability analysis such as FEM or DEM. To date, however, straightforward techniques for reconstructing a numerical model from scanned data of underground structures have not been well established or tested. In this paper, we propose a comprehensive approach integrating all the processes, from LiDAR scanning to finite element numerical analysis. The study focuses on converting LiDAR 3D point clouds of geologic structures containing complex surface geometries into a finite element model. This methodology has been applied to Kartchner Caverns in Arizona, where detailed underground and surface point clouds can be used for the analysis of underground stability. Numerical simulations were performed using the finite element code Abaqus and visualized with the 3D scientific visualization tool ParaView. The results are useful in studying the stability of all types of underground excavations, including underground mining and tunneling. Keywords: finite element analysis, LiDAR, remote-sensing, scientific visualization, underground stability
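One straightforward route from a LiDAR point cloud to finite elements, for surface-like geometry, is to triangulate the points in plan view so that each triangle becomes a linear element. This is only a sketch under that assumption; the point cloud below is synthetic, and a real cavern scan would need the more careful 3D surface reconstruction the paper develops:

```python
import numpy as np
from scipy.spatial import Delaunay

def surface_mesh_from_lidar(points_xyz):
    """Triangulate scanned surface points in plan view (x, y); each
    Delaunay simplex becomes a linear triangular element, and z is
    carried along as the nodal elevation."""
    pts = np.asarray(points_xyz, float)
    tri = Delaunay(pts[:, :2])
    return pts, tri.simplices      # nodes, element connectivity table

# Synthetic stand-in for a scanned surface patch:
rng = np.random.default_rng(1)
xy = rng.uniform(0.0, 10.0, size=(50, 2))
z = np.sin(xy[:, 0]) + 0.1 * xy[:, 1]
nodes, elements = surface_mesh_from_lidar(np.column_stack([xy, z]))
```

The node and connectivity arrays map directly onto the node and element cards of an FE input deck such as an Abaqus `.inp` file.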
Procedia PDF Downloads 170
24861 Integrating Dynamic Brain Connectivity and Transcriptomic Imaging in Major Depressive Disorder
Authors: Qingjin Liu, Jinpeng Niu, Kangjia Chen, Jiao Li, Huafu Chen, Wei Liao
Abstract:
Functional connectomics is essential in cognitive science and neuropsychiatry, offering insights into the brain's complex network structures and dynamic interactions. Although neuroimaging has uncovered functional connectivity issues in patients with Major Depressive Disorder (MDD), the dynamic shifts in connectome topology and their link to gene expression are yet to be fully understood. To explore the differences in dynamic connectome topology between MDD patients and healthy individuals, we conducted an extensive analysis of resting-state functional magnetic resonance imaging (fMRI) data from 434 participants (226 MDD patients and 208 controls). We used multilayer network models to evaluate brain module dynamics and examined the association between whole-brain gene expression and dynamic module variability in MDD using publicly available transcriptomic data. Our findings revealed that, compared to healthy individuals, MDD patients showed lower global mean values and higher standard deviations, indicating unstable patterns and increased regional differentiation. Notably, MDD patients exhibited more frequent module switching, primarily within the executive control network (ECN), particularly in the left dorsolateral prefrontal cortex and right fronto-insular regions, whereas the default mode network (DMN), including the superior frontal gyrus, temporal lobe, and right medial prefrontal cortex, displayed lower variability. These brain dynamics predicted the severity of depressive symptoms. Analyzing human brain gene expression data, we found that the spatial distribution of MDD-related gene expression correlated with the dynamic module differences. Cell-type-specific gene analyses identified oligodendrocyte precursor cells (OPCs) as major contributors to the transcriptional relationships underlying module variability in MDD. To the best of our knowledge, this is the first comprehensive description of altered brain module dynamics in MDD patients linked to depressive symptom severity and changes in whole-brain gene expression profiles. Keywords: major depressive disorder, module dynamics, magnetic resonance imaging, transcriptomic
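In multilayer network models of this kind, a region's module variability is often quantified as the fraction of consecutive time windows in which its module assignment changes. A minimal sketch with toy module labels (not the study's data):

```python
import numpy as np

def module_variability(assignments):
    """Per-region switching rate: the fraction of consecutive windows in
    which the module label changes. `assignments` is an
    (n_regions, n_windows) integer array of module labels."""
    a = np.asarray(assignments)
    switches = a[:, 1:] != a[:, :-1]   # True where the label changed
    return switches.mean(axis=1)

# Toy labels for three regions across four sliding windows:
variability = module_variability([[1, 1, 1, 1],    # stable region (DMN-like)
                                  [1, 2, 1, 2],    # switches every window (ECN-like)
                                  [1, 1, 2, 2]])   # one switch
```

Higher values correspond to the "more frequent module switching" the abstract reports for ECN regions, and lower values to the stable DMN regions.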
Procedia PDF Downloads 25
24860 The Axonal Connectivity of Motor and Premotor Areas as Revealed through Fiber Dissections: Shedding Light on the Structural Correlates of Complex Motor Behavior
Authors: Spyridon Komaitis, Christos Koutsarnakis, Evangelos Drosos, Aristotelis Kalyvas
Abstract:
This study sets out to investigate the intrinsic architecture, morphology, and spatial relationships of the subcortical pathways implicated in the connectivity of the motor/premotor cortex and the SMA/pre-SMA complex. Twenty normal, adult, formalin-fixed cerebral hemispheres were explored through the fiber micro-dissection technique. Lateral-to-medial and medial-to-lateral dissections focused on the area of interest were performed in tandem under the surgical microscope. We traced the subcortical architecture, spatial relationships, and axonal connectivity of four major pathways: a) the dorsal component of the SLF (SLF-I), which resides in the medial aspect of the hemisphere and connects the precuneus with the SMA and pre-SMA complex; b) the frontal longitudinal system (FLS), consistently encountered as the natural anterior continuation of the SLF-II and SLF-III, connecting the premotor and prefrontal cortices; c) the fronto-caudate tract (FCT), a fan-shaped tract participating in the connectivity of the prefrontal and premotor cortices to the head and body of the caudate nucleus; and d) the cortico-tegmental tract (CTT), invariably recorded to subserve the connectivity of the tegmental area with the fronto-parietal cortex. No hemispheric asymmetries were recorded for any of the implicated pathways. Sub-segmentation systems were also proposed for each of the aforementioned tracts. The structural connectivity and functional specialization of motor and premotor areas in the human brain remain vague to this day, as most of the available evidence derives from animal or tractographic studies. By using the fiber micro-dissection technique as our main method of investigation, we provide sound structural evidence on the delicate anatomy of the related white matter pathways. Keywords: neuroanatomy, premotor, motor, connectivity
Procedia PDF Downloads 126
24859 Scientific Investigation for an Ancient Egyptian Polychrome Wooden Stele
Authors: Ahmed Abdrabou, Medhat Abdalla
Abstract:
The studied stele dates back to the Third Intermediate Period (1075-664 BC) of ancient Egypt. It is made of wood and covered with painted gesso layers. This study aims to use a combination of multispectral imaging (visible, infrared (IR), visible-induced infrared luminescence (VIL), ultraviolet-induced luminescence (UVL), and ultraviolet reflected (UVR)) along with portable X-ray fluorescence in order to map and identify the pigments, as well as to provide a deeper understanding of the painting techniques; moreover, the authors were significantly interested in the identification of the wood species. Multispectral images were acquired in three spectral bands, ultraviolet (360-400 nm), visible (400-780 nm), and infrared (780-1100 nm), giving UVL, UVR, VIS, VIL, and infrared photographs. False color images were made by digitally editing the VIS with the IR or UV images using Adobe Photoshop. Optical microscopy (OM), portable X-ray fluorescence spectroscopy (p-XRF), and Fourier transform infrared spectroscopy (FTIR) were used in this study. The mapping and imaging techniques provided useful information about the spatial distribution of the pigments; in particular, visible-induced luminescence (VIL) allowed the spatial distribution of the Egyptian blue pigment to be mapped, so that every region containing Egyptian blue, down to single crystals in some instances, is clearly visible as a bright white area. Complete characterization of the pigments, however, requires the use of p-XRF spectroscopy. Based on the elemental analysis found by p-XRF, we conclude that the artists used mixtures of the basic mineral pigments to achieve a wider palette of hues. As for the identification of the wood species, microscopic examination indicated that the wood used was sycamore fig (Ficus sycomorus L.), which is recorded as being native to Egypt and was used to make wooden artifacts since at least the Fifth Dynasty. Keywords: polychrome wooden stele, multispectral imaging, IR luminescence, wood identification, sycamore fig, p-XRF
Procedia PDF Downloads 262
24858 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using GMMs, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. The low-frequency coefficients are preserved, and the high-frequency coefficients discarded, by applying a rectangular mask to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled with a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested on the FERET data sets and achieves a 92% recognition rate. Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
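The block-DFT feature extraction step can be sketched as follows. The block size, step, and mask size are illustrative choices; the retained low-frequency magnitudes per block would then be pooled across training images to fit the GMMs:

```python
import numpy as np

def local_spectrum_features(image, block=8, step=4, keep=4):
    """Slide an overlapping block x block window over the image, take the
    2D DFT of each block, and keep a keep x keep low-frequency corner of
    magnitudes (a simple rectangular mask) as that block's feature vector."""
    feats = []
    h, w = image.shape
    for r in range(0, h - block + 1, step):
        for c in range(0, w - block + 1, step):
            spec = np.fft.fft2(image[r:r + block, c:c + block])
            feats.append(np.abs(spec[:keep, :keep]).ravel())
    return np.array(feats)

# Synthetic stand-in for a 32x32 face crop:
face = np.random.default_rng(0).random((32, 32))
X = local_spectrum_features(face)   # one 16-dim feature vector per block
```

With these defaults, a 32x32 crop yields 49 overlapping blocks of 16 features each; fitting a Gaussian mixture to such vectors (e.g. with scikit-learn's `GaussianMixture`) and scoring per-identity likelihoods would complete the pipeline the abstract describes.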
Procedia PDF Downloads 665