Search results for: anti-cyber and information technology crimes law
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16358

11408 Implementation of Building Information Modelling to Monitor, Assess, and Control the Indoor Environmental Quality of Higher Education Buildings

Authors: Mukhtar Maigari

Abstract:

The landscape of Higher Education (HE) institutions, especially following the COVID-19 pandemic, necessitates advanced approaches to managing Indoor Environmental Quality (IEQ), which is crucial for the comfort, health, and productivity of students and staff. This study investigates the application of Building Information Modelling (BIM) as a multifaceted tool for monitoring, assessing, and controlling IEQ in HE buildings, aiming to bridge the gap between traditional management practices and the innovative capabilities of BIM. Central to the study is a comprehensive literature review, which lays the foundation by examining current knowledge and technological advancements in both IEQ and BIM. This review sets the stage for a deeper investigation into the practical application of BIM in IEQ management. The methodology consists of Post-Occupancy Evaluation (POE), encompassing physical monitoring, questionnaire surveys, and interviews under the umbrella of case studies. The physical data collection focuses on vital IEQ parameters such as temperature, humidity, and CO2 levels, measured with equipment including data loggers to ensure accurate data. Complementing this, questionnaire surveys gather perceptions and satisfaction levels from students, providing valuable insights into the subjective aspects of IEQ. The interview component, targeting facilities management teams, offers an in-depth perspective on IEQ management challenges and strategies. The research then develops a conceptual BIM-based framework, informed by the findings from the case studies and empirical data. This framework is designed to demonstrate the critical functions necessary for effective IEQ monitoring, assessment, control, and automation with real-time data handling capabilities, and it leads to the development and testing of a BIM-based prototype tool. This prototype leverages software such as Autodesk Revit with its visual programming tool, Dynamo, and an Arduino-based sensor network, allowing a real-time flow of IEQ data for monitoring, control, and even automation. By harnessing the capabilities of BIM technology, the study presents a forward-thinking approach that aligns with current sustainability and wellness goals, particularly vital in the post-COVID-19 era. The integration of BIM in IEQ management promises not only to enhance the health, comfort, and energy efficiency of educational environments but also to transform them into more conducive spaces for teaching and learning. Furthermore, this research could influence the future of HE buildings by prompting universities and government bodies to re-evaluate and improve teaching and learning environments. It demonstrates how the synergy between IEQ and BIM can empower stakeholders to monitor IEQ conditions more effectively and make informed decisions in real time. Moreover, the developed framework has broader applications: it can serve as a tool for other sustainability assessments, such as energy analysis in HE buildings, leveraging measured data synchronized with the BIM model. In conclusion, this study bridges the gap between theoretical research and real-world application by demonstrating in practice how advanced technologies like BIM can be effectively integrated to enhance environmental quality in educational institutions.
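
The kind of real-time IEQ pipeline the prototype describes can be reduced to a plain-Python sketch: sensor readings arrive as CSV lines (as an Arduino might emit over serial), are attached to a room identifier, and trigger simple control decisions. The thresholds and the line format are assumptions for illustration, not the prototype's actual design.

```python
# Sketch of a real-time IEQ reading/control loop. Thresholds and the
# "room,temp,rh,co2" line format are illustrative assumptions.
CO2_LIMIT_PPM = 1000
TEMP_RANGE_C = (19.0, 25.0)

def handle_reading(line):
    room, temp, rh, co2 = line.split(",")
    temp, rh, co2 = float(temp), float(rh), float(co2)
    actions = []
    if co2 > CO2_LIMIT_PPM:
        actions.append("increase_ventilation")
    if not (TEMP_RANGE_C[0] <= temp <= TEMP_RANGE_C[1]):
        actions.append("adjust_heating_cooling")
    return {"room": room, "temp": temp, "rh": rh, "co2": co2, "actions": actions}

# stand-in for a live serial stream from the sensor network
stream = ["LT1,21.5,45,850", "LT1,26.1,40,1200"]
for line in stream:
    print(handle_reading(line))
```

In a real deployment the decisions would be written back against BIM element identifiers rather than printed.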

Keywords: BIM, POE, IEQ, HE-buildings

Procedia PDF Downloads 42
11407 Carbon Accounting for Sustainable Design and Manufacturing in the Signage Industry

Authors: Prudvi Paresi, Fatemeh Javidan

Abstract:

In recent years, greenhouse gas emissions, and carbon emissions in particular, have received special attention from environmentalists and designers because they contribute significantly to the rise in global temperature. The building industry is one of the top seven industries contributing to embodied carbon emissions. Signage systems are an integral part of the building industry and bring completeness to a building's spaces by providing the required information and guidance. A significant amount of building materials, such as steel, aluminium, acrylic, and LEDs, is utilized in these systems, but very limited information is available on their sustainability and carbon footprint. There is therefore an urgent need to assess the emissions associated with the signage industry and to control them by adopting mitigation techniques without sacrificing the efficiency of the project. The present paper investigates the embodied carbon of two case studies in the Australian signage industry within the cradle-to-gate (A1-A3) and gate-to-site (A4-A5) stages. A material source-based database is used to achieve greater accuracy. The study identified aluminium as the major contributor to embodied carbon in the signage industry compared with the other constituents. Finally, strategies are suggested for mitigating embodied carbon in this industry.
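
As a rough illustration of the cradle-to-gate (A1-A3) and gate-to-site (A4-A5) accounting the paper describes, the sketch below tallies embodied carbon for a hypothetical sign assembly. The masses and emission factors are illustrative placeholders, not values from the study's database.

```python
# Sketch: cradle-to-gate embodied carbon tally for a sign assembly.
# Quantities and emission factors are hypothetical placeholders.
materials = {
    # material: (mass_kg, kgCO2e_per_kg for stages A1-A3)
    "aluminium": (12.0, 8.2),
    "steel": (20.0, 1.9),
    "acrylic": (5.0, 6.3),
}
transport_a4 = 15.0     # kgCO2e, transport to site (A4)
installation_a5 = 8.0   # kgCO2e, installation energy/waste (A5)

a1_a3 = sum(mass * factor for mass, factor in materials.values())
total = a1_a3 + transport_a4 + installation_a5
share = {m: mass * f / a1_a3 for m, (mass, f) in materials.items()}

print(f"A1-A3: {a1_a3:.1f} kgCO2e, A1-A5 total: {total:.1f} kgCO2e")
print(f"aluminium share of A1-A3: {share['aluminium']:.0%}")
```

Even with placeholder factors, the per-material shares show how one constituent (here aluminium) can dominate the embodied carbon of the assembly.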

Keywords: carbon accounting, small-scale construction, signage industry, construction materials

Procedia PDF Downloads 101
11406 Investigating the Associative Network of Color Terms among Turkish University Students: A Cognitive-Based Study

Authors: R. Güçlü, E. Küçüksakarya

Abstract:

Word association (WA) gives the broadest information on how knowledge is structured in the human mind. Cognitive linguistics, psycholinguistics, and applied linguistics are disciplines that consider WA tests substantial in gaining insights into the very nature of the human cognitive system and semantic knowledge. In this study, Berlin and Kay's 11 basic color terms (1969) were presented as stimulus words to a total of 300 Turkish university students. The responses are analyzed according to Fitzpatrick's model (2007), which comprises four categories: meaning-based responses, position-based responses, form-based responses, and erratic responses. In line with the findings, the responses to free association tests are expected to reveal much about Turkish university students' psychological structuring of vocabulary, especially the morpho-syntactic and semantic relationships among words. To conclude, theoretical and practical implications are discussed to provide an in-depth evaluation of how associations of basic color terms are represented in the mental lexicon of Turkish university students.

Keywords: color term, gender, mental lexicon, word association task

Procedia PDF Downloads 113
11405 Detection of Flood-Prone Areas Using Multi-Criteria Evaluation, Geographical Information Systems and Fuzzy Logic: The Ardas Basin Case

Authors: Vasileiou Apostolos, Theodosiou Chrysa, Tsitroulis Ioannis, Maris Fotios

Abstract:

The severity of extreme phenomena lies in their ability to cause severe damage in a short amount of time. Compared with all other annual natural disasters, floods affect the greatest number of people and cause the largest amount of damage. The detection of potential flood-prone areas constitutes one of the fundamental components of the European Natural Disaster Management Policy, directly connected to European Directive 2007/60. The aim of the present paper is to develop a new methodology that combines geographical information, fuzzy logic, and multi-criteria evaluation methods so that the most vulnerable areas can be identified. Ten factors related to the geophysical, morphological, climatological/meteorological, and hydrological characteristics of the basin were selected. Two models were then created to detect the areas most prone to flooding. The first model defined the weight of each factor using the Analytical Hierarchy Process (AHP), and the final map of possible flood spots was created using GIS and Boolean algebra. The second model combined fuzzy logic with GIS, and a corresponding map was created. Both methodologies were applied to the Ardas basin because of the frequent and significant floods that have occurred there in recent years. The results were then compared with the floods actually observed. The analysis shows that both models can detect possible flood spots with great precision. As the fuzzy logic model is less time-consuming, it is considered the better candidate for application to other areas. These results can contribute to the delineation of high-risk areas and to the creation of successful flood management plans.
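
To illustrate the AHP weighting step used in the first model, the following sketch derives factor weights from a pairwise-comparison matrix via the common geometric-mean approximation of the principal eigenvector. The three factors and the judgment values are hypothetical, not the paper's actual criteria.

```python
# Sketch: deriving factor weights from an AHP pairwise-comparison matrix
# using the geometric-mean approximation of the principal eigenvector.
# Factors and judgments below are hypothetical, not the paper's data.
import math

factors = ["slope", "rainfall", "drainage_density"]
# pairwise[i][j] = how much more important factor i is than factor j
pairwise = [
    [1.0, 2.0, 4.0],
    [0.5, 1.0, 2.0],
    [0.25, 0.5, 1.0],
]

def ahp_weights(matrix):
    geo_means = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

weights = ahp_weights(pairwise)
for name, w in zip(factors, weights):
    print(f"{name}: {w:.3f}")
```

The resulting weights would then multiply the normalized factor layers in GIS to produce the composite flood-susceptibility map.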

Keywords: analytical hierarchy process, flood prone areas, fuzzy logic, geographic information system

Procedia PDF Downloads 366
11404 Multimedia Container for Autonomous Car

Authors: Janusz Bobulski, Mariusz Kubanek

Abstract:

The main goal of the research is to develop a multimedia container structure containing three types of images: RGB, lidar, and infrared, properly calibrated to each other. An additional goal is to develop program libraries for creating, saving, and restoring this type of file. It will also be necessary to develop a method of synchronizing data from lidar, RGB, and infrared cameras. Autonomous cars are increasingly entering public awareness, and few doubt that self-driving cars are the future of motoring. Manufacturers promise that the first of them will reach showrooms within the next few years. Many experts believe that a network of communicating autonomous cars could eliminate accidents entirely. To make this possible, however, it is necessary to develop effective methods of detecting objects around the moving vehicle. In bad weather conditions, this task is difficult on the basis of the RGB (red, green, blue) image alone; in such situations, the system should be supported by information from other sources, such as lidar or infrared cameras. The problem is the different data formats that the individual types of devices return, along with the synchronization and formatting of these data. The goal of the project is therefore to develop a file structure that can contain different types of data. Such a file is called a multimedia container: a container holding many data streams, which allows complete multimedia material to be stored in one file. The data streams in such a container include images, video, sound, and subtitles, as well as additional information, i.e., metadata. This type of file could be used in autonomous vehicles, where it would facilitate data processing by the intelligent autonomous vehicle management system. As preliminary studies show, combining RGB and infrared images with lidar data allows for easier data analysis. Thanks to this approach, it will be possible to display the distance to an object in a color photo. Such information can be very useful for drivers and for the systems in autonomous cars.
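
As a sketch of what such a container might look like, the following toy format interleaves RGB, lidar, and infrared records keyed by a common timestamp. The magic number, record layout, and field names are assumptions for illustration; the paper does not specify its actual container structure.

```python
# Sketch: a minimal multimedia-container layout holding synchronized RGB,
# lidar, and infrared frames keyed by a shared timestamp. Format details
# (magic "MMC1", JSON headers, record layout) are illustrative assumptions.
import json, struct

def write_container(path, frames):
    """frames: list of dicts with 't' (ms) and 'rgb'/'lidar'/'ir' byte blobs."""
    with open(path, "wb") as f:
        f.write(b"MMC1")  # magic + version
        for fr in frames:
            for stream in ("rgb", "lidar", "ir"):
                blob = fr[stream]
                header = json.dumps({"t": fr["t"], "stream": stream}).encode()
                # each record: header length, payload length, header, payload
                f.write(struct.pack("<II", len(header), len(blob)))
                f.write(header + blob)

def read_container(path):
    with open(path, "rb") as f:
        assert f.read(4) == b"MMC1"
        records = []
        while chunk := f.read(8):
            hlen, plen = struct.unpack("<II", chunk)
            header = json.loads(f.read(hlen))
            records.append((header["t"], header["stream"], f.read(plen)))
        return records

write_container("demo.mmc", [{"t": 0, "rgb": b"\x01", "lidar": b"\x02", "ir": b"\x03"}])
print(read_container("demo.mmc"))
```

Because every record carries the same timestamp key, a reader can regroup the three sensor streams frame by frame regardless of the order in which devices delivered them.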

Keywords: an autonomous car, image processing, lidar, obstacle detection

Procedia PDF Downloads 211
11403 Assessment of the Effects of Water Harvesting Technology on Downstream Water Availability Using SWAT Model

Authors: Ayalkibet Mekonnen, Adane Abebe

Abstract:

In the hydrological cycle, there are many water-related human interventions that modify natural systems. Rainwater harvesting is one such intervention, involving the harnessing of water upstream. Water harvesting upstream reduces runoff downstream, potentially disturbing biodiversity and ecosystems. The main objective of the study is to assess the effects of water harvesting technologies on downstream water availability in the Woreda. To address this problem, the SWAT model, the benefit-cost ratio, and an optimal control approach were used to analyse the hydrological and socioeconomic impacts and trade-offs of water availability for the community. The downstream impacts of increasing water consumption in the upstream rain-fed areas of the Bilate and Shala catchments are simulated using the semi-distributed SWAT model. Two land use scenarios were tested at the sub-basin level: (1) conventional land use, representing current practice (Agri-CON), and (2) in-field rainwater harvesting (IRWH), which improves soil water availability through rainwater harvesting (Agri-IRWH). The simulated water balance results showed that the highest peak mean monthly direct flow was obtained from the Agri-CON land use (127.1 m3/ha), followed by the Agri-IRWH land use (11.5 mm) and LULC 2005 (90.1 m3/ha). The Agri-IRWH scenario reduced direct flow by 10% compared with Agri-CON, and more groundwater flow was contributed by Agri-IRWH (190 m3/ha) than by Agri-CON (125 m3/ha). The overall result suggests that the water yield of the Woreda may not be negatively affected by the Agri-IRWH land use scenario. The technology benefited the Woreda, with an average benefit-cost ratio of 4.2. Water harvesting for domestic use was not optimal, as the volume of water harvested fell short of demand. Storage tanks, series of check dams, and gravel-filled dams are alternative solutions for water harvesting.
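
The benefit-cost ratio cited above can be sketched as a discounted-cash-flow calculation. The figures and the 8% discount rate below are hypothetical and do not come from the study.

```python
# Sketch: benefit-cost ratio for a water-harvesting intervention using
# discounted annual benefits and costs. All figures are hypothetical.
def present_value(annual, rate, years):
    return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

capital_cost = 500.0        # initial investment
annual_benefit = 300.0      # e.g., value of retained water / crop gain
annual_maintenance = 20.0
rate, years = 0.08, 10

bcr = present_value(annual_benefit, rate, years) / (
    capital_cost + present_value(annual_maintenance, rate, years)
)
print(f"benefit-cost ratio = {bcr:.2f}")
```

A ratio above 1 indicates the discounted benefits exceed the discounted costs, which is the sense in which the study's average ratio of 4.2 is "positive".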

Keywords: water harvesting, SWAT model, land use scenario, Agri-CON, Agri-IRWH, trade off, benefit cost ratio

Procedia PDF Downloads 323
11402 Human Action Retrieval System Using Features Weight Updating Based Relevance Feedback Approach

Authors: Munaf Rashid

Abstract:

For content-based human action retrieval systems, search accuracy is often inferior for two reasons: (1) global information pertaining to videos is totally ignored, and only low-level motion descriptors are considered significant features for matching the similarity between query and database videos; and (2) there is a semantic gap between high-level user concepts and low-level visual features. In this paper, we propose a method that addresses both issues, and in doing so the paper contributes in two ways. Firstly, we introduce a method that uses both global and local information in one framework for the action retrieval task. Secondly, to minimize the semantic gap, the user's concept is incorporated through a features weight updating (FWU) relevance feedback (RF) approach. We use statistical characteristics to dynamically update the weights of the feature descriptors so that after every RF iteration the feature space is modified accordingly. For testing and validation, two human action recognition datasets were utilized, namely Weizmann and UCF. Results show that the proposed approach performs well even under a number of visual challenges.
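
A minimal sketch of one features-weight-updating iteration is shown below. The inverse-standard-deviation update rule is a common relevance-feedback heuristic and is an assumption here, not necessarily the paper's exact formula; the feature vectors are toy values.

```python
# Sketch: one feature-weight-updating relevance-feedback iteration.
# Weights are re-estimated from results the user marked relevant:
# features that vary little across relevant items get higher weight.
# The inverse-stdev rule is an assumed heuristic, not the paper's formula.
import statistics

relevant = [  # feature vectors of videos the user marked relevant
    [0.9, 0.2, 0.51],
    [0.8, 0.7, 0.49],
    [0.85, 0.4, 0.50],
]

def update_weights(relevant, eps=1e-6):
    dims = list(zip(*relevant))
    raw = [1.0 / (statistics.stdev(d) + eps) for d in dims]
    total = sum(raw)
    return [r / total for r in raw]

def weighted_distance(a, b, w):
    return sum(wi * (x - y) ** 2 for wi, x, y in zip(w, a, b)) ** 0.5

weights = update_weights(relevant)
print([round(w, 3) for w in weights])
```

After the update, `weighted_distance` ranks candidates with the feedback-adjusted feature space, so the next retrieval round emphasizes the stable third feature.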

Keywords: relevance feedback (RF), action retrieval, semantic gap, feature descriptor, codebook

Procedia PDF Downloads 456
11401 Application of a Lighting Design Method Using Mean Room Surface Exitance

Authors: Antonello Durante, James Duff, Kevin Kelly

Abstract:

The visual needs of people in modern workplaces are changing. The self-illuminated screens of computers, televisions, tablets, and smartphones have changed the relationship between people and the lit environment. In the past, lighting design practice was primarily based on providing uniform horizontal illuminance on the working plane, but this has failed to ensure good-quality lit environments. Today's lighting standards continue to follow a 100-year-old approach that, at its core, considers task illuminance of the utmost importance, with that task typically located on a horizontal plane. An alternative method focused on appearance has been proposed, as opposed to the traditional performance-based approach. Mean Room Surface Exitance (MRSE) and Target-Ambient Illuminance Ratio (TAIR) are two new metrics proposed to assess illumination adequacy in interiors. The hypothesis is that these metrics will be superior to the existing, horizontal-illuminance-led ones. For the past six years, research within the Dublin Institute of Technology has examined this, with a view to determining the suitability of the approach for general lighting practice. Since the start of this research, a number of key findings have been produced, centered on how occupants react to various levels of MRSE. This paper provides a broad update on how this research has progressed. More specifically, this paper will: (i) demonstrate how MRSE can be measured using HDR imaging technology; (ii) illustrate how MRSE can be calculated using scripting and an open-source lighting computation engine; and (iii) describe experimental results showing how occupants reacted to various levels of MRSE within experimental office environments.
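
For readers unfamiliar with the metric, MRSE is commonly computed as the area-weighted mean of surface exitances, where a surface's exitance is its reflectance times its surface illuminance. The sketch below uses illustrative room data, not measurements from the experiments described.

```python
# Sketch: area-weighted Mean Room Surface Exitance (MRSE) for a simple room.
# Exitance of a surface M = reflectance * illuminance; MRSE is the
# area-weighted mean over all room surfaces. Surface data are illustrative.
surfaces = [
    # (area_m2, reflectance, surface_illuminance_lux)
    (20.0, 0.7, 150.0),   # ceiling
    (20.0, 0.3, 200.0),   # floor
    (48.0, 0.5, 120.0),   # walls combined
]

def mrse(surfaces):
    weighted = sum(a * rho * e for a, rho, e in surfaces)
    total_area = sum(a for a, _, _ in surfaces)
    return weighted / total_area

print(f"MRSE = {mrse(surfaces):.1f} lm/m^2")
```

Note how a dark floor (low reflectance) drags the room average down even when its illuminance is high, which is exactly the appearance-led behaviour the metric is meant to capture.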

Keywords: illumination hierarchy (IH), mean room surface exitance (MRSE), perceived adequacy of illumination (PAI), target-ambient illumination ratio (TAIR)

Procedia PDF Downloads 174
11400 Role-Governed Categorization and Category Learning as a Result from Structural Alignment: The RoleMap Model

Authors: Yolina A. Petrova, Georgi I. Petkov

Abstract:

The paper presents a symbolic model for category learning and categorization (called RoleMap). Unlike other models, which implement learning in a separate working mode, role-governed category learning and categorization emerge in RoleMap while it does its usual reasoning. The model is based on several basic mechanisms known to reflect the sub-processes of analogy-making. It rests on the assumption that in their everyday life people constantly compare what they experience with what they know. Various commonalities between the incoming information (current experience) and the stored information (long-term memory) emerge from those comparisons. Some of these commonalities are considered highly important, and they are transformed into concepts for further use. This process constitutes category learning. When there is missing knowledge in the incoming information (i.e., the perceived object is not yet recognized), the model makes anticipations about what is missing, based on similar episodes from its long-term memory. Various such anticipations may emerge for different reasons. However, with time only one of them wins and is transformed into a category member. This process constitutes the act of categorization.

Keywords: analogy-making, categorization, category learning, cognitive modeling, role-governed categories

Procedia PDF Downloads 131
11399 Affects Associations Analysis in Emergency Situations

Authors: Joanna Grzybowska, Magdalena Igras, Mariusz Ziółko

Abstract:

Association rule learning is an approach for discovering interesting relationships in large databases. The analysis of relations invisible at first glance is a source of new knowledge which can subsequently be used for prediction. We used this data mining technique (an automatic and objective method) to learn about interesting affect associations in a corpus of emergency phone calls. We also attempted to match the revealed rules with their possible situational context. The corpus was collected and subjectively annotated by two researchers. Each of the 3306 recordings contains information on emotion: (1) type (sadness, weariness, anxiety, surprise, stress, anger, frustration, calm, relief, compassion, contentment, amusement, joy), (2) valence (negative, neutral, or positive), and (3) intensity (low, typical, alternating, high). Additional information that provides clues to the speaker's emotional state was also annotated: speech rate (slow, normal, fast), characteristic vocabulary (filled pauses, repeated words), and conversation style (normal, chaotic). Exponentially many rules can be extracted from a set of items (an item being a single previously annotated piece of information). To generate the rules in the form of an implication X → Y (where X and Y are frequent k-itemsets), the Apriori algorithm was used, as it avoids performing needless computations. Then, two basic measures (support and confidence) and several additional symmetric and asymmetric objective measures (e.g., Laplace, conviction, interest factor, cosine, correlation coefficient) were calculated for each rule. Each applied interestingness measure revealed different rules, and we selected some top rules for each measure. Owing to the specificity of the corpus (emergency situations), most of the strong rules contain only negative emotions, though there are also strong rules including neutral or even positive emotions.
Three examples of the strongest rules are: {sadness} → {anxiety}; {sadness, weariness, stress, frustration} → {anger}; {compassion} → {sadness}. Association rule learning revealed the strongest configurations of affects (as well as configurations of affects with affect-related information) in our emergency phone call corpus. The acquired knowledge can be used for prediction, to fill in the emotional profile of a new caller. Furthermore, analysis of a rule's possible context may provide clues to the situation a caller is in.
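
The Apriori-based rule mining described above can be sketched on a toy corpus. The annotated calls below are hypothetical examples, not recordings from the actual corpus; support and confidence follow their standard definitions.

```python
# Sketch: mining affect association rules with a minimal Apriori pass.
# Transactions are hypothetical annotated calls, not the paper's corpus.
from itertools import combinations

calls = [
    {"sadness", "anxiety"},
    {"sadness", "anxiety", "stress"},
    {"anger", "frustration", "stress"},
    {"sadness", "weariness", "anxiety"},
    {"compassion", "sadness"},
]

def support(itemset, transactions):
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(transactions, min_support=0.4, min_confidence=0.6):
    items = sorted({i for t in transactions for i in t})
    out = []
    for size in (2, 3):
        for itemset in map(frozenset, combinations(items, size)):
            if support(itemset, transactions) < min_support:
                continue  # Apriori pruning: infrequent itemsets are skipped
            for k in range(1, size):
                for lhs in map(frozenset, combinations(itemset, k)):
                    rhs = itemset - lhs
                    conf = support(itemset, transactions) / support(lhs, transactions)
                    if conf >= min_confidence:
                        out.append((set(lhs), set(rhs), conf))
    return out

for lhs, rhs, conf in rules(calls):
    print(f"{lhs} -> {rhs} (confidence {conf:.2f})")
```

On this toy data only {sadness} → {anxiety} and {anxiety} → {sadness} survive the thresholds, mirroring the shape of the paper's strongest rules.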

Keywords: data mining, emergency phone calls, emotional profiles, rules

Procedia PDF Downloads 399
11398 The Routes of Human Suffering: How Point-Source and Destination-Source Mapping Can Help Victim Services Providers and Law Enforcement Agencies Effectively Combat Human Trafficking

Authors: Benjamin Thomas Greer, Grace Cotulla, Mandy Johnson

Abstract:

Human trafficking is one of the fastest-growing international crimes and human rights violations in the world. The United States Department of State (State Department) estimates that some 800,000 to 900,000 people are trafficked across sovereign borders annually, with approximately 14,000 to 17,500 of them entering the United States. Today's slavery is conducted by unscrupulous individuals who are often connected to organized criminal enterprises and transnational gangs, extracting huge monetary sums. According to the International Labour Organization (ILO), human traffickers collect approximately $32 billion worldwide annually. Surpassed only by narcotics dealing, trafficking of humans is tied with illegal arms sales as the second-largest criminal industry in the world and is the fastest growing of them in the 21st century. Perpetrators of this heinous crime abound. They are not limited to sole practitioners of human trafficking but often include transnational criminal organizations (TCOs), domestic street gangs, labor contractors, and otherwise seemingly ordinary citizens. Monetary gain is being elevated over territorial disputes, and street gangs increasingly operate in collaboration with TCOs, utilizing their vast networks in an attempt to further disguise their criminal activity and avoid detection. Traffickers rely on a network of clandestine routes to sell their commodities with impunity. As law enforcement agencies seek to slow the expansion of transnational criminal organizations into human trafficking, it is imperative that they develop reliable maps of known exploitative trafficking routes.
In a recent report to the Mexican Congress, the Procuraduría General de la República (PGR) disclosed that from 2008 to 2010 it had identified at least 47 unique criminal networking routes used to traffic victims, and that Mexico's domestic victims are estimated at approximately 800,000 adults and 20,000 children annually. Designing a reliable mapping system is a crucial step toward an effective law enforcement response and a successful victim support system. Creating this mapping analytic is exceedingly difficult: traffickers constantly change the way they traffic and exploit their victims, swiftly adapt to local environmental factors, react remarkably well to market demands, and exploit limitations in the prevailing laws. This article will highlight how human trafficking has become one of the fastest-growing and most high-profile human rights violations in the world today; compile current efforts to map and illustrate trafficking routes; and demonstrate how proprietary analytical point-source and destination-source mapping can help local law enforcement, governmental agencies, and victim services providers respond effectively to the type and nature of trafficking in their specific geographical locale. Trafficking transcends state and international borders and demands effective, consistent cooperation between local, state, and federal authorities. Each region of the world has different impact factors, which create distinct challenges for law enforcement and victim services. Our mapping system lays the groundwork for a targeted anti-trafficking response.
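
A first step toward the point-source and destination-source mapping described here is simply tallying origin-destination pairs from case records, as sketched below. The records and place names are illustrative placeholders, not real case data.

```python
# Sketch: building point-source and destination-source tallies from case
# records, the raw input for route mapping. Records are placeholders.
from collections import Counter

cases = [
    {"origin": "region_a", "destination": "city_x"},
    {"origin": "region_a", "destination": "city_y"},
    {"origin": "region_b", "destination": "city_x"},
    {"origin": "region_a", "destination": "city_x"},
]

routes = Counter((c["origin"], c["destination"]) for c in cases)
point_sources = Counter(c["origin"] for c in cases)
destinations = Counter(c["destination"] for c in cases)

# the most-travelled route is a candidate for targeted response
(top_route, count), = routes.most_common(1)
print(top_route, count)
```

In a full system these tallies would feed a geographic layer, so that the heaviest origin-destination edges can be prioritized for enforcement and victim services.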

Keywords: human trafficking, mapping, routes, law enforcement intelligence

Procedia PDF Downloads 370
11397 A Review on the Vulnerability of Rural-Small Scale Farmers to Insect Pest Attacks in the Eastern Cape Province, South Africa

Authors: Nolitha L. Skenjana, Bongani P. Kubheka, Maxwell A. Poswal

Abstract:

The Eastern Cape Province of South Africa is characterized by subsistence farming, mostly distributed across the rural areas of the province. It is estimated that cereal crops such as maize and sorghum, and vegetables such as cabbage, are grown in more than 400,000 rural households, with maize being the dominant crop. However, compared with commercial agriculture, small-scale farmers receive minimal support from research and development, limited transfer of the latest production practices and systems, and have poor production infrastructure and equipment. Similarly, farmers have a limited appreciation of best practices in insect pest management and control. The paper presents findings from the primary literature and personal observations on the insect pest management practices of small-scale farmers in the province. Inferences from the literature and experience in the production areas have led to a number of deductions regarding the level of exposure and the extent of vulnerability. Farmers' pest management practices, which included taking no control measures at all despite a pest problem, left their crop stands more vulnerable to pest attacks. This became more evident with the recent brown locust, African armyworm, and fall armyworm outbreaks, and with incidences of opportunistic phytophagous insects, previously collected only on wild hosts, being found to cause serious damage to crops. In most of these occurrences, the damage resulted in low or no yield. Improvements in farmers' reaction and response to pest problems were observed only in areas where focused awareness campaigns and training on specific pests and their management techniques had been conducted.
This calls for a concerted effort from all role players in small-scale crop production to train and equip farmers with relevant skills, and to provide them with information on affordable and climate-smart strategies and technologies in order to create a state of preparedness. This is necessary to prevent the substantial crop losses that may exacerbate food insecurity in the province.

Keywords: Eastern Cape Province, small-scale farmers, insect pest management, vulnerability

Procedia PDF Downloads 129
11396 Sustainable Technologies for Decommissioning of Nuclear Facilities

Authors: Ahmed Stifi, Sascha Gentes

Abstract:

The German nuclear industry, in implementing national policy, believes that the journey towards the green field, namely the phase-out of nuclear energy, should be achieved through green techniques. The most important techniques required for the wide range of decommissioning activities are decontamination techniques, cutting techniques, radioactivity measurement techniques, remote control techniques, techniques for worker and environmental protection, and techniques for treating, preconditioning, and conditioning nuclear waste. Many decontamination techniques are used for removing contamination from metal, concrete, or other surfaces, such as the scales inside pipes. As the pipeline system is one of the important components of a nuclear power plant, the decontamination of tubing is of particular significance. The development of energy sectors such as oil, gas, and nuclear since the middle of the 20th century expanded the pipeline industry, and research on the decontamination of tubing in each sector benefits the others. The extraction of natural products and materials through pipelines can result in scale formation. These scales can become radioactively contaminated through an accumulation process, especially in the petrochemical industry when oil and gas are extracted from underground reservoirs. The radioactivity measured in these scales can be significantly high and pose a great threat to people and the environment. At present, the decontamination process involves high-pressure water jets, with or without abrasive material, and this technology produces a large amount of secondary waste. To overcome this, the research team within the Karlsruhe Institute of Technology developed a new sustainable method for decontaminating tubing without producing any secondary waste. The method is based on a vibration technique that removes scales and requires no auxiliary materials. The outcome of the research project shows that the vibration technique used for the decontamination of tubing is environmentally friendly, in other words, a sustainable technique.

Keywords: sustainable technologies, decontamination, pipeline, nuclear industry

Procedia PDF Downloads 294
11395 Development of Strategy for Enhanced Production of Industrial Enzymes by Microscopic Fungi in Submerged Fermentation

Authors: Zhanara Suleimenova, Raushan Blieva, Aigerim Zhakipbekova, Inkar Tapenbayeva, Zhanar Narmuratova

Abstract:

Green processes are based on innovative technologies that do not negatively affect the environment. Industrial enzymes originating from biological systems can contribute effectively to sustainable development by being isolated from microorganisms fermented using primarily renewable resources. Many widespread microorganisms secrete significant amounts of biocatalysts into their environment, which greatly facilitates the task of isolation and purification. The ability to control enzyme production through the regulation of biosynthesis and the selection of nutrient media and cultivation conditions makes it possible not only to increase the yield of enzymes but also to obtain enzymes with specific properties. In this regard, immobilized cells offer great potential. A technology for producing enzymes in a secreted, active form, enabling industrial application on an economically feasible scale, has been developed. The method is based on the immobilization of enzyme producers on a solid carrier. Immobilization has a range of advantages: it decreases the price of the final product, avoids foreign substances, allows a controlled process of enzymogenesis, enables the simultaneous production of various enzymes, etc. The design of the proposed equipment makes it possible to increase the activity of the immobilized cell culture filtrate compared with free cells grown under periodic culture conditions. This technology allows a 10-fold rise in culture productivity, prolongs the cultivation of the fungi and the periods of active culture liquid generation, improves the quality (clarity) of the filtrates, and eliminates the time-consuming process of recharging fermentation vials, which requires manual removal of mycelium.

Keywords: industrial enzymes, immobilization, submerged fermentation, microscopic fungi

Procedia PDF Downloads 131
11394 HBTOnto: An Ontology Model for Analyzing Human Behavior Trajectories

Authors: Heba M. Wagih, Hoda M. O. Mokhtar

Abstract:

Social networks have recently played a significant role in both scientific and social communities, and the growing adoption of social network applications has made them a relevant source of information. Owing to this popularity, several research trends have emerged to serve the huge volume of users, including Location-Based Social Networks (LBSN), recommendation systems, sentiment analysis applications, and many others. LBSN applications are among those in highest demand; they focus not only on analyzing the spatiotemporal positions in a given raw trajectory but also on understanding the semantics behind the dynamics of the moving object. LBSNs are a possible means of predicting human mobility based on users' social ties as well as their spatial preferences, and they rely on an efficient representation of users' trajectories. Hence, traditional raw trajectory information is no longer sufficient. In our research, we focus on studying the human behavior trajectory, which is the major pillar of location recommendation systems. In this paper, we propose ontology design patterns, with their underlying description logics, to efficiently annotate human behavior trajectories.
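The kind of semantic annotation the abstract describes (turning raw spatiotemporal points into meaningful stop/move episodes) can be sketched in plain Python. The class names and episode labels below are illustrative assumptions, not HBTOnto's actual vocabulary:

```python
from dataclasses import dataclass, field

@dataclass
class SemanticEpisode:
    # One stop or move segment of a raw trajectory, annotated with meaning
    kind: str     # "stop" or "move" (assumed labels)
    place: str    # semantic label, e.g. "cafe"
    start: float  # timestamps in seconds
    end: float

@dataclass
class BehaviorTrajectory:
    user: str
    episodes: list = field(default_factory=list)

    def annotate(self, kind, place, start, end):
        self.episodes.append(SemanticEpisode(kind, place, start, end))

    def places_visited(self):
        # The semantic view an LBSN recommender would consume
        return [e.place for e in self.episodes if e.kind == "stop"]

traj = BehaviorTrajectory(user="u42")
traj.annotate("move", "commute", 0, 600)
traj.annotate("stop", "cafe", 600, 2400)
print(traj.places_visited())  # ['cafe']
```

In an actual ontology, the episode classes and their relations would be expressed in OWL/description logics rather than Python classes; this only illustrates the data shape being modeled.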

Keywords: human behavior trajectory, location-based social network, ontology, social network

Procedia PDF Downloads 440
11393 Local Spectrum Feature Extraction for Face Recognition

Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh

Abstract:

This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using GMMs, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved by applying a rectangular mask to the spectrum of the facial image, discarding the high-frequency coefficients. Low-frequency information is non-Gaussian in the feature space; by combining several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. Recognition is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested on the FERET data sets and achieves a 92% recognition rate.
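The block-wise DFT and rectangular low-frequency mask described above can be sketched as follows. This is a naive O(n⁴) illustration on a single 4×4 block, not the authors' implementation, and the mask size `k` is an assumed parameter:

```python
import cmath

def dft2(block):
    """Naive 2D DFT of a square block (list of lists of floats)."""
    n = len(block)
    out = [[0j] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0j
            for x in range(n):
                for y in range(n):
                    s += block[x][y] * cmath.exp(-2j * cmath.pi * (u * x + v * y) / n)
            out[u][v] = s
    return out

def low_freq_mask(spectrum, k):
    """Keep a k x k rectangle of low-frequency coefficients, zero the rest."""
    n = len(spectrum)
    return [[spectrum[u][v] if u < k and v < k else 0j for v in range(n)]
            for u in range(n)]

block = [[1.0, 2.0, 3.0, 4.0],
         [4.0, 3.0, 2.0, 1.0],
         [1.0, 2.0, 3.0, 4.0],
         [4.0, 3.0, 2.0, 1.0]]
spec = dft2(block)
feat = low_freq_mask(spec, 2)
# Only the 2x2 low-frequency corner survives as the block's feature vector
```

A real system would use an FFT over overlapping windows of the full face image; the masked coefficients per block would then be pooled and fed to the GMM stage.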

Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET

Procedia PDF Downloads 651
11392 Acclimation of in vitro-Propagated Apple Plantlets as Affected by Light Intensity

Authors: Guem-Jae Chung, Jin-Hui Lee, Myung-Min Oh

Abstract:

Environmental control of in vitro-propagated apple plantlets is required for successful acclimation ex vitro due to their low survival rate. This study aimed to determine the proper lighting conditions for ex vitro acclimation of apple plantlets in plant factories. In vitro-propagated M9 apple plantlets, pre-acclimatized for 1 week, were exposed to the following light treatments for an additional 6 weeks: 60 μmol·m⁻²·s⁻¹ (A), 100 μmol·m⁻²·s⁻¹ (B), 140 μmol·m⁻²·s⁻¹ (C), 180 μmol·m⁻²·s⁻¹ (D), 60 μmol·m⁻²·s⁻¹ → 100 μmol·m⁻²·s⁻¹ at 2 weeks (E) or 4 weeks (F), 60 μmol·m⁻²·s⁻¹ → 100 μmol·m⁻²·s⁻¹ at 2 weeks → 140 μmol·m⁻²·s⁻¹ at 4 weeks (G), and 60 μmol·m⁻²·s⁻¹ → 140 μmol·m⁻²·s⁻¹ at 4 weeks (H). Shoot height, total leaf area, soil-plant analysis development (SPAD) value, root length, and fresh and dry weights of shoots and roots were measured every 2 weeks after transplanting. In addition, the photosynthetic rate was measured at 5 weeks after transplanting. At 6 weeks after transplanting, the shoot height of B was significantly greater than in the other treatments. The SPAD value, total leaf area, and root length of B and F were relatively higher than in the other treatments. Root fresh weights of B, D, F, and G were relatively higher than in the other treatments. D induced the highest shoot fresh weight, probably due to stem hardening, but it also resulted in shoot damage in the early stage of acclimation. The photosynthetic rate at 5 weeks after transplanting increased significantly as light intensity increased. These results suggest that 100 μmol·m⁻²·s⁻¹ for 6 weeks (B) or the treatment gradually increased from 60 μmol·m⁻²·s⁻¹ to 140 μmol·m⁻²·s⁻¹ at 2-week intervals (F) is the proper lighting condition for successful acclimation of in vitro-propagated apple plantlets.
Acknowledgment: This work was supported by Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry and Fisheries (IPET) through Agri-Bio industry Technology Development Program, funded by Ministry of Agriculture, Food and Rural Affairs (MAFRA) (315003051SB020).

Keywords: acclimation, in vitro-propagated apple plantlets, light intensity, plant factory

Procedia PDF Downloads 118
11391 Reading Knowledge Development and Its Phases with Generation Z

Authors: Onur Özdemir, M. Erhan Orhan

Abstract:

Knowledge Development (KD) is just one of the important phases of Knowledge Management (KM). KD is the phase in which intelligence is used to see the big picture. To understand whether information is important or not, we have to use the intelligence cycle, which includes four main steps: aiming, collecting data, processing, and utilizing. KD also needs these steps. To make a precise decision, the decision maker has to be aware of his subordinates' ideas; if he ignores the ideas of his subordinates or of other participants in the organization, he cannot reach the target. KD is a way of using wisdom to assemble the puzzle. If the decision maker does not bring the puzzle pieces together, he cannot see the big picture, and this shows its effects on the battlefield. To understand the battlefield, the decision maker has to use the intelligence cycle, and KD is the main means by which that cycle converts information into knowledge. On the other hand, 'Generation Z', born after the millennium, are real game changers. They have different attitudes from their elders; their understanding of life is different, and freedom and independence carry different meanings for them than for others. Decision makers have to consider these factors and rethink their decisions accordingly. This article tries to explain the relation between KD and Generation Z. KD is the main method of managing targets, but if leaders neglect their people, the world will see many more movements like the Arab Spring and other insurgencies.

Keywords: knowledge development, knowledge management, generation Z, intelligence cycle

Procedia PDF Downloads 504
11390 African Personhood and the Regulation of Brain-Computer Interface (BCI) Technologies: A South African view

Authors: Meshandren Naidoo, Amy Gooden

Abstract:

Implantable brain-computer interface (BCI) technologies have developed to the point where brain-computer communication is possible. This has great potential in the medical field, as it may allow persons who have lost capacities, such as speech or movement, to communicate and regain function. However, ethicists and regulators call for a strict approach to these technologies because of their impact on personhood. This research demonstrates that the personhood debate is more nuanced, and that where an African approach to personhood is used, it may produce results more favorable to the development and use of this technology.

Keywords: artificial intelligence, law, neuroscience, ethics

Procedia PDF Downloads 112
11389 Promoting Public Participation in the Digital Memory Project: Experience from My Peking Memory Project(MPMP)

Authors: Xiaoshuang Jia, Huiling Feng, Li Niu, Wei Hai

Abstract:

Led by the Humanistic Beijing Studies Center at Renmin University of China, My Peking Memory Project (MPMP) is a long-term digital memory project, built on public participation, that enables the cultural and intellectual memory of Beijing to be collected, organized, preserved, and promoted for discovery and research. Taking digital memory as a new approach, MPMP is an important part of the Peking Memory Project (PMP), which aims to use digital technologies to protect and (re)present the cultural heritage of Beijing. The key outcome of MPMP is the co-building of a comprehensive digital collection of knowledge assets about Beijing. Institutional memories are central to Beijing's collection and consist of the officially published documentary content of Beijing; these already fall under the purview of archival collection. Advances in information and communication technology, together with insights from social memory theory, have allowed us to collect more comprehensively, beyond institutional collections. It is now possible to engage citizens on a large scale to collect private memories in digital formats through crowdsourcing. Private memories go beyond officially published content to include personal narratives, some of which exist only in people's minds until they are captured by MPMP. One aim of MPMP is to engage individuals, communities, groups, or institutions who have formed memories and content about Beijing and would like to contribute them. The project hopes to build a culture of remembering, and it believes 'Every Memory Matters'. Digital memory contribution was achieved through the development of MPMP. To reduce barriers to digital contribution and promote broad public participation, MPMP has explored a transcription service for digital ingestion, a mobile platform, and offline activities such as social forums.
MPMP has also cooperated with the 'Implementation Plan of the Support Plan for Growth of Talents in Renmin University of China' to obtain manpower and intellectual support. After six months of operation, MPMP now has more than 2,000 memories added and 7 Special Memory Collections online. The work of MPMP has ultimately helped to highlight its important role in safeguarding the documentary heritage and intellectual memory of Beijing.

Keywords: digital memory, public participation, MPMP, cultural heritage, collection

Procedia PDF Downloads 156
11388 Deep Learning Based on Image Decomposition for Restoration of Intrinsic Representation

Authors: Hyohun Kim, Dongwha Shin, Yeonseok Kim, Ji-Su Ahn, Kensuke Nakamura, Dongeun Choi, Byung-Woo Hong

Abstract:

Artefacts are commonly encountered in the imaging process of clinical computed tomography (CT), where an artefact refers to any systematic discrepancy between the reconstructed observation and the true attenuation coefficient of the object. CT images are known to be inherently prone to artefacts because of the image formation process, in which a large number of independent detectors are involved and assumed to yield consistent measurements. Artefact types include noise, beam hardening, scatter, pseudo-enhancement, motion, helical, ring, and metal artefacts, all of which cause serious difficulties in reading images. It is therefore desirable to remove the nuisance factors from the degraded image, leaving the fundamental intrinsic information that allows better interpretation of the anatomical and pathological characteristics. This is a difficult task due to the high dimensionality and variability of the data to be recovered, which naturally motivates the use of machine learning techniques. We propose an image restoration algorithm based on a deep neural network framework in which denoising auto-encoders are stacked to build multiple layers. The denoising auto-encoder is a variant of the classical auto-encoder that takes input data and maps it to a hidden representation through a deterministic mapping with a non-linear activation function. The latent representation is then mapped back into a reconstruction of the same size as the input data. The reconstruction error can be measured by the traditional squared error, assuming the residual follows a normal distribution. In addition to the designed loss function, an effective regularization scheme is applied using residual-driven dropout determined from the gradient at each layer. The optimal weights are computed by the classical stochastic gradient descent algorithm combined with back-propagation.
In our algorithm, we initially decompose an input image into its intrinsic representation and the nuisance factors, including artefacts, based on the classical Total Variation problem, which can be efficiently solved by a convex optimization algorithm such as the primal-dual method. In the training phase, the intrinsic forms of the input images are provided to the deep denoising auto-encoders together with their original forms. In the testing phase, a given image is first decomposed into its intrinsic form and then fed to the trained network to obtain its reconstruction. We apply our algorithm to the restoration of CT images corrupted by artefacts, and show that it improves readability and enhances the anatomical and pathological properties of the object. The quantitative evaluation is performed in terms of PSNR, and the qualitative evaluation shows significant improvement in reading images despite degrading artefacts. The experimental results indicate the potential of our algorithm as a preliminary step for image interpretation tasks in a variety of medical imaging applications. This work was supported by the MISP (Ministry of Science and ICT), Korea, under the National Program for Excellence in SW (20170001000011001) supervised by the IITP (Institute for Information and Communications Technology Promotion).
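As a rough illustration of the denoising auto-encoder step described above (corrupt the input, encode through a non-linearity, decode back to input size, measure squared-error loss), here is a single-layer pure-Python sketch. The weights, dimensions, and noise level are arbitrary, and the paper's residual-driven dropout and Total Variation decomposition are omitted:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def dae_forward(x, W_enc, W_dec, noise_std=0.1, rng=None):
    """Denoising auto-encoder pass: corrupt input, encode with a
    non-linear activation, decode back to the input dimension."""
    rng = rng or random.Random(0)
    x_noisy = [xi + rng.gauss(0.0, noise_std) for xi in x]
    h = [sigmoid(z) for z in matvec(W_enc, x_noisy)]    # latent code
    x_hat = matvec(W_dec, h)                            # reconstruction
    # squared error against the clean input, per the abstract
    loss = sum((a - b) ** 2 for a, b in zip(x, x_hat))
    return x_hat, loss

rng = random.Random(42)
x = [0.5, -0.2, 0.8, 0.1]
W_enc = [[rng.uniform(-0.5, 0.5) for _ in range(4)] for _ in range(2)]
W_dec = [[rng.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(4)]
x_hat, loss = dae_forward(x, W_enc, W_dec)
# x_hat has the same dimension as x; stacking repeats the encode step layer-wise
```

Training would minimize this loss over many samples with stochastic gradient descent and back-propagation, as the abstract states.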

Keywords: auto-encoder neural network, CT image artefact, deep learning, intrinsic image representation, noise reduction, total variation

Procedia PDF Downloads 181
11387 Assessment of Sperm Aneuploidy Using Advanced Sperm Fish Technique in Infertile Patients

Authors: Archana. S, Usha Rani. G, Anand Balakrishnan, Sanjana.R, Solomon F, Vijayalakshmi. J

Abstract:

Background: There is evidence that male factors contribute to infertility in up to 50% of couples who are evaluated and treated using advanced assisted reproductive technologies. Genetic abnormalities, including sperm chromosome aneuploidy as well as structural aberrations, are among the major causes of male infertility, and recent advances in technology expedite the evaluation of sperm aneuploidy. The purpose of the study was to determine the prevalence of sperm aneuploidy in infertile males and the degree of association between DNA fragmentation and sperm aneuploidy. Methods: 75 infertile men were included and divided into four abnormal groups (oligospermia, teratospermia, asthenospermia, and oligoasthenoteratospermia (OAT)). Normozoospermic men with children served as the control group. Fluorescence in situ hybridization (FISH) was used to test for sperm aneuploidy, and the Sperm Chromatin Dispersion Assay (SCDA) was used to measure sperm DNA fragmentation. Spearman's correlation coefficient was used to evaluate the relationship between sperm aneuploidy, sperm DNA fragmentation, and age; p < 0.05 was regarded as significant. Results: The 75 participants' ages ranged from 28 to 48 years (35.5±5.1). The proportions of X-bearing and Y-bearing spermatozoa, 48.92% and 51.18% (nuc ish (CEP X x 1)[100] and nuc ish (CEP Y x 1)[100]), differed to a statistically significant degree (p < 0.05). Infertile males were also found to have a greater frequency of sperm aneuploidy when compared against the rate of DNA fragmentation. Sex chromosome aneuploidy was significantly correlated in the asthenospermia and OAT groups (p < 0.05). Conclusion: Sperm FISH and SCDA results showed an increased sperm aneuploidy frequency and DNA fragmentation index in infertile men compared with fertile men.
A significant relationship was observed between sperm aneuploidy and DNA fragmentation in OAT patients. When evaluating male-factor and idiopathic infertility, sperm FISH screening can be used as a valuable diagnostic tool.

Keywords: male infertility, dfi (dna fragmentation index), scd (sperm chromatin dispersion), art (assisted reproductive technology), trisomy, aneuploidy, fish (fluorescence in-situ hybridization), oat (oligoasthenoteratospermia)

Procedia PDF Downloads 43
11386 Social Media Mining with R. Twitter Analyses

Authors: Diana Codat

Abstract:

Tweet analysis is part of text mining: each document is a written text, and the usual text-mining techniques can be applied, in particular by switching to the bag-of-words representation. But tweets have peculiarities, some of which can enrich the analysis. Their length is calibrated (at least as far as public messages are concerned), special characters identify authors (@) and themes (#), and the tweet and retweet mechanisms make it possible to follow the diffusion of information. Conversely, other characteristics can disrupt the analyses: because space is limited, authors often use abbreviations and emoticons to express feelings, and they do not pay much attention to spelling. All this creates noise that can complicate the task. Tweets nevertheless carry a lot of potentially interesting information, and their exploitation is one of the main axes of social network analysis. We show how to access Twitter messages, initiate a study of the properties of tweets, and follow up with the exploitation of the content of the messages. We work in R with the package 'twitteR'. The study of tweets is a strong focus of social network analysis because Twitter has become an important vector of communication. This example shows that it is easy to initiate an analysis from data extracted directly online; the data preparation phase is of great importance.
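The paper works in R with the 'twitteR' package; as a language-neutral sketch of the preprocessing it describes (extracting @-mentions and #-themes, then a bag-of-words over the cleaned text), a hypothetical Python equivalent might look like this:

```python
import re
from collections import Counter

def tweet_features(text):
    """Extract mentions (@), hashtags (#), and a bag-of-words count
    from a single tweet, mirroring the preprocessing described above."""
    mentions = re.findall(r"@(\w+)", text)
    hashtags = re.findall(r"#(\w+)", text)
    # strip mentions, hashtags, and URLs, then lowercase and tokenize
    cleaned = re.sub(r"(@\w+|#\w+|https?://\S+)", " ", text).lower()
    bag = Counter(re.findall(r"[a-z']+", cleaned))
    return mentions, hashtags, bag

m, h, bag = tweet_features("RT @alice Loving #rstats for text mining! https://t.co/x")
print(m, h, sorted(bag))
```

The noise issues the abstract mentions (abbreviations, emoticons, spelling) would be handled in this cleaning step; the bag-of-words output then feeds any standard text-mining pipeline.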

Keywords: data mining, language R, social networks, Twitter

Procedia PDF Downloads 165
11385 The Effects of Self-Efficacy on Life Satisfaction

Authors: Gao Ya

Abstract:

This study aims to examine the relationship between self-efficacy and life satisfaction, and the effects of self-efficacy on life satisfaction, among Chinese people aged 27 to 32 (born between 1990 and 1995). This cohort deserves more attention now because the post-90s generation has attracted intense focus and been labeled negatively ever since birth, and because most existing research studies populations in individualist societies; we therefore chose this specific group, living in a collectivist society. Demographic information was collected, including age, gender, education level, marital status, income level, and number of children. We used the General Self-Efficacy Scale (GSC) and the Satisfaction with Life Scale (SLS) to collect data. A total of 350 questionnaires were distributed in mainland China, of which 261 valid questionnaires were returned, a response rate of 74.57 percent. Statistical techniques such as regression, correlation, ANOVA, t-tests, and the general linear model were used to analyze the variables. The findings are that self-efficacy is positively related to life satisfaction and influences it significantly. The relationship between demographic information and life satisfaction was also analyzed.
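The correlation analysis the authors report can be illustrated with a Pearson coefficient computed by hand; the sample scores below are made up purely for illustration and are not the study's data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical scores: self-efficacy vs. life satisfaction
efficacy     = [21, 25, 30, 34, 38, 40]
satisfaction = [14, 18, 19, 24, 27, 30]
r = pearson(efficacy, satisfaction)
print(round(r, 3))  # close to 1 for this positively related made-up sample
```

A value of r near +1, as here, is the pattern a "positively related and significant" finding would produce; significance itself would additionally require a t-test on r.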

Keywords: marital status, life satisfaction, number of children, self-efficacy, income level

Procedia PDF Downloads 110
11384 The Views of Health Care Professionals outside of the General Practice Setting on the Provision of Oral Contraception in Comparison to Long-Acting Reversible Contraception

Authors: Carri Welsby, Jessie Gunson, Pen Roe

Abstract:

Currently, there is limited research examining health care professionals' (HCPs) views on long-acting reversible contraception (LARC) advice and prescription, particularly outside of the general practice (GP) setting. The aim of this study is to systematically review existing evidence on the barriers and enablers of oral contraception (OC) in comparison to LARC, as perceived by HCPs in non-GP settings. Five electronic databases were searched in April 2018 using terms related to LARC, OC, HCPs, and views, but not terms related to GPs. Studies were excluded if they concerned emergency oral contraception, male contraceptives, contraceptive use in conjunction with a health condition(s), or developing countries, involved GPs or GP settings, were non-English, or were published before 2013. A total of six studies were included in the systematic review. Key areas emerged under which themes were categorised, including (1) understanding HCP attitudes and counselling practices towards contraceptive methods; (2) assessment of HCP attitudes and beliefs about contraceptive methods; (3) misconceptions and concerns towards contraceptive methods; and (4) influences on views, attitudes, and beliefs about contraceptive methods. HCPs receive limited education and training around LARC provision, particularly compared to OC. The most common misconception inhibiting HCPs' delivery of contraceptive information to women was the belief that LARC is inappropriate for nulliparous women. In turn, by not providing correct information on the variety of contraceptive methods, HCP counselling practices disempowered women and restricted their access to reproductive justice. Educating HCPs to provide accurate and factual information on all forms of contraception is vital to encourage a woman-centered approach during contraceptive counselling and to promote informed choices by women.

Keywords: advice, contraceptives, health care professionals, long acting reversible contraception, oral contraception, reproductive justice

Procedia PDF Downloads 145
11383 Educational Engineering Tool on Smartphone

Authors: Maya Saade, Rafic Younes, Pascal Lafon

Abstract:

This paper explores the transformative impact of smartphones on pedagogy and presents a smartphone application developed specifically for engineering problem-solving and educational purposes. The widespread availability and advanced capabilities of smartphones have revolutionized the way we interact with technology, including in education. The ubiquity of smartphones allows learners to access educational resources anytime and anywhere, promoting personalized and self-directed learning. The first part of this paper discusses the overall influence of smartphones on pedagogy, emphasizing their potential to improve learning experiences through mobile technology. In the context of engineering education, this paper focuses on the development of a dedicated smartphone application that serves as a powerful tool for both engineering problem-solving and education. The application features an intuitive and user-friendly interface, allowing engineering students and professionals to perform complex calculations and analyses on their smartphones. The smartphone application primarily focuses on beam calculations and serves as a comprehensive beam calculator tailored to engineering education. It caters to various engineering disciplines by offering interactive modules that allow students to learn key concepts through hands-on activities and simulations. With a primary emphasis on beam analysis, this application empowers users to perform calculations for statically determinate beams, statically indeterminate beams, and beam buckling phenomena. Furthermore, the app includes a comprehensive library of engineering formulas and reference materials, facilitating a deeper understanding and practical application of the fundamental principles in beam analysis. 
By offering a wide range of features tailored specifically to beam calculation, the application provides an invaluable tool for engineering students and professionals looking to enhance their understanding and proficiency in this crucial aspect of structural engineering.
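The kind of statically determinate beam calculation the app performs can be sketched with the standard Euler-Bernoulli closed-form results for a simply supported beam under a central point load (M_max = PL/4, δ_max = PL³/48EI). The function name and the input values below are illustrative, not taken from the app:

```python
def simply_supported_point_load(P, L, E, I):
    """Maximum bending moment and mid-span deflection for a simply
    supported beam with a central point load P (Euler-Bernoulli theory).
    P in N, L in m, E in Pa, I in m^4."""
    M_max = P * L / 4.0                 # bending moment at mid-span, N·m
    delta_max = P * L**3 / (48.0 * E * I)  # mid-span deflection, m
    return M_max, delta_max

# hypothetical steel beam: P = 10 kN, L = 4 m, E = 200 GPa, I = 8e-6 m^4
M, d = simply_supported_point_load(10e3, 4.0, 200e9, 8e-6)
print(M, d)  # 10000.0 N·m and ~0.00833 m
```

Statically indeterminate beams and buckling, which the app also covers, require different formulas (e.g. Euler's critical load P_cr = π²EI/(KL)²) rather than this single case.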

Keywords: mobile devices in education, solving engineering problems, smartphone application, engineering education

Procedia PDF Downloads 56
11382 Blockchain Solutions for IoT Challenges: Overview

Authors: Amir Ali Fatoorchi

Abstract:

Despite the advantages of IoT devices, they have limitations in storage and compute, as well as security problems. In recent years, a great deal of blockchain-based IoT research has been published and presented. In this paper, we present the security issues of IoT, which arise at three levels: low, intermediate, and high. We survey and compare blockchain-based solutions for the high-level security issues and show how the underlying technology of Bitcoin and Ethereum could solve IoT problems.
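The "underlying technology" of Bitcoin and Ethereum that the abstract refers to rests on hash-chained blocks, which make a shared ledger tamper-evident. A minimal sketch (not any production chain's actual block format) could look like this:

```python
import hashlib
import json

def make_block(index, data, prev_hash):
    """A minimal hash-chained block: each block commits to its
    predecessor's hash, so altering history breaks the chain."""
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block(0, "genesis", "0" * 64)
b1 = make_block(1, {"device": "sensor-17", "reading": 21.5}, genesis["hash"])

# b1 commits to genesis; tampering with genesis would change its hash
# and invalidate the prev_hash stored in b1
assert b1["prev_hash"] == genesis["hash"]
```

For IoT, this tamper-evidence is what lets resource-constrained devices anchor data integrity without a trusted central server; consensus, proof-of-work, and smart contracts are layered on top of this core structure.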

Keywords: Blockchain, security, data security, IoT

Procedia PDF Downloads 201
11381 Application of Building Information Modelling In Analysing IGBC® Ratings (Sustainability Analyses)

Authors: Lokesh Harshe

Abstract:

The building construction sector accounts for 36% of global energy consumption and 39% of CO₂ emissions. Professionals in the built environment sector have long been aware of the industry's contribution to CO₂ emissions and are now moving towards more sustainable practices. As a result, many organizations have introduced rating systems that address global warming in the construction sector by ranking construction projects on sustainability parameters. The pre-construction phase of any building project is the most important time to make decisions that address sustainability. Traditionally, it is very difficult in this phase to collect data from different stakeholders and bring it together to support decisions based on factual data for sustainability analyses. Building Information Modelling (BIM) is the solution: a single model is the result of a collaborative approach in which all information is shared, extracted, communicated, and stored on one platform that everyone can access, so that decisions can be made on real-time data. This research focuses on the Indian green rating system IGBC®, with the objective of understanding IGBC® requirements and developing a framework that relates the rating processes to BIM. A hypothetical architectural model of a hostel building was developed using AutoCAD 2019 and Revit Architecture 2019, and the framework was applied to generate sustainability analysis results using Green Building Studio (GBS) and Revit add-ins. The results of any sustainability analysis are generated within a fraction of a minute, which is very quick compared with traditional sustainability analysis and may save a considerable amount of time as well as cost.
The future scope is to integrate Architectural, Structural, and MEP Models to perform accurate sustainability analyses with inputs from industry professionals working on real-life Green BIM projects.

Keywords: sustainability analyses, BIM, green rating systems, IGBC®, LEED

Procedia PDF Downloads 39
11380 Women’s Colours in Digital Innovation

Authors: Daniel J. Patricio Jiménez

Abstract:

Digital reality demands new ways of thinking, flexibility in learning, the acquisition of new competencies, visualizing reality under new approaches, generating open spaces, understanding dimensions in continuous change, etc. We need inclusive growth, where no colours are lacking, where lights do not give a distorted reality, where science is not half-truth. This study draws on a documentary and bibliographic collection, providing a reflective and analytical examination of current reality. In this context, deductive and inductive methods have been applied to different multidisciplinary information sources. Women, today and tomorrow, are a strategic element in science and the arts, which, under the umbrella of sustainability, implies 'meeting current needs without detriment to future generations'. We must build new scenarios that treat 'the feminine and the masculine' as an inseparable whole, encouraging cooperative behavior; nothing is exclusive or excluding, and that is where true respect for diversity must be grounded. We are all part of an ecosystem, which we will make better as long as there is a real balance in terms of gender. It is the time of 'the lifting of the veil'; in other words, it is time to discover the pseudonyms, the women who painted, wrote, investigated, recorded advances, etc. The current reality demands much more, however; we must remove doors where they are not needed. Mass data processing, big data, needs to incorporate algorithms from the perspective of 'the feminine'. Yet most STEM (science, technology, engineering, and math) students are men. Our way of doing science is biased, focused on honors and short-term results to the detriment of sustainability. Historically, the canons of beauty and the ways of looking, perceiving, and feeling depended on the circumstances and interests of each moment, and women had no voice in this.
Parallel to science, there is an under-representation of women in the arts, not so much in the universities as in the galleries, museums, art dealerships, etc., where the colours impoverish the gaze and once again highlight the gender gap and the silence of the feminine. Art registers sensations by divining the future; science will turn them into reality. The uniqueness of the so-called new normality requires women to be protagonists both in new forms of emotion and thought and in the experimentation and development of new models. This will result in women playing a decisive role in the so-called '5.0 society' or, in other words, in a more sustainable, more humane world.

Keywords: art, digitalization, gender, science

Procedia PDF Downloads 158
11379 Human Action Recognition Using Wavelets of Derived Beta Distributions

Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel

Abstract:

In the framework of enhancing human-machine interaction systems, this paper focuses on human behavior analysis and action recognition. Human behavior is characterized by a duality of actions and reactions (movements, psychological modification, verbal and emotional expression). Much information is hidden behind gestures, sudden motions, point trajectories, and speeds, and many research works have framed its recovery as an information retrieval problem. In our work, we focus on motion extraction, tracking, and action recognition using wavelet network approaches. Our contribution uses foreground (human) subtraction by a Gaussian Mixture Model (GMM) and models body movement through motion trajectories constructed with a Kalman filter. These models remove noise by extracting the main motion features and constitute a stable basis for identifying the evolution of human activity. Each modality is then used to recognize a human action with the wavelets-of-derived-beta-distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports databases.
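The Kalman-filter trajectory modeling mentioned above can be illustrated with a minimal scalar version that smooths one tracked coordinate over video frames. A real tracker would use a 2-D constant-velocity state vector, and the process/measurement noise parameters `q` and `r` here are arbitrary assumptions:

```python
def kalman_1d(measurements, q=1e-3, r=0.5):
    """Minimal 1-D Kalman filter (random-walk motion model) of the kind
    used to smooth a tracked point's coordinate across frames."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    smoothed = [x]
    for z in measurements[1:]:
        p += q                    # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain: trust in the new measurement
        x += k * (z - x)          # update toward the measurement
        p *= (1 - k)              # updated estimate is more certain
        smoothed.append(x)
    return smoothed

noisy = [0.0, 1.2, 0.8, 1.1, 0.9, 1.05]
print(kalman_1d(noisy))  # estimates settle near the underlying position
```

In the pipeline described by the abstract, the GMM foreground mask supplies the noisy point measurements, and the filtered trajectory is what feeds the wavelet-network action classifier.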

Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet

Procedia PDF Downloads 402