Search results for: original introject
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1459

649 Igbo Art: A Reflection of the Igbo’s Visual Culture

Authors: David Osa-Egonwa

Abstract:

Visual culture is the expression of the norms and social behavior of a society in visual images. A reflection simply shows you how you look when you stand before a mirror, clear water, or a stream. The mirror does not alter, improve, or distort your original appearance, nor does it show a caricature of what stands before it; the same is true of the visual images created by a tribe or society. 'Uli' is a hand-drawn body design applied to Igbo women and speaks of a culture of body adornment, a practice appreciated by that tribe. The use of the pattern of the gliding python, 'ije eke' or 'ijeagwo', for wall painting speaks of Igbo culture as one that appreciates wall paintings based on these patterns. Modern life brought a great deal of change to the Igbo-speaking people of Nigeria. Change, cloaked in the garment of Westernization, has influenced the culture of the Igbo. This has resulted in a break in cultural practice that has also affected the art produced by the Igbo. Before the colonial masters arrived and changed the established culture practiced by the Igbo, visual images were created that retained the culture of this people. To bring this point to light, this paper adopts a historical method. A large number of works produced during the pre- and post-colonial eras, ranging from sculptural pieces and paintings to other artifacts, were studied carefully, and it was discovered that the visual images hold the culture, or aspects of the culture, of the Igbo in their renditions and can rightly serve as a mirror of Igbo visual culture.

Keywords: artistic renditions, historical method, Igbo visual culture, changes

Procedia PDF Downloads 161
648 Bayesian Inference of Physicochemical Quality Elements of Tropical Lagoon Nokoué (Benin)

Authors: Hounyèmè Romuald, Maxime Logez, Mama Daouda, Argillier Christine

Abstract:

In view of the very strong degradation of aquatic ecosystems, it is urgent to set up monitoring systems that are best able to report on the effects of the stresses they undergo. This is particularly true in developing countries, where specific and relevant quality standards and funding for monitoring programs are lacking. The objective of this study was to make a relevant and objective choice of physicochemical parameters informative of the main stressors occurring on African lakes and to identify their alteration thresholds. Based on statistical analyses of the relationship between several driving forces and the physicochemical parameters of the Nokoué lagoon, relevant physicochemical parameters were selected for its monitoring. An innovative method based on Bayesian statistical modeling was used. Eleven physicochemical parameters were selected for their response to at least one stressor, and their threshold quality standards were also established: Total Phosphorus (<4.5 mg/L), Orthophosphates (<0.2 mg/L), Nitrates (<0.5 mg/L), TKN (<1.85 mg/L), Dry Organic Matter (<5 mg/L), Dissolved Oxygen (>4 mg/L), BOD (<11.6 mg/L), Salinity (7.6), Water Temperature (<28.7 °C), pH (>6.2), and Transparency (>0.9 m). According to the System for the Evaluation of Coastal Water Quality, these thresholds correspond to "good to medium" suitability classes, except for total phosphorus. One of the original features of this study is the use of the bounds of the credibility interval of the fixed-effect coefficients as local alteration standards for the characterization of the physicochemical status of this anthropized African ecosystem.
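
As a rough illustration of the threshold-setting idea (a minimal sketch with simulated data and a single hypothetical stressor, not the authors' model of the Nokoué data), a Bayesian regression in PyMC can expose the credibility interval of a fixed-effect coefficient:

```python
# Minimal sketch: Bayesian regression of one physicochemical parameter on a
# hypothetical stressor gradient; the 95% credible interval of the fixed-effect
# coefficient is reported, as in the thresholding idea described in the abstract.
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(0)
stressor = rng.uniform(0, 1, 60)                          # hypothetical driving-force gradient
total_p = 2.0 + 3.0 * stressor + rng.normal(0, 0.5, 60)   # hypothetical total phosphorus (mg/L)

with pm.Model() as model:
    alpha = pm.Normal("alpha", mu=0, sigma=10)            # intercept, vague prior
    beta = pm.Normal("beta", mu=0, sigma=10)              # fixed-effect coefficient of the stressor
    sigma = pm.HalfNormal("sigma", sigma=5)
    pm.Normal("obs", mu=alpha + beta * stressor, sigma=sigma, observed=total_p)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0, progressbar=False)

# Bounds of the credibility interval of the fixed effect, used as candidate local standards.
print(az.hdi(idata, var_names=["beta"], hdi_prob=0.95))
```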

Keywords: driving forces, alteration thresholds, acadjas, monitoring, modeling, human activities

Procedia PDF Downloads 72
647 A Study on the Disclosure Experience of Adoptees

Authors: Tsung Chieh Ma, I-Ling Chen

Abstract:

Disclosing family origins to adoptees is an important topic in the adoption process. Adoption agencies usually educate adoptive parents on how to disclose to adoptees, but many adoptive parents worry that the disclosure will affect the parent–child relationship. Thus, how adoptees would like to receive the disclosure and whether they subjectively feel that the parent–child relationship is affected are both topics worthy of further discussion. This research takes a qualitative approach and works with adoption agencies to interview six adoptees who are now adults. The purpose of the interviews is to learn about their experience of receiving disclosures and their subjective feelings after learning of their family origins. The aim is to reveal the changes disclosure brought to the parent–child relationship and whether common concerns are raised due to the adoptive status. We also want to know about factors that affect their identification with their adopted status so that we can consequently give advice to other adoptive families. This study finds that adoptees see disclosure as a process rather than an isolated event. The majority want to be told their family origin as early and proactively as possible and expect to learn the reasons they were given up for adoption and taken in as adoptees. The disclosure does not necessarily influence the parent–child relationship, and adoptees care more about the positive experiences they had with adoptive parents in their childhood. Moreover, adopted children seek contact with their original family mostly to understand why they were given up for adoption. The effects of disclosure depend on how the adoptive parents or other significant people in the lives of adoptees interpret the identity of the adoptees. That is, their response and attitude toward that identity have a lasting impact on the adoptees. The study suggests that early disclosure gives adoptees a chance to internalize the experience in the process and find self-identification.

Keywords: adoption, adoptees, disclosure of family origins, parent–child relationship, self-identity

Procedia PDF Downloads 52
646 Theoretical Modeling of Self-Healing Polymers Crosslinked by Dynamic Bonds

Authors: Qiming Wang

Abstract:

Dynamic polymer networks (DPNs) crosslinked by dynamic bonds have received intensive attention because of their special crack-healing capability. Diverse DPNs have been synthesized using a number of dynamic bonds, including dynamic covalent bonds, hydrogen bonds, ionic bonds, metal-ligand coordination, hydrophobic interactions, and others. Despite the promising success in polymer synthesis, the fundamental understanding of their self-healing mechanics is still at a very early stage. In particular, a general analytical model of the interfacial self-healing behaviors of DPNs has not been established. Here, we develop polymer-network-based analytical theories that can mechanistically model the constitutive behaviors and interfacial self-healing behaviors of DPNs. We consider the DPN to be composed of interpenetrating networks crosslinked by dynamic bonds that obey force-dependent chemical kinetics. The network chains follow inhomogeneous chain-length distributions, and during the self-healing process the dynamic polymer chains diffuse across the interface to reform the dynamic bonds, a process modeled by a diffusion-reaction theory. The theories can predict the stress-stretch behaviors of original and self-healed DPNs, as well as the healing strength as a function of healing time. We show that the theoretically predicted healing behaviors can consistently match the documented experimental results of DPNs with various dynamic bonds, including dynamic covalent bonds (diarylbibenzofuranone and olefin metathesis), hydrogen bonds, and ionic bonds. We expect our model to be a powerful tool for the self-healing community to invent, design, understand, and optimize self-healing DPNs with various dynamic bonds.

Keywords: self-healing polymers, dynamic covalent bonds, hydrogen bonds, ionic bonds

Procedia PDF Downloads 168
645 Stabilization of Medical Waste Incineration Fly Ash in Cement Mortar Matrix

Authors: Tanvir Ahmed, Musfira Rahman, Rumpa Chowdhury

Abstract:

We performed laboratory experiments to assess the suitability of using medical waste incineration fly ash in cement as a construction material, based on the engineering properties of the fly ash-cement matrix and the leaching potential of toxic heavy metals from the stabilized mix. Fly ash-cement samples were prepared with different proportions of fly ash (0%, 5%, 10%, 15% and 20% by weight) under laboratory-controlled conditions. The solidified matrix exhibited a compressive strength from 3950 to 4980 psi when fly ash was mixed in varying proportions. The 28-day compressive strength was found to decrease with increasing fly ash content, but it meets the minimum requirement of compressive strength for cement-mortar. Soundness test results were satisfactory for cement-mortar mixes containing up to 15% fly ash. The final and initial setting times of cement were found to generally increase with fly ash content. Water requirement (for normal consistency) also increased with increasing fly ash content in cement. Based on the physical properties of the cement-mortar matrix, it is recommended that up to 10% (by weight) medical waste incineration fly ash can be incorporated to produce cement-mortar of optimum quality. The leaching behaviours of several targeted heavy metals (As, Cu, Ni, Cd, Pb, Hg and Zn) were analyzed using the Toxicity Characteristic Leaching Procedure (TCLP) on the fly ash and the solidified fly ash-cement matrix. It was found that the leached concentrations of As, Cu, Cd, Pb and Zn were reduced by 80.13%, 89.47%, 33.33% and 23.9% respectively for the 10% fly ash incorporated cement-mortar matrix compared to that of the original fly ash. The leached concentrations of heavy metals from the matrix were far below the EPA land disposal limits. These results suggest that the solidified fly ash incorporated cement-mortar matrix can effectively confine and immobilize the heavy metals contained in the fly ash.

Keywords: cement-mortar, fly ash, leaching, waste management

Procedia PDF Downloads 150
644 An Assessment of Different Blade Tip Timing (BTT) Algorithms Using an Experimentally Validated Finite Element Model Simulator

Authors: Mohamed Mohamed, Philip Bonello, Peter Russhard

Abstract:

Blade Tip Timing (BTT) is a technology concerned with the estimation of both the frequency and amplitude of vibration of rotating blades. A BTT system comprises two main parts: (a) the arrival time measurement system, and (b) the analysis algorithms. Simulators play an important role in the development of the analysis algorithms since they generate blade tip displacement data from the simulated blade vibration under controlled conditions. This enables an assessment of the performance of the different algorithms with respect to their ability to accurately reproduce the original simulated vibration. Such an assessment is usually not possible with real engine data since there is no practical alternative to BTT for blade vibration measurement. Most simulators used in the literature are based on a simple spring-mass-damper model to determine the vibration. In this work, a more realistic, experimentally validated simulator based on the Finite Element (FE) model of a bladed disc (blisk) is first presented. It is then used to generate the necessary data for the assessment of different BTT algorithms. The FE modelling is validated using both a hammer test and two firewire cameras for the mode shapes. A number of autoregressive methods, fitting methods and state-of-the-art inverse methods (i.e. Russhard) are compared. All methods are compared with respect to both synchronous and asynchronous excitations with both single and simultaneous frequencies. The study assesses the applicability of each method for different conditions of vibration, amounts of sampling data, and testing facilities, according to its performance and efficiency under these conditions.
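
As a rough illustration of what such a simulator produces (a minimal sketch with hypothetical rotor and vibration parameters, far simpler than the validated FE blisk model), blade tip displacement can be encoded into, and recovered from, probe arrival times as follows:

```python
# Minimal sketch: synthesise blade tip timing data for one probe and one blade
# undergoing a single-frequency vibration, then recover tip displacement from
# the arrival-time offsets (the quantity a BTT analysis algorithm works with).
import numpy as np

R = 0.25                      # hypothetical blade tip radius (m)
omega = 2 * np.pi * 100.0     # shaft speed: 100 rev/s, in rad/s
f_vib, amp = 350.0, 2e-4      # hypothetical asynchronous vibration: 350 Hz, 0.2 mm amplitude
theta_probe = np.deg2rad(30.0)
revs = np.arange(200)

t_expected = (2 * np.pi * revs + theta_probe) / omega        # undisplaced arrival times
tip_disp = amp * np.sin(2 * np.pi * f_vib * t_expected)      # tip displacement at each passage
t_actual = t_expected + tip_disp / (R * omega)               # small arrival-time shift

# The analysis algorithm only sees t_actual; displacement is recovered from the offsets.
recovered = (t_actual - t_expected) * R * omega
assert np.allclose(recovered, tip_disp)
```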

Keywords: blade tip timing, blisk, finite element, vibration measurement

Procedia PDF Downloads 292
643 Generative Adversarial Network Based Fingerprint Anti-Spoofing Limitations

Authors: Yehjune Heo

Abstract:

Fingerprint anti-spoofing approaches have been actively developed and applied in real-world applications. One of the main problems of fingerprint anti-spoofing is that it is not robust to unseen samples, especially in real-world scenarios. A possible solution is to generate artificial but realistic fingerprint samples and use them for training in order to achieve good generalization. This paper contains experimental and comparative results with currently popular GAN-based methods and uses realistic synthesis of fingerprints in training in order to increase performance. Among various GAN models, the most popular, StyleGAN, is used for the experiments. The CNN models were first trained with a dataset that did not contain generated fake images, and the accuracy along with the mean average error rate were recorded. Then, the generated fake images (fake images of live fingerprints and fake images of spoof fingerprints) were each combined with the original images (real images of live fingerprints and real images of spoof fingerprints), and various CNN models were trained. For each CNN model trained with the dataset containing generated fake images, the best performance was recorded, along with the accuracy and the mean average error rate. We observe that current GAN-based approaches need significant improvements in anti-spoofing performance, although the overall quality of the synthesized fingerprints seems to be reasonable. We include an analysis of this performance degradation, especially with a small number of samples. In addition, we suggest several approaches towards improved generalization with a small number of samples, by focusing on what GAN-based approaches should learn and should not learn.
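
A minimal sketch of the data-combination step (hypothetical folder layout and a stand-in ResNet-18; not the authors' exact CNN models or StyleGAN pipeline):

```python
# Minimal sketch: merge real live/spoof images with StyleGAN-generated fakes
# into one training set and run a single training step of a CNN classifier.
import torch
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms, models

tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

real = datasets.ImageFolder("data/real", transform=tfm)          # hypothetical classes: live/, spoof/
generated = datasets.ImageFolder("data/stylegan", transform=tfm)  # same two classes, GAN images
train = ConcatDataset([real, generated])
loader = DataLoader(train, batch_size=32, shuffle=True)

model = models.resnet18(num_classes=2)          # stand-in for one of the "various CNN models"
criterion = torch.nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for images, labels in loader:                   # one step shown; full training loops over epochs
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    break
```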

Keywords: anti-spoofing, CNN, fingerprint recognition, GAN

Procedia PDF Downloads 173
642 Structural Damage Detection via Incomplete Model Data Using Output Data Only

Authors: Ahmed Noor Al-qayyim, Barlas Özden Çağlayan

Abstract:

Structural failure is caused mainly by damage that often occurs in structures. Many researchers focus on obtaining very efficient tools to detect damage in structures at an early stage. In the past decades, a subject that has received considerable attention in the literature is damage detection based on variations in the dynamic characteristics or response of structures. This study presents a new damage identification technique. The technique detects the damage location in an incomplete structural system using output data only. The method indicates the damage based on free vibration test data by using the "Two Points - Condensation (TPC) technique". This method creates a set of matrices by reducing the structural system to two-degree-of-freedom systems. The current stiffness matrices are obtained from optimization of the equation of motion using the measured test data. The current stiffness matrices are compared with the original (undamaged) stiffness matrices. High percentage changes in the matrices' coefficients lead to the location of the damage. The TPC technique is applied to the experimental data of a simply supported steel beam model structure after inducing a thickness change in one element. Two cases are considered, and the method detects the damage and determines its location accurately in both. In addition, the results illustrate that these changes in the stiffness matrix can be a useful tool for continuous monitoring of structural safety using ambient vibration data. Furthermore, its efficiency proves that this technique can also be used for large structures.
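
A minimal sketch of the comparison step with illustrative 2-DOF condensed stiffness matrices (not the beam data from the study); large percentage changes in the coefficients point to the damage location:

```python
# Minimal sketch: compare identified ("current") stiffness matrices with the
# undamaged baseline and flag coefficients whose change exceeds a threshold.
import numpy as np

K_original = np.array([[2.0e7, -1.0e7],
                       [-1.0e7, 2.0e7]])   # undamaged 2-DOF condensed stiffness (illustrative)
K_current = np.array([[2.0e7, -1.0e7],
                      [-1.0e7, 1.6e7]])    # identified from measured free-vibration data (illustrative)

change_pct = 100.0 * np.abs(K_current - K_original) / np.abs(K_original)
damaged = np.argwhere(change_pct > 10.0)    # hypothetical 10% threshold
print(change_pct)
print("coefficients indicating damage:", damaged.tolist())
```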

Keywords: damage detection, optimization, signals processing, structural health monitoring, two points–condensation

Procedia PDF Downloads 345
641 Inherited Eye Diseases in Africa: A Scoping Review and Strategy for an African Longitudinal Eye Study

Authors: Bawa Yusuf Muhammad, Musa Abubakar Kana, Aminatu Abdulrahman, Kerry Goetz

Abstract:

Background: Inherited eye diseases are disorders that globally affect 1 in 1,000 people. The six main world populations have created databases containing information on eye genotypes. Aim: The aim of the scoping review was to mine and present the available information to date on the genetics of inherited eye diseases within the African continent. Method: The literature search strategy was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). PubMed and Google Scholar were searched for articles on inherited eye diseases from inception to 20th June 2022. Both original and review articles that report on inherited, genetic or developmental/congenital eye diseases within the African continent were included in the review. Results: A total of 1162 citations were obtained, but only 37 articles were reviewed based on the inclusion and exclusion criteria. The highest output of publications on inherited eye diseases comes from South Africa and Tunisia (about 43%), followed by Morocco and Egypt (27%), then Sub-Saharan Africa and North Africa (13.5%), while the remaining articles (16.5%) originated from Nigeria, Ghana, Mauritania, Cameroon, Zimbabwe, and a combined article from Zimbabwe and Cameroon. Glaucoma and inherited retinal disorders represent the most studied diseases, followed by albinism and congenital cataracts. Conclusion: Despite the growing research from Tunisia, Morocco, Egypt and South Africa, Sub-Saharan Africa remains largely unexplored with respect to the genetics of eye diseases.

Keywords: inherited eye diseases, Africa, scoping review, longitudinal eye study

Procedia PDF Downloads 38
640 Message Authentication Scheme for Vehicular Ad-Hoc Networks under Sparse RSUs Environment

Authors: Wen Shyong Hsieh, Chih Hsueh Lin

Abstract:

In this paper, we combine the concepts of the chameleon hash function (CHF) and identity-based cryptography (IBC) to build a message authentication environment for VANETs under sparse RSUs. Based on the CHF, the trusted authority (TA) keeps two common secrets that are embedded in all identities as evidence of mutual trust. The TA issues one original identity to every RSU and vehicle. An identity contains one public ID and one private key. The public ID, which includes three components (pseudonym, random key, and public key), is used to represent one entity and can be verified as legitimate. The private key is used to claim ownership of the public ID. Based on the concept of IBC, and without any negotiation process, a CHF pairing key derived by multiplying one's own private key with the other party's public key is used for mutual trust and serves as the session key for secure communication between RSUs and vehicles. To help vehicles authenticate messages, the RSUs respond to the vehicles' temporary identity requests using two short-time secrets broadcast by the TA. To lighten the request load, one day is divided into M time slots. At every time slot, the TA broadcasts two short-time secrets to all valid RSUs for that time slot. Any RSU can respond to the temporary identity request from legitimate vehicles. From the collected announcements of public IDs from neighboring vehicles, a vehicle can set up its neighbor set, which includes each neighbor's temporary public ID and the temporary CHF pairing key derived from its own private key and the neighbor's public key; these are used for message authentication or secure communication without the help of an RSU.
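
The underlying idea that both parties derive the same pairing key from their own private key and the other's public component, without negotiation, can be illustrated with a plain Diffie-Hellman sketch (toy parameters only; the paper's chameleon-hash construction and TA-issued identities are not reproduced here):

```python
# Minimal sketch of non-interactive key agreement: each side combines its own
# private key with the other's public value and both obtain the same session key.
# Toy parameters for illustration only; real deployments use standardized groups.
p = 2**127 - 1   # a Mersenne prime used as a small illustrative modulus
g = 3            # illustrative generator

priv_vehicle, priv_rsu = 0x1234567890ABCDEF, 0xFEDCBA0987654321   # hypothetical private keys
pub_vehicle = pow(g, priv_vehicle, p)   # public component carried in the vehicle's ID
pub_rsu = pow(g, priv_rsu, p)           # public component carried in the RSU's ID

key_at_vehicle = pow(pub_rsu, priv_vehicle, p)   # vehicle: own private key + RSU public value
key_at_rsu = pow(pub_vehicle, priv_rsu, p)       # RSU: own private key + vehicle public value
assert key_at_vehicle == key_at_rsu              # same key on both sides, no negotiation needed
```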

Keywords: Internet of Vehicles (IOV), Vehicular Ad-hoc Networks (VANETs), Chameleon Hash Function (CHF), message authentication

Procedia PDF Downloads 374
639 Determination of the Stability of Haloperidol Tablets and Phenytoin Capsules Stored in the Inpatient Dispensary System (Swisslog) by the Respective HPLC and Raman Spectroscopy Assay

Authors: Carol Yue-En Ong, Angelina Hui-Min Tan, Quan Liu, Paul Chi-Lui Ho

Abstract:

A public general hospital in Singapore has recently implemented an automated unit-dose machine, Swisslog, in its inpatient dispensary, with the objective of reducing human error and improving patient safety. However, a concern about stability arises as tablets are removed from their original packaging (bottled loose tablets/capsules) and repackaged into individual, clear plastic wrappers as unit doses in the system. Drugs that are light-sensitive and hygroscopic would be more susceptible to degradation, as the wrapper does not offer full protection. Hence, this study was carried out to examine the stability of haloperidol tablets and phenytoin capsules, which are light-sensitive and hygroscopic, respectively. Validated HPLC-UV assays were first established for quantification of these two compounds. The medications involved were placed in the Swisslog and sampled every week for one month. The collected data were analysed and showed no degradation over time. This study also explored an alternative approach for drug stability determination: Raman spectroscopy. The advantage of Raman spectroscopy is its high time efficiency and non-destructive nature. The results suggest that drug degradation can indeed be detected using Raman microscopy, but further research is needed to establish this approach for quantification or qualification of compounds. NanoRam®, a portable Raman spectrometer, was also used alongside Raman microscopy but was unsuccessful in detecting degradation in this study.

Keywords: drug stability, haloperidol, HPLC, phenytoin, Raman spectroscopy, Swisslog

Procedia PDF Downloads 322
638 Object Oriented Fault Tree Analysis Methodology

Authors: Yi Xiong, Tao Kong

Abstract:

Traditional safety, risk, and reliability analysis approaches are problem-oriented, which creates a great workload when analyzing complicated and large systems; moreover, much repetitive work must be done if the analyzed system is composed of many similar components. There is a pressing need for an object- and function-oriented approach that maintains high consistency with the problem domain. A new approach is proposed to overcome these shortcomings of traditional approaches: the concepts of class, abstraction, inheritance, polymorphism, and encapsulation are introduced into fault tree analysis (FTA), and a professional class library is established containing the abstractions of physical objects in the real world; relevant information from four areas is also proposed as a guide for establishing the library. The interaction between classes is completed by internal or external methods that map attributes to basic events through a full search of the knowledge base, which provides good encapsulation. An object-oriented fault tree analysis system is set up that analyzes and evaluates system safety and reliability according to the original appearance of the problem, where classes and objects map directly onto the problem domain of the fault tree analysis. All system failure situations can be analyzed through this bottom-up fault tree construction approach. Under this architecture, an FTA approach is developed that avoids the influence of the analyst on the analysis results. It reveals the inherent safety problems of the analyzed system itself and provides a new way of thinking and development for safety analysis, which is conducive to the application of object-oriented technology in the safety field and to innovation in safety theory.
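
A minimal sketch of the object-oriented idea (illustrative classes only, not the authors' class library or knowledge base):

```python
# Minimal sketch: basic events and gates as classes; similar components are
# added by inheritance instead of re-modelling every instance from scratch.
class Event:
    def probability(self) -> float:
        raise NotImplementedError

class BasicEvent(Event):
    def __init__(self, name: str, p: float):
        self.name, self.p = name, p
    def probability(self) -> float:
        return self.p

class Gate(Event):
    def __init__(self, kind: str, children):
        self.kind, self.children = kind, children    # kind: "AND" or "OR"
    def probability(self) -> float:
        probs = [c.probability() for c in self.children]
        if self.kind == "AND":
            out = 1.0
            for p in probs:
                out *= p
            return out
        out = 1.0                                     # OR gate, assuming independent events
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

class PumpFailure(BasicEvent):                        # a reusable component class via inheritance
    def __init__(self, name: str):
        super().__init__(name, p=1e-3)

top = Gate("OR", [Gate("AND", [PumpFailure("pump A"), PumpFailure("pump B")]),
                  BasicEvent("control fault", 5e-4)])
print(f"top event probability: {top.probability():.2e}")
```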

Keywords: FTA, knowledge base, object-oriented technology, reliability analysis

Procedia PDF Downloads 234
637 Forest Risk and Vulnerability Assessment: A Case Study from East Bokaro Coal Mining Area in India

Authors: Sujata Upgupta, Prasoon Kumar Singh

Abstract:

The expansion of large-scale coal mining into forest areas is a potential hazard for local biodiversity and wildlife. The objective of this study is to provide a picture of the threat that coal mining poses to the forests of the East Bokaro landscape. The vulnerable forest areas at risk have been assessed, and the priority areas for conservation have been presented. The forested areas at risk in the current scenario have been assessed and compared with past conditions using a classification and buffer-based overlay approach. Forest vulnerability has been assessed using an analytical framework based on systematic indicators and composite vulnerability index values. The results indicate that more than 4 km² of forests have been lost from 1973 to 2016. Large patches of forests have been diverted for coal mining projects. Forests in the northern part of the coalfield within a 1-3 km radius around the coal mines are at immediate risk. The original contiguous forests have been converted into fragmented and degraded forest patches. Most of the collieries are located within or very close to the forests, thus threatening the biodiversity and hydrology of the surrounding regions. Based on the vulnerability values estimated, it was concluded that more than 90% of the forested grids in East Bokaro are highly vulnerable to mining. The forests in the sub-districts of Bermo and Chandrapura have been identified as the most vulnerable to coal mining activities. This case study would add to the capacity of forest managers and mine managers to address the risk and vulnerability of forests at a small landscape level in order to achieve sustainable development.

Keywords: forest, coal mining, indicators, vulnerability

Procedia PDF Downloads 374
636 Recontextualisation of Political Discourse: A Case Study of Translation of News Stories

Authors: Hossein Sabouri

Abstract:

News stories, as one of the branches of political discourse, have always been regarded as a sensitive and challenging area. Political translators often encounter struggles that are vitally important when it comes to dealing with the political tension between the source culture and the target one. Translating news stories is of prime importance since it has widespread availability and the power of defining or even changing the facts. News translation is usually more than a straight transfer of the source text. Like the original text, which endeavors to manipulate readers' minds by imposing its ideology, the translated text seeks to change these ideologies under the influence of ideological power. In other words, the translation product is considered no more than a recontextualisation of the source text. The present study examines possible criteria for changes occurring in the translation process of news stories based on the ideological and political stance of the translator, using theories of 'critical discourse analysis' and 'translation and power'. Fairclough investigates the ideological issues in (political) discourse, and Tymoczko studies the political and power-related engagement of the translator in the process of translation. The incorporation of Fairclough's and Gentzler and Tymoczko's theories paves the way for the researcher to look at the ideological power position of the translator. Data collection and analysis have been accomplished using 17 political-text samples taken from online news agencies which are related to 'Iran's Nuclear Program'. Based on the findings, recontextualisation is mainly observed in terms of the strategies of substitution, omission, and addition in the translation process. The results of the study suggest that there is a significant relationship between the translation of political texts and the ideologies of the target culture.

Keywords: news translation, recontextualisation, ideological power, political discourse

Procedia PDF Downloads 173
635 Effects of the Supplementary for Understanding and Preventing Plagiarism on EFL Students’ Writing

Authors: Surichai Butcha, Dararat Khampusaen

Abstract:

As the Internet is recognized as a high-potential and powerful educational tool for accessing sources of knowledge, plagiarism is an increasingly common unethical issue found in students' writing. This paper derives from the first phase of an ongoing study investigating the effects of the supplementary on citing sources on undergraduate students' writing. The 40 participants were divided into one experimental group and one control group. Both groups were administered a questionnaire on knowledge and an interview on attitudes related to using sources in writing. Only the experimental group undertook the four lessons focusing on using outside sources and citing the original work (quoting, synthesizing, summarizing and paraphrasing), which were delivered via e-learning tools throughout a semester. Participants were required to produce four writing tasks after each lesson. The survey results concerned the types of, and factors in, the use of outside sources in the writing of Thai undergraduate EFL students. The interview results supported and clarified the survey results. In addition, the writing rubrics confirmed the types of plagiarism that frequently occurred in students' writing. The interview results also revealed the types of and factors in plagiarism, including the students' perceptions of using outside sources in their writing. The discussion sheds light on the cultural dimensions of plagiarism in student writing and on the roles of teachers, the library, and university policy in the rate of plagiarism. The findings also promote awareness of ethics in writing and help prevent potential unintentional plagiarism. Additionally, the results of this phase of the study could inform the content to be considered for inclusion in the supplementary on using sources for writing in future research.

Keywords: citing source, EFL writing, e-learning, Internet, plagiarism

Procedia PDF Downloads 135
634 Method to Find a ε-Optimal Control of Stochastic Differential Equation Driven by a Brownian Motion

Authors: Francys Souza, Alberto Ohashi, Dorival Leao

Abstract:

We present a general solution for finding the ε-optimal controls for non-Markovian stochastic systems described by stochastic differential equations driven by Brownian motion, a problem recognized as difficult to solve. The contribution lies in the development of mathematical tools to deal with the modeling and control of non-Markovian systems, whose applicability in different areas is well known. The methodology consists of discretizing the problem through a random discretization. In this way, we transform an infinite-dimensional problem into a finite-dimensional one; thereafter, we use measurable selection arguments to find a control in explicit form for the discretized problem. Then, we prove that the control found for the discretized problem is an ε-optimal control for the original problem. Our theory provides a concrete description of a rather general class of problems; among the principal ones, we can highlight financial problems such as portfolio control, hedging, super-hedging, pairs trading, and others. Therefore, our main contribution is the development of a tool to construct explicitly the ε-optimal control for non-Markovian stochastic systems. The pathwise analysis, made through a random discretization jointly with measurable selection arguments, has provided us with a structure to transform an infinite-dimensional problem into a finite-dimensional one. The theory is applied to stochastic control problems based on path-dependent stochastic differential equations, where both drift and diffusion components are controlled. We are able to exhibit the optimal control explicitly with our method.
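
For reference, the controlled path-dependent SDE and the standard ε-optimality criterion take the following form (the cost functional J and the admissible set are those of the control problem; their precise definitions are not reproduced from the paper):

```latex
% Controlled path-dependent SDE (drift and diffusion both controlled) and the
% standard epsilon-optimality criterion satisfied by the constructed control u^\varepsilon:
dX_t = b\big(t, X_{\cdot \wedge t}, u_t\big)\,dt + \sigma\big(t, X_{\cdot \wedge t}, u_t\big)\,dB_t,
\qquad
J(u^{\varepsilon}) \;\le\; \inf_{u \in \mathcal{U}} J(u) + \varepsilon, \quad \varepsilon > 0.
```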

Keywords: dynamic programming equation, optimal control, stochastic control, stochastic differential equation

Procedia PDF Downloads 170
633 Color Image Compression/Encryption/Contour Extraction using 3L-DWT and SSPCE Method

Authors: Ali A. Ukasha, Majdi F. Elbireki, Mohammad F. Abdullah

Abstract:

Data security is needed in data transmission, storage, and communication. This paper is divided into two parts. This work addresses the color image, which is decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel pixels as an image scrambling process. All of these channels are then encrypted separately using a key image that has the same size as the original and is generated using private keys and modulo operations. X-OR and modulo operations are performed between the encrypted channel images in order to change the image pixel values. The contours extracted from the recovered color images can be obtained with an accepted level of distortion using the single-step parallel contour extraction (SSPCE) method. Experiments have demonstrated that the proposed algorithm can fully encrypt 2D color images and that they can be completely reconstructed without any distortion. It is also shown that the analyzed algorithm offers extremely strong security against attacks such as salt-and-pepper noise and JPEG compression. This proves that color images can be protected at a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
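
Two of the building blocks, Arnold scrambling and XOR with a key image, can be sketched as follows (random stand-in data; the DWT compression, modulo operations, and SSPCE contour step are omitted):

```python
# Minimal sketch: Arnold cat-map scrambling of an N x N channel followed by
# XOR encryption with a same-size key image; XOR is its own inverse.
import numpy as np

rng = np.random.default_rng(0)
red = rng.integers(0, 256, (64, 64), dtype=np.uint8)        # stand-in for the red channel
key_image = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # key image of the same size

def arnold(channel: np.ndarray, iterations: int = 1) -> np.ndarray:
    """Scramble pixel locations with the Arnold cat map (x, y) -> (x + y, x + 2y) mod N."""
    n = channel.shape[0]
    out = channel
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

scrambled = arnold(red, iterations=5)
cipher = scrambled ^ key_image        # encrypt: XOR with the key image
recovered = cipher ^ key_image        # decrypt: XOR again with the same key image
assert np.array_equal(recovered, scrambled)
```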

Keywords: SSPCE method, image compression and salt and pepper attacks, bitplanes decomposition, Arnold transform, color image, wavelet transform, lossless image encryption

Procedia PDF Downloads 501
632 Revitalising Warsaw: The Significance of Incorporating 18th Century Art in Post-War Architecture Reconstruction

Authors: Aleksandra Kondraciuk

Abstract:

The reconstruction of post-war architecture in Warsaw is an important and complex project that requires physical restoration and cultural preservation. The incorporation of 18th-century art within the renovated structures of the urban area forms a crucial aspect of the reconstruction procedure. Information was gathered by interviewing current residents, examining additional data, and researching archival materials. This form of art once flourished in Warsaw, then a thriving cultural centre, and played a significant role in its history. Adding it to the rebuilt structures links them to the city's vibrant past, making them more meaningful for locals and visitors. The reconstructed buildings showcase 18th-century art forms, including sketches, drawings, and paintings, accurately replicating the original buildings' architectural details and decorative elements. These art forms elevate the buildings from mere functional spaces to works of art themselves, thus augmenting the beauty and distinctiveness of the city and setting it apart from other cities worldwide. Furthermore, this art form symbolises the city's tenacity in the face of adversity and destruction. Revitalising Warsaw requires rebuilding its physical structures, restoring its cultural identity, and preserving its rich history. Incorporating 18th-century art into the post-war architectural reconstruction process is a powerful way to achieve these goals and maintain the city. This approach acknowledges the city's history and cultural significance, fostering a sense of continuity between the past and present, which is crucial for the city's future growth and prosperity.

Keywords: 18th-century art, building reconstruction, cultural preservation, post-war architecture

Procedia PDF Downloads 56
631 Optimizing Nature Protection and Tourism in Urban Parks

Authors: Milena Lakicevic

Abstract:

The paper deals with the problem of optimizing management options for urban parks within different scenarios of nature protection and tourism importance. The procedure is demonstrated on a case study of urban parks in Novi Sad (Serbia). Six management strategies for the selected area have been processed with the decision support method PROMETHEE. The two criteria used for the evaluation were nature protection and tourism, and each of them was divided into a set of indicators: for nature protection, these were biodiversity and preservation of the original landscape, while for tourism they were recreation potential, aesthetic values, accessibility, and cultural features. It was assumed that each indicator in a set is equally important to the corresponding criterion. In this way, the research focused on a sensitivity analysis of the criteria weights. In other words, the weights of the indicators were fixed and the weights of the criteria were altered along the entire scale (from 0 to 1), and the assessment was performed in a two-dimensional setting. As a result, one can conclude which management strategy would be the most appropriate as the importance of the criteria changes. The final ranking of management alternatives was followed up by investigating the mean PROMETHEE Φ values for all options considered when altering the importance of nature protection/tourism. This type of analysis enabled detecting an alternative with solid performance along the entire scale, i.e., regardless of criteria importance. That management strategy can be seen as a compromise solution when the weights of the criteria are not defined. In conclusion, it can be said that, in some cases, instead of fixing the criteria importance, it is important to test the outputs for different schemes of criteria weighting. The research demonstrates the state of the final decision when the decision maker can estimate criteria importance, but also in cases when the importance of the criteria is not established or known.
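
A minimal sketch of the weight-sensitivity idea with the usual preference function and hypothetical scores for three strategies (the paper's six strategies and indicator values are not reproduced):

```python
# Minimal sketch: PROMETHEE II net flows recomputed while the weight of the
# nature-protection criterion is swept from 0 to 1 (tourism gets the remainder).
import numpy as np

scores = np.array([[0.8, 0.3],    # strategy A: [nature protection, tourism]
                   [0.5, 0.6],    # strategy B
                   [0.2, 0.9]])   # strategy C

def promethee_net_flows(scores: np.ndarray, weights: np.ndarray) -> np.ndarray:
    n = scores.shape[0]
    pi = np.zeros((n, n))
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            pref = (scores[a] > scores[b]).astype(float)   # usual criterion: P = 1 if strictly better
            pi[a, b] = np.dot(weights, pref)               # weighted aggregated preference
    phi_plus = pi.sum(axis=1) / (n - 1)                    # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)                   # entering flow
    return phi_plus - phi_minus                            # net flow Phi

for w_nature in np.linspace(0.0, 1.0, 5):
    phi = promethee_net_flows(scores, np.array([w_nature, 1.0 - w_nature]))
    print(f"w_nature={w_nature:.2f}  net flows={np.round(phi, 3)}")
```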

Keywords: criteria weights, PROMETHEE, sensitivity analysis, urban parks

Procedia PDF Downloads 173
630 Improving the Global Competitiveness of SMEs by Logistics Transportation Management: Case Study Chicken Meat Supply Chain

Authors: P. Vanichkobchinda

Abstract:

Open Vehicle Routing (OVR), a logistics transportation technique, is an approach to transportation cost reduction, especially for long-distance pickup and delivery nodes. The outstanding characteristic of OVR is that the route's starting node and ending node are not necessarily the same, as they are in typical vehicle routing problems. This advantage enables the routing to flow continuously, and the vehicle does not always return to its home base. This research aims to develop a heuristic for the open vehicle routing problem with pickup and delivery under time window and loading capacity constraints to minimize the total distance. The proposed heuristic is developed based on the insertion method, which is simple, suitable for rapid calculation, and allows insertion of new additional transportation requirements along the original routes. Cost comparisons between the proposed heuristic and the nearest neighbor method that the companies are currently using show that the insertion heuristic achieves an average cost saving of 34.3 percent. Moreover, the proposed heuristic gave superior solutions in all types of test problems. In conclusion, the proposed heuristic can effectively and efficiently solve the open vehicle routing problem.
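
A minimal sketch of cheapest insertion for open routes (capacity constraint only; the time-window and pickup-and-delivery pairing constraints of the full heuristic are omitted, and the node data are hypothetical):

```python
# Minimal sketch: insert each customer into the open route and position that
# adds the least distance; open routes end at the last customer (no return leg).
import math

depot = (0.0, 0.0)
customers = {1: (2.0, 1.0), 2: (5.0, 2.0), 3: (6.0, 6.0), 4: (1.0, 5.0)}
demand = {1: 4, 2: 3, 3: 5, 4: 2}
capacity = 8

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_cost(route):
    pts = [depot] + [customers[c] for c in route]   # open route: no return to the depot
    return sum(dist(pts[i], pts[i + 1]) for i in range(len(pts) - 1))

routes = []
for c in sorted(customers, key=lambda c: -demand[c]):   # seed with the largest demands first
    best = None
    for r in routes:
        if sum(demand[x] for x in r) + demand[c] > capacity:
            continue
        for pos in range(len(r) + 1):
            trial = r[:pos] + [c] + r[pos:]
            extra = route_cost(trial) - route_cost(r)   # incremental cost of this insertion
            if best is None or extra < best[0]:
                best = (extra, r, trial)
    if best is None:
        routes.append([c])                              # open a new vehicle/route
    else:
        routes[routes.index(best[1])] = best[2]

print(routes, [round(route_cost(r), 2) for r in routes])
```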

Keywords: business competitiveness, cost reduction, SMEs, logistics transportation, VRP

Procedia PDF Downloads 671
629 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Fully encrypting a large volume of messages results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We have implemented four classification algorithms to determine the importance level value within each XML document. Classified content is processed using element-wise encryption for selected parts with "High", "Medium" or "Low" importance level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm to overcome the problem of computational overhead, in which the SubBytes and ShiftRows steps remain as in the original AES while the MixColumns operation is replaced by a 128-bit permutation operation followed by the AddRoundKey operation. An implementation has been conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in the processing time for encrypting XML documents.
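
A minimal sketch of the element-wise idea using the standard AES-based Fernet recipe (the classification algorithms and the modified AES round structure are the paper's own and are not reproduced; the importance labels below are assigned by hand for illustration):

```python
# Minimal sketch: encrypt only the XML elements whose classified importance is
# High or Medium, leaving low-importance elements in the clear.
import xml.etree.ElementTree as ET
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

doc = ET.fromstring(
    "<transaction><account>1234567890</account>"
    "<amount>250.00</amount><branch>Downtown</branch></transaction>"
)
importance = {"account": "High", "amount": "High", "branch": "Low"}  # stand-in for classifier output

for element in doc:
    if importance.get(element.tag) in ("High", "Medium"):   # element-wise encryption of selected parts
        element.text = cipher.encrypt(element.text.encode()).decode()
        element.set("encrypted", "true")

print(ET.tostring(doc, encoding="unicode"))
```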

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 389
628 Theatrical Architecture in Bologna at the Beginning of the Twentieth Century: The Renaissance of Modernissimo Cinema

Authors: Giorgia Predari, Riccardo Gulli

Abstract:

The paper describes the history and the stylistic choices adopted in the construction of Palazzo Ronzani in Bologna, which was the first building to rise after the heavy demolitions carried out in the historical center of the city at the beginning of the twentieth century. In 1910, the local administration adopted a detailed plan to change the appearance of the city, as was already happening in the main European capitals. In this context, starting from 1911, the architect and scenographer Gualtiero Pontoni designed for Alessandro Ronzani, the owner of a well-known Bolognese beer company, his Palazzo, which is listed among the first multifunctional buildings in Bologna, containing offices, commercial activities, and entertainment spaces. In an area of about 2000 m², the architect was able to propose a theatre with a capacity of 2000 seats in the basement; shops, a café-chantant and a restaurant on the ground floor; clubs, studios and commercial stores on the mezzanine and the first floor; and a hotel on the upper floors. The whole core of the building, at the underground levels, consisted of a reinforced concrete frame (one of the first examples of this type of construction in the city), which allowed the hall to have a free span of 11 x 12 meters and a height of about 9 meters. Used until 2007 as a cinema, the hall then remained in disuse for almost 10 years, but an important functional restoration project with a strong architectural and scenographic value is now taking place. It will bring the spaces back to their original geometries, in a historical and artistic condition inspired by the styles of the early twentieth century.

Keywords: Modernissimo, Palazzo Ronzani, liberty, Bologna

Procedia PDF Downloads 105
627 Liver and Liver Lesion Segmentation From Abdominal CT Scans

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases. Precise liver segmentation in abdominal CT images is one of the most important steps for the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method is presented for liver and liver lesion segmentation in medical image data using mathematical morphology. Our algorithm proceeds in two parts. In the first, we seek to determine the region of interest by applying morphological filters to extract the liver. The second step consists of detecting the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions. Our proposed method is based on anatomical information and the mathematical morphology tools used in the image processing field. At first, we try to improve the quality of the original image and the image gradient by applying a spatial filter followed by morphological filters. The second step consists of calculating the internal and external markers of the liver and hepatic lesions. Thereafter, we segment the liver and hepatic lesions using the watershed transform controlled by markers. The validation of the developed algorithm is performed using several images. The obtained results show the good performance of our proposed algorithm.
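
A minimal sketch of marker-controlled watershed segmentation on a synthetic slice using scikit-image (the anatomical priors and lesion-marker construction of the proposed method are not reproduced):

```python
# Minimal sketch: spatial + morphological filtering, gradient computation, then
# marker-controlled watershed on a synthetic 2D "slice" (a bright disc on a dark background).
import numpy as np
from skimage import filters, morphology, segmentation

slice_ = np.zeros((128, 128))
rr, cc = np.ogrid[:128, :128]
slice_[(rr - 64) ** 2 + (cc - 64) ** 2 < 40 ** 2] = 1.0          # stand-in for the liver region
slice_ += np.random.default_rng(0).normal(0, 0.1, slice_.shape)  # acquisition noise

smoothed = filters.gaussian(slice_, sigma=2)                     # spatial filtering step
smoothed = morphology.opening(smoothed, morphology.disk(3))      # morphological filtering step
gradient = filters.sobel(smoothed)                               # image gradient

markers = np.zeros_like(slice_, dtype=int)                       # internal/external markers from thresholds
markers[smoothed > 0.8] = 2    # internal (object) marker
markers[smoothed < 0.2] = 1    # external (background) marker

labels = segmentation.watershed(gradient, markers)               # watershed controlled by markers
liver_mask = labels == 2
print("segmented pixels:", int(liver_mask.sum()))
```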

Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm

Procedia PDF Downloads 433
626 Bayesian Locally Approach for Spatial Modeling of Visceral Leishmaniasis Infection in Northern and Central Tunisia

Authors: Kais Ben-Ahmed, Mhamed Ali-El-Aroui

Abstract:

This paper develops a Local Generalized Linear Spatial Model (LGLSM) to describe the spatial variation of Visceral Leishmaniasis (VL) infection risk in northern and central Tunisia. The response for each region is the number of affected children under five years of age recorded from 1996 through 2006 by Tunisian pediatric departments and treated as Poisson county-level data. The model includes climatic factors, namely averages of annual rainfall, extreme values of low temperatures in winter and high temperatures in summer to characterize the climate of each region according to a continentality index, the pluviometric quotient of Emberger (Q2) to characterize bioclimatic regions, and a component for residual extra-Poisson variation. The statistical results show a progressive increase in the number of affected children in regions with a high continentality index and low mean yearly rainfall. On the other hand, an increase in the pluviometric quotient of Emberger contributed to a significant increase in the VL incidence rate. When compared with the original GLSM, the Bayesian local model is an improvement and gives a better approximation of the Tunisian VL risk estimation. For Bayesian inference, we use vague priors for all model parameters and the Markov chain Monte Carlo method.

Keywords: generalized linear spatial model, local model, extra-Poisson variation, continentality index, visceral leishmaniasis, Tunisia

Procedia PDF Downloads 385
625 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation of the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi², with more than 9 wells penetrating the reservoir. The seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula. The average water saturation for the whole field is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells, using well data and structural maps created from the seismic data, revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south of the field. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramidal rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 billion standard cubic feet (BSCF) and 630 BSCF, respectively.
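
For reference, the standard forms of Archie's equation and the volumetric gas-in-place calculation are shown below (the Archie constants a, m, n and the gas formation volume factor B_g used in the study are not given in the abstract):

```latex
% Archie water saturation and volumetric original gas in place (A in acres,
% h net pay in ft, G in standard cubic feet, B_g in reservoir cf per scf):
S_w \;=\; \left( \frac{a\,R_w}{\phi^{m}\,R_t} \right)^{1/n},
\qquad
G \;=\; \frac{43{,}560\; A\, h\, \phi\, (1 - S_w)}{B_g}.
```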

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 135
624 Development of Methods for Plastic Injection Mold Weight Reduction

Authors: Bita Mohajernia, R. J. Urbanic

Abstract:

Mold making techniques have focused on meeting the customers' functional and process requirements; however, today, molds are increasing in size and sophistication, and are difficult to manufacture, transport, and set up due to their size and mass. Presently, mold weight saving techniques focus on pockets to reduce the mass of the mold, but the overall size is still large, which introduces costs related to the stock material purchase, the processing time for process planning, machining and validation, and excess waste materials. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use Finite Element Analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project will reduce manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods to provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing the weight using the OptiStruct software. The optimized CAD model is compared with the primary geometry of the mold from the NX software. The results of optimization show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.

Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction

Procedia PDF Downloads 276
623 Criminal Law Instruments to Counter Corporate Crimes in Poland

Authors: Dorota Habrat

Abstract:

In Polish law, the idea of introducing corporate responsibility for crimes is becoming more popular and raises many questions. The need to introduce the liability of corporate bodies (collective entities) into the Polish legal system has resulted, among other things, from the Republic of Poland's international commitments, in particular those related to membership in the European Union. The Act of 28 October 2002 on the liability of collective entities for acts prohibited under penalty is one example of the adaptation of Polish law to Community law. The introduction into Polish law of liability of a criminal nature for corporations (legal persons) has resulted in much controversy and a lack of acceptance from both the scientific community and the judiciary. The responsibility of collective entities under the Act has a criminal nature. The main question concerns the ability of a collective entity to be found guilty in the criminal-law sense. Polish criminal law knows only the responsibility of individual persons. So far, guilt as a personal feature of an act, based on the offender's capacity for psychological experience, could be considered only in relation to an individual person, while the said Act overturned this conviction. The guilt of a collective entity must be proven in at least one of three possible forms: guilt in selection, guilt in supervision, or so-called organizational guilt. The next question is how the principle of proportionality should be considered in relation to criminal measures applied to collective entities. It should be remembered that the legal subjectivity of collective entities, including their rights and freedoms, is an emanation of the rights and freedoms of the individual persons who create collective entities and implement their rights and freedoms through them. The adopted Act largely reflects international legal regulations but also contains novel and original legislative solutions.

Keywords: criminal corporate responsibility, Polish criminal law, legislative solutions, Act of 28 October 2002

Procedia PDF Downloads 489
622 Data Augmentation for Early-Stage Lung Nodules Using Deep Image Prior and Pix2pix

Authors: Qasim Munye, Juned Islam, Haseeb Qureshi, Syed Jung

Abstract:

Lung nodules are commonly identified in computed tomography (CT) scans by experienced radiologists at a relatively late stage. Early diagnosis can greatly increase survival. We propose using a pix2pix conditional generative adversarial network to generate realistic images simulating early-stage lung nodule growth. We applied deep image prior to 2341 slices from 895 computed tomography (CT) scans from the Lung Image Database Consortium (LIDC) dataset to generate pseudo-healthy medical images. From these images, 819 were chosen to train a pix2pix network. We observed that for most of the images, the pix2pix network was able to generate images where the nodule increased in size and intensity across epochs. To evaluate the images, 400 generated images were chosen at random and shown to a medical student beside their corresponding original image. Of these 400 generated images, 384 were defined as satisfactory, meaning they resembled a nodule and were visually similar to the corresponding image. We believe that this generated dataset could be used as training data for neural networks to detect lung nodules at an early stage or to improve the accuracy of such networks. This is particularly significant as datasets containing the growth of early-stage nodules are scarce. This project shows that the combination of deep image prior and generative models could potentially open the door to creating larger datasets than currently possible and has the potential to increase the accuracy of medical classification tasks.
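
A minimal sketch of the pix2pix generator objective, an adversarial loss plus a weighted L1 term on paired pseudo-healthy/original slices (tiny stand-in networks and random tensors; not the authors' architecture or training configuration):

```python
# Minimal sketch: one generator update of a conditional GAN trained on paired slices.
import torch
import torch.nn as nn

# Tiny stand-ins for the pix2pix U-Net generator and PatchGAN discriminator.
G = nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.Conv2d(16, 1, 3, padding=1))
D = nn.Sequential(nn.Conv2d(2, 16, 3, padding=1), nn.LeakyReLU(0.2), nn.Conv2d(16, 1, 3, padding=1))

healthy = torch.randn(4, 1, 64, 64)    # pseudo-healthy slices (deep image prior output), random stand-ins
diseased = torch.randn(4, 1, 64, 64)   # corresponding original slices containing nodules, random stand-ins

adv, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
fake = G(healthy)
pred = D(torch.cat([healthy, fake], dim=1))                 # conditional discriminator sees the pair
g_loss = adv(pred, torch.ones_like(pred)) + 100.0 * l1(fake, diseased)  # adversarial + weighted L1
g_loss.backward()
```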

Keywords: medical technology, artificial intelligence, radiology, lung cancer

Procedia PDF Downloads 51
621 Planning Water Reservoirs as Complementary Habitats for Waterbirds

Authors: Tamar Trop, Ido Izhaki

Abstract:

Small natural freshwater bodies (SNFWBs), which are vital for many waterbird species, are considered endangered habitats due to their progressive loss and extensive degradation. While SNFWBs are becoming extinct, studies have indicated that many waterbird species may greatly benefit from various types of small artificial waterbodies (SAWBs), such as floodwater and treated water reservoirs. If designed and managed with care, SAWBs hold significant potential to serve as alternative or complementary habitats for birds, and thus mitigate the adverse effects of SNFWB loss. Currently, most reservoirs are built as infrastructural facilities and designed according to engineering best practices and site-specific considerations, which do not include catering for waterbirds' needs. Furthermore, as things stand, there is still a lack of clear and comprehensive knowledge regarding the additional factors that should be considered in tackling the challenge of attracting waterbirds to reservoirs without compromising the reservoirs' original functions. This study attempts to narrow this knowledge gap by performing a systematic review of the various factors (e.g., bird attributes; physical, structural, spatial, climatic, chemical, and biological characteristics of the waterbody; and anthropogenic activities) affecting the occurrence, abundance, richness, and diversity of waterbirds in SNFWBs. The systematic review provides a concise and relatively unbiased synthesis of the knowledge in the field, which can inform decision-making and practice regarding the planning, design, and management of reservoirs with birds in mind. Such knowledge is especially beneficial for arid and semiarid areas, where natural water sources are deteriorating and becoming extinct even faster due to climate change.

Keywords: artificial waterbodies, reservoirs, small waterbodies, waterbirds

Procedia PDF Downloads 52
620 Integrated Intensity and Spatial Enhancement Technique for Color Images

Authors: Evan W. Krieger, Vijayan K. Asari, Saibabu Arigela

Abstract:

Video imagery captured for real-time security and surveillance applications is typically captured in complex lighting conditions. These less-than-ideal conditions can result in imagery that has underexposed or overexposed regions. It is also typical that the video is too low in resolution for certain applications. The purpose of security and surveillance video is to enable accurate conclusions to be drawn from the images seen in the video. Therefore, if poor lighting and low-resolution conditions occur in the captured video, the ability to make accurate conclusions based on the received information will be reduced. We propose a solution to this problem by using image preprocessing to improve these images before use in a particular application. The proposed algorithm integrates an intensity enhancement algorithm with a super resolution technique. The intensity enhancement portion consists of a nonlinear inverse sign transformation and an adaptive contrast enhancement. The super resolution section is a single-image super resolution technique: a Fourier phase feature based method that uses a machine learning approach with kernel regression. The proposed technique intelligently integrates these algorithms to be able to produce a high-quality output while also being more efficient than the sequential use of these algorithms. This integration is accomplished by performing the proposed algorithm on the intensity image produced from the original color image. After enhancement and super resolution, a color restoration technique is employed to obtain an improved-visibility color image.

Keywords: dynamic range compression, multi-level Fourier features, nonlinear enhancement, super resolution

Procedia PDF Downloads 538