Search results for: hash based key derivation
28409 Generation of Symmetric Key Using Randomness of Hash Function
Authors: Sai Charan Kamana, Harsha Vardhan Nakkina, B.R. Chandavarkar
Abstract:
In a highly secure and robust key generation process, randomness and random numbers play a key role in current real-world cryptosystems. Most present-day cryptographic protocols depend upon Random Number Generators (RNGs) and Pseudo-Random Number Generators (PRNGs). These protocols often use noisy channels such as disk seek time, CPU temperature, mouse pointer movement, or fan noise to obtain true random values. Despite being cost-effective, these noisy channels may need additional hardware devices that communicate with them continuously. Hash functions, on the other hand, are pseudo-random by their design requirements, so they are a good replacement for these noisy channels and have low hardware requirements. This paper discusses some key generation methodologies and their drawbacks, explains how hash functions can be used in key generation, and shows how to combine Key Derivation Functions with hash functions.
Keywords: key derivation, hash based key derivation, password based key derivation, symmetric key derivation
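The abstract does not spell out a concrete construction; as a hedged illustration of deriving a symmetric key from a hash function, the sketch below follows the extract-and-expand pattern of HKDF (RFC 5869) using HMAC-SHA-256. The salt, info label, and key length are illustrative choices, not values from the paper.

```python
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # Concentrate the entropy of the input keying material into a fixed-size PRK.
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int) -> bytes:
    # Stretch the PRK into as many pseudo-random output bytes as requested.
    okm, block = b"", b""
    counter = 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Derive a 32-byte symmetric key from some initial keying material.
prk = hkdf_extract(b"application-salt", b"initial keying material")
key = hkdf_expand(prk, b"aes-key", 32)
```

Distinct info labels yield independent keys from the same extracted secret, which is the usual way one hash-based primitive serves several key derivation needs.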
Procedia PDF Downloads 162
28408 Improve B-Tree Index’s Performance Using Lock-Free Hash Table
Authors: Zhanfeng Ma, Zhiping Xiong, Hu Yin, Zhengwei She, Aditya P. Gurajada, Tianlun Chen, Ying Li
Abstract:
Many RDBMS vendors use a B-tree index to achieve high performance for point queries and range queries, and some also employ a hash index to further enhance performance, as a hash table is more efficient for point queries. However, there are extra overheads to maintaining a separate hash index: hash mappings for all data records must always be maintained, which consumes more memory, and locking, logging, and other mechanisms are needed to guarantee ACID, which affects the concurrency and scalability of the system. To relieve these overheads, the Hash Cached B-tree (HCB) index is proposed in this paper. It consists of a standard disk-based B-tree index and an additional in-memory lock-free hash table. Initially, only the B-tree index is constructed for all data records; the hash table is built on the fly based on the runtime workload, and only data records accessed by point queries are indexed in the hash table, which helps reduce the memory footprint. Changes to the hash table are made using compare-and-swap (CAS) without locking or logging, which improves concurrency and avoids contention. The hash table is also optimized to be cache conscious. The HCB index is implemented in the SAP ASE database; compared with the standard B-tree index, early experiments and customer adoptions show significant performance improvement. This paper provides an overview of the design of the HCB index and reports the experimental results.
Keywords: B-tree, compare-and-swap, lock-free hash table, point queries, range queries, SAP ASE database
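As a rough model of the cache-aside behavior the abstract describes (a B-tree for all records, a hash table built lazily from point-query traffic), here is a minimal sketch. A sorted array stands in for the disk B-tree; the paper's lock-free CAS updates and cache-conscious layout are not reproduced.

```python
import bisect

class HCBIndex:
    def __init__(self, records):
        # records: list of (key, value); the sorted array stands in for the B-tree.
        self.tree = sorted(records)
        self.tree_keys = [k for k, _ in self.tree]
        self.hash_cache = {}  # built lazily from point-query traffic

    def point_query(self, key):
        # Fast path: in-memory hash table.
        if key in self.hash_cache:
            return self.hash_cache[key]
        # Slow path: B-tree lookup, then populate the cache for next time.
        i = bisect.bisect_left(self.tree_keys, key)
        if i < len(self.tree_keys) and self.tree_keys[i] == key:
            value = self.tree[i][1]
            self.hash_cache[key] = value
            return value
        return None

    def range_query(self, lo, hi):
        # Range scans always go to the B-tree; a hash table cannot serve them.
        i = bisect.bisect_left(self.tree_keys, lo)
        j = bisect.bisect_right(self.tree_keys, hi)
        return self.tree[i:j]
```

Only keys that are actually hit by point queries ever occupy cache memory, which is the memory-footprint argument made in the abstract.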
Procedia PDF Downloads 288
28407 New Two-Way Map-Reduce Join Algorithm: Hash Semi Join
Authors: Marwa Hussein Mohamed, Mohamed Helmy Khafagy, Samah Ahmed Senbel
Abstract:
MapReduce is a programming model used to handle and support massive data sets. The rapid increase in data size and big data make analysis of such data the most important issue today. MapReduce is used to analyze data and extract more helpful information using two simple functions, map and reduce, written by the programmer, and it includes load balancing, fault tolerance, and high scalability. The most important operation in data analysis is the join, but MapReduce does not directly support joins. This paper explains two two-way map-reduce join algorithms, semi-join and per-split semi-join, and proposes a new algorithm, hash semi-join, which uses a hash table to increase performance by eliminating unused records as early as possible and applies the join using a hash table rather than using the map function to match the join key against the other data table in the second phase. Using hash tables does not affect memory size because we only save the matched records from the second table. Our experimental results show that the hash semi-join algorithm has higher performance than the two other algorithms when increasing the data size from 10 million records to 500 million, and running time increases according to the number of joined records between the two tables.
Keywords: map reduce, hadoop, semi join, two way join
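The core filtering step of a hash semi-join can be sketched in a few lines. This is a single-process stand-in for the MapReduce setting, and the table layouts are made up for illustration:

```python
def hash_semi_join(left, right, key=lambda r: r[0]):
    # Build a hash set of join keys from the right table once...
    right_keys = {key(r) for r in right}
    # ...then keep only the left records whose key matches, discarding
    # unused records as early as possible.
    return [r for r in left if key(r) in right_keys]

# Hypothetical tables: orders keyed by customer id, and customers.
orders = [(1, "book"), (2, "pen"), (3, "ink")]
customers = [(1, "Ada"), (3, "Lin")]
matched = hash_semi_join(orders, customers)
```

Because only the join keys of the second table are hashed, and only matching records survive, memory stays small, which mirrors the abstract's argument.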
Procedia PDF Downloads 514
28406 A Method of Semantic Image Auto-Annotation
Authors: Lin Huo, Xianwei Liu, Jingxiong Zhou
Abstract:
Recently, due to the semantic gap between image visual features and human concepts, semantic image auto-annotation has become an important topic. First, low-level visual features of the image are extracted and, using a corresponding hash method, mapped into hash codes, eventually transformed into a group of binary strings and stored. Since image auto-annotation by search is a popular method, we use it to design and implement a method of semantic image auto-annotation. Finally, tests based on the Corel image set show that this method is effective.
Keywords: image auto-annotation, color correlograms, hash code, image retrieval
Procedia PDF Downloads 497
28405 Improved Hash Value Based Stream Cipher Using Delayed Feedback with Carry Shift Register
Authors: K. K. Soundra Pandian, Bhupendra Gupta
Abstract:
In the modern era, as application data are massive and complex, they need to be secured from adversary attacks. In this context, a non-recursive key based integrated spritz stream cipher with a circulant hash function using a delayed feedback with carry shift register (d-FCSR) is proposed in this paper. The novelty of this proposed stream cipher algorithm is to generate an improved keystream using the d-FCSR. The proposed algorithm is coded in Verilog HDL to produce a dynamic binary keystream and implemented on the commercially available FPGA device Virtex 5 xc5vlx110t-2ff1136. The implementation of the stream cipher using the d-FCSR on the FPGA device operates at a maximum frequency of 60.62 MHz. It achieves a data throughput of 492 Mbps and improves efficiency (throughput/area) compared to existing techniques. This paper also briefly presents a cryptanalysis of the proposed circulant hash value based spritz stream cipher using d-FCSR against adversary attacks on a hardware platform for hardware-based cryptography applications.
Keywords: cryptography, circulant function, field programmable gate array, hash value, spritz stream cipher
Procedia PDF Downloads 253
28404 Weyl Type Theorem and the Fuglede Property
Authors: M. H. M. Rashid
Abstract:
Given a Hilbert space H and B(H), the algebra of bounded linear operators on H, let δAB denote the generalized derivation defined by A and B. The main objective of this article is to study Weyl type theorems for the generalized derivation for pairs (A, B) satisfying the Fuglede property.
Keywords: Fuglede property, Weyl’s theorem, generalized derivation, Aluthge transform
Procedia PDF Downloads 128
28403 A Security Cloud Storage Scheme Based on Accountable Key-Policy Attribute-Based Encryption without Key Escrow
Authors: Ming Lun Wang, Yan Wang, Ning Ruo Sun
Abstract:
With the development of cloud computing, more and more users utilize cloud storage services. However, several issues exist: 1) the cloud server steals the shared data, 2) sharers collude with the cloud server to steal the shared data, 3) the cloud server tampers with the shared data, 4) sharers and the key generation center (KGC) conspire to steal the shared data. In this paper, we use the advanced encryption standard (AES), hash algorithms, and accountable key-policy attribute-based encryption without key escrow (WOKE-AKP-ABE) to build a secure cloud storage scheme. Moreover, the data are encrypted to protect privacy. We use hash algorithms to prevent the cloud server from tampering with the data uploaded to the cloud. Analysis results show that this scheme can resist collusion attacks.
Keywords: cloud storage security, sharing storage, attributes, hash algorithm
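Of the building blocks named above, the hash-based tamper check is the simplest to sketch. SHA-256 stands in for whichever hash algorithm the scheme actually uses, and the AES and ABE layers are omitted:

```python
import hashlib

def digest(data: bytes) -> str:
    # The uploader stores this digest alongside the (encrypted) data; the
    # cloud server cannot alter the data without changing the digest.
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, expected_digest: str) -> bool:
    # Recompute and compare on download to detect server-side tampering.
    return digest(data) == expected_digest

original = b"encrypted shared data blob"
tag = digest(original)
```

The digest must be stored or transmitted through a channel the cloud server cannot rewrite, otherwise the check is void; the paper's scheme addresses that with its encryption layers.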
Procedia PDF Downloads 390
28402 Inner Derivations of Low-Dimensional Diassociative Algebras
Authors: M. A. Fiidow, Ahmad M. Alenezi
Abstract:
In this work, we study the inner derivations of diassociative algebras in dimensions two and three. An algorithmic approach is adopted for the computation of inner derivations, using some results on derivations of finite dimensional diassociative algebras. Some basic properties of inner derivations of finite dimensional diassociative algebras are also provided.
Keywords: diassociative algebras, inner derivations, derivations, left and right operator
Procedia PDF Downloads 270
28401 On Lie-Central Derivations and Almost Inner Lie-Derivations of Leibniz Algebras
Authors: Natalia Pacheco Rego
Abstract:
The Liezation functor is a map from the category of Leibniz algebras to the category of Lie algebras, which assigns to a Leibniz algebra the Lie algebra given by the quotient of the Leibniz algebra by the ideal spanned by the square elements of the Leibniz algebra. This functor is left adjoint to the inclusion functor that considers a Lie algebra as a Leibniz algebra. This environment fits in the framework of central extensions and commutators in semi-abelian categories with respect to a Birkhoff subcategory, where classical or absolute notions are relative to the abelianization functor. Classical properties of Leibniz algebras (properties relative to the abelianization functor) were adapted to the relative setting (with respect to the Liezation functor); in general, absolute properties have corresponding relative ones, but not all absolute properties immediately hold in the relative case, so new requirements are needed. Following this line of research, an analysis of central derivations of Leibniz algebras relative to the Liezation functor, called Lie-derivations, was conducted, and a characterization of Lie-stem Leibniz algebras by their Lie-central derivations was obtained. In this paper, we present an overview of these results, and we analyze some new properties concerning Lie-central derivations and almost inner Lie-derivations. Namely, a Leibniz algebra is a vector space equipped with a bilinear bracket operation satisfying the Leibniz identity. We define the Lie-bracket by [x, y]Lie = [x, y] + [y, x], for all x, y. The Lie-center of a Leibniz algebra is the two-sided ideal of elements that annihilate all the elements in the Leibniz algebra through the Lie-bracket. A Lie-derivation is a linear map which acts as a derivative with respect to the Lie-bracket. Obviously, usual derivations are Lie-derivations, but the converse is not true in general. A Lie-derivation is called a Lie-central derivation if its image is contained in the Lie-center. A Lie-derivation is called an almost inner Lie-derivation if the image of an element x is contained in the Lie-commutator of x and the Leibniz algebra. The main results we present refer to the conditions under which Lie-central derivations and almost inner Lie-derivations coincide.
Keywords: almost inner Lie-derivation, Lie-center, Lie-central derivation, Lie-derivation
Procedia PDF Downloads 137
28400 Derivation of BCK\BCI-Algebras
Authors: Tumadhir Fahim M Alsulami
Abstract:
This paper builds on connecting two important notions: fuzzy ideals of BCK-algebras and derivations of BCI-algebras. The result is a new concept called derivation fuzzy ideals of BCI-algebras, followed by various results and important theorems on different types of ideals. In Chapter 1, we present the basic and fundamental concepts of BCK/BCI-algebras as follows: BCK/BCI-algebras, BCK sub-algebras, bounded BCK-algebras, positive implicative BCK-algebras, commutative BCK-algebras, and implicative BCK-algebras. Moreover, we discuss ideals of BCK-algebras, positive implicative ideals, implicative ideals, and commutative ideals. In the last section of Chapter 1, we propose the notion of derivations of BCI-algebras, regular derivations of BCI-algebras, and basic definitions and properties. Chapter 2 includes three sections. Section 1 contains elementary concepts of fuzzy sets and fuzzy set operations. Section 2 presents O. G. Xi's idea of applying the fuzzy set concept to BCK-algebras, and we study fuzzy sub-algebras as well. Section 3 contains basic definitions of fuzzy ideals of BCK-algebras, closed fuzzy ideals, fuzzy commutative ideals, fuzzy positive implicative ideals, fuzzy implicative ideals, fuzzy H-ideals, and fuzzy p-ideals. Moreover, we investigate these concepts in diverse theorems and propositions. In Chapter 3, the main concept of our thesis, derivation fuzzy ideals of BCI-algebras, is introduced. Chapter 3 splits into four sections. We start with general definitions and important theorems on derivation fuzzy ideal theory in Section 1. Sections 2 and 3 contain derivation fuzzy p-ideals and derivation fuzzy H-ideals of BCI-algebras; several important theorems and propositions are introduced. The last section studies derivation fuzzy implicative ideals of BCI-algebras and includes new theorems and results. Furthermore, we present a new theorem that relates derivation fuzzy implicative ideals, derivation fuzzy positive implicative ideals, and derivation fuzzy commutative ideals. These new results, obtained and introduced in Chapter 3, were submitted in two separate articles and accepted for publication.
Keywords: BCK, BCI, algebras, derivation
Procedia PDF Downloads 124
28399 Empowering Certificate Management with Blockchain Technology
Authors: Yash Ambekar, Kapil Vhatkar, Prathamesh Swami, Kartikey Singh, Yashovardhan Kaware
Abstract:
The rise of online courses and certifications has created new opportunities for individuals to enhance their skills. However, this digital transformation has also given rise to counterfeit certificates. To address this multifaceted issue, we present a comprehensive certificate management system founded on blockchain technology and strengthened by smart contracts. Our system comprises three pivotal components: certificate generation, authenticity verification, and a user-centric digital locker for certificate storage. Blockchain technology underpins the entire system, ensuring the immutability and integrity of each certificate. The inclusion of a cryptographic hash for each certificate is a fundamental aspect of our design. Any alteration in the certificate’s data will yield a distinct hash, a powerful indicator of potential tampering. Furthermore, our system includes a secure digital locker based on cloud storage that empowers users to efficiently manage and access all their certificates in one place. Moreover, our project is committed to providing features for certificate revocation and updating, thereby enhancing the system’s flexibility and security. Hence, the blockchain and smart contract-based certificate management system offers a robust and one-stop solution to the escalating problem of counterfeit certificates in the digital era.
Keywords: blockchain technology, smart contracts, counterfeit certificates, authenticity verification, cryptographic hash, digital locker
Procedia PDF Downloads 47
28398 A Voice Signal Encryption Scheme Based on Chaotic Theory
Authors: Hailang Yang
Abstract:
To ensure the confidentiality and integrity of speech signals in communication transmission, this paper proposes a voice signal encryption scheme based on chaotic theory. Firstly, the scheme utilizes chaotic mapping to generate a key stream and then employs the key stream to perform bitwise exclusive OR (XOR) operations for encrypting the speech signal. Additionally, the scheme utilizes a chaotic hash function to generate a Message Authentication Code (MAC), which is appended to the encrypted data to verify the integrity of the data. Subsequently, we analyze the security performance and encryption efficiency of the scheme, comparing and optimizing it against existing solutions. Finally, experimental results demonstrate that the proposed scheme can resist common attacks, achieving high-quality encryption and speed.
Keywords: chaotic theory, XOR encryption, chaotic hash function, Message Authentication Code (MAC)
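A minimal sketch of the two steps described above: a chaotic keystream XORed with the signal, plus a MAC appended to the ciphertext. The logistic map serves as the chaotic source, SHA-256 stands in for the paper's chaotic hash, and all parameters are illustrative:

```python
import hashlib

def logistic_keystream(x0: float, n: int, r: float = 3.99) -> bytes:
    # Iterate the logistic map x -> r*x*(1-x) in its chaotic regime and
    # quantize each state to one keystream byte. x0 acts as the secret key.
    ks, x = bytearray(), x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        ks.append(int(x * 256) % 256)
    return bytes(ks)

def encrypt(plaintext: bytes, x0: float) -> bytes:
    ks = logistic_keystream(x0, len(plaintext))
    cipher = bytes(p ^ k for p, k in zip(plaintext, ks))
    # Append a keyed hash MAC so the receiver can detect modification.
    mac = hashlib.sha256(cipher + ks).digest()
    return cipher + mac

def decrypt(blob: bytes, x0: float) -> bytes:
    cipher, mac = blob[:-32], blob[-32:]
    ks = logistic_keystream(x0, len(cipher))
    if hashlib.sha256(cipher + ks).digest() != mac:
        raise ValueError("MAC check failed: data was modified")
    return bytes(c ^ k for c, k in zip(cipher, ks))
```

XOR is its own inverse, so encryption and decryption share the keystream; verifying the MAC before decrypting is what gives the integrity guarantee the abstract claims.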
Procedia PDF Downloads 52
28397 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools
Abstract:
Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment. Therefore, digital forensics, which deals with various crimes committed in digital environments, has become an important research topic. It is in the research scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data in the digital evidence that match specified criteria and presenting them to the investigator (e.g., text files, files starting with the letter A, etc.). Then, digital forensics experts carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. Moreover, since the process depends on the examiner’s experience, overall results may vary across cases, with some evidence overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed. It aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
Keywords: block matching, digital evidence, hash list, evaluation of digital evidence
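The matching idea, hashing fixed-size blocks of an evidence image and comparing them against a list of known hashes, can be sketched as follows. The block size and hash choice are illustrative, not the paper's:

```python
import hashlib

def block_hashes(data: bytes, block_size: int = 4096):
    # Hash each fixed-size block of a disk image so blocks can be matched
    # against a list of hashes of known files of interest.
    return [hashlib.sha256(data[i:i + block_size]).hexdigest()
            for i in range(0, len(data), block_size)]

def match_blocks(image: bytes, known_hashes: set, block_size: int = 4096):
    # Return byte offsets of blocks whose hash appears in the known-hash list.
    return [i * block_size
            for i, h in enumerate(block_hashes(image, block_size))
            if h in known_hashes]
```

Because each block is summarized once, a large image can be screened against many hash lists without re-reading the raw data, which is where the time savings come from.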
Procedia PDF Downloads 255
28396 Comparison of Authentication Methods in Internet of Things Technology
Authors: Hafizah Che Hasan, Fateen Nazwa Yusof, Maslina Daud
Abstract:
The Internet of Things (IoT) is a powerful industry system in which end-devices are interconnected and automated, allowing the devices to analyze data and execute actions based on the analysis. IoT technology leverages Radio-Frequency Identification (RFID) and Wireless Sensor Networks (WSN), including mobile devices and sensors. These technologies contribute to the evolution of IoT. However, because more devices are connected to each other over the Internet and data from various sources are exchanged between things, confidentiality of the data becomes a major concern. This paper focuses on one of the major challenges in IoT, authentication, which must be in place to preserve data integrity and confidentiality. A few solutions are reviewed based on papers from the last few years. One of the proposed solutions secures the communication between IoT devices and cloud servers with an Elliptic Curve Cryptography (ECC) based mutual authentication protocol. This solution uses Hyper Text Transfer Protocol (HTTP) cookies as a security parameter. The next proposed solution uses a keyed-hash scheme protocol to enable IoT devices to authenticate each other without the presence of a central control server. Another proposed solution uses a Physical Unclonable Function (PUF) based mutual authentication protocol. It emphasizes tamper-resistant and resource-efficient technology, which amounts to a 3-way handshake security protocol.
Keywords: Internet of Things (IoT), authentication, PUF, ECC, keyed-hash scheme protocol
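The keyed-hash scheme is not specified in detail in the abstract; a generic HMAC challenge-response between two devices sharing a pre-provisioned key, offered here only as a hedged sketch of the idea, looks like this:

```python
import hashlib
import hmac
import os

# Assumption: both devices were provisioned with this shared key beforehand.
SHARED_KEY = b"pre-provisioned device key"

def make_challenge() -> bytes:
    # A fresh random nonce prevents replay of old responses.
    return os.urandom(16)

def respond(key: bytes, challenge: bytes) -> bytes:
    # Prove knowledge of the key without revealing it on the wire.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(respond(key, challenge), response)
```

Running the exchange in both directions gives mutual authentication with no central server, matching the property claimed for the keyed-hash solution.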
Procedia PDF Downloads 265
28395 Reliability Qualification Test Plan Derivation Method for Weibull Distributed Products
Authors: Ping Jiang, Yunyan Xing, Dian Zhang, Bo Guo
Abstract:
The reliability qualification test (RQT) is widely used in product development to qualify whether a product meets predetermined reliability requirements, which are mainly described in terms of reliability indices, for example, MTBF (Mean Time Between Failures). In engineering practice, RQT plans are typically taken from standards, such as MIL-STD-781 or GJB899A-2009. But these conventional RQT plans are not preferred, as they often require long test times or carry high risks for both producer and consumer, because the methods in the standards only use the test data of the product itself. The standards also usually assume that the product is exponentially distributed, which is not suitable for complex products other than electronics. So it is desirable to develop an RQT plan derivation method that safely shortens test time while keeping the two risks under control. To meet this end, an RQT plan derivation method is developed for products whose lifetime follows a Weibull distribution. The merit of the method is that expert judgment is taken into account. This is implemented by applying the Bayesian method, which translates the expert judgment into prior information on product reliability. The producer’s risk and the consumer’s risk are then calculated accordingly. The procedures to derive RQT plans are also proposed in this paper. As extra information and expert judgment are added to the derivation, the derived test plans have the potential to shorten the required test time and have satisfactorily low risks for both producer and consumer, compared with conventional test plans. A case study is provided to prove that when using expert judgment in deriving product test plans, the proposed method is capable of finding ideal test plans that not only reduce the two risks but also shorten the required test time.
Keywords: expert judgment, reliability qualification test, test plan derivation, producer’s risk, consumer’s risk
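The paper's Bayesian treatment of expert judgment is not reproduced here, but the two risks themselves can be sketched for a simple fixed-duration, attribute-style plan: n units run for time t, accept if at most c fail, with the per-unit failure probability taken from an assumed Weibull life distribution. All parameter values below are illustrative:

```python
import math

def weibull_fail_prob(t: float, eta: float, beta: float) -> float:
    # Probability that one unit fails by time t under Weibull(scale eta, shape beta).
    return 1.0 - math.exp(-((t / eta) ** beta))

def acceptance_prob(n: int, c: int, p_fail: float) -> float:
    # Accept the lot if at most c of the n units on test fail (binomial tail).
    return sum(math.comb(n, k) * p_fail**k * (1 - p_fail)**(n - k)
               for k in range(c + 1))

def risks(n, c, t, eta_good, eta_bad, beta):
    # Producer's risk: probability of rejecting a genuinely good product.
    producer = 1.0 - acceptance_prob(n, c, weibull_fail_prob(t, eta_good, beta))
    # Consumer's risk: probability of accepting a genuinely bad product.
    consumer = acceptance_prob(n, c, weibull_fail_prob(t, eta_bad, beta))
    return producer, consumer
```

Searching over (n, c, t) for the cheapest plan with both risks below target levels is the test-plan derivation step; the paper's prior information would shift the failure probabilities used here.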
Procedia PDF Downloads 141
28394 Expert System: Debugging Using MD5 Process Firewall
Authors: C. U. Om Kumar, S. Kishore, A. Geetha
Abstract:
An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One important user expectation of the operating system is the practice of defending information from unauthorized access, disclosure, modification, inspection, recording, or destruction. An operating system is always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware, and more. Anti-virus software was created to ensure security against prominent computer viruses by applying a dictionary-based approach, but anti-virus programs are not guaranteed to provide security against the new viruses proliferating every day. To address this issue and to secure the computer system, our proposed expert system concentrates on authorizing processes, as wanted or unwanted by the administrator, for execution. The expert system maintains a database of hash codes of the processes that are to be allowed. These hash codes are generated using the MD5 message-digest algorithm, a widely used cryptographic hash function. The administrator approves the wanted processes that are to be executed on clients in a Local Area Network by implementing a client-server architecture, and only the processes that match the entries in the database table will be executed, by which many malicious processes are restricted from infecting the operating system. An add-on advantage of this proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus data and information security is ensured by our system, along with increased performance of the operating system.
Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage, resource utilization
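The core allow-list check described above can be sketched as follows; reading executable bytes from disk, the client-server exchange, and the LAN details are omitted:

```python
import hashlib

# Populated by the administrator with the MD5 digests of approved programs.
ALLOWED_HASHES = set()

def md5_of(program_bytes: bytes) -> str:
    return hashlib.md5(program_bytes).hexdigest()

def approve(program_bytes: bytes) -> None:
    # Administrator-side: whitelist a program by its MD5 digest.
    ALLOWED_HASHES.add(md5_of(program_bytes))

def may_execute(program_bytes: bytes) -> bool:
    # Client-side: only processes whose digest matches an approved hash run.
    return md5_of(program_bytes) in ALLOWED_HASHES
```

An allow-list inverts the anti-virus dictionary approach: instead of enumerating known-bad signatures, anything not explicitly approved is blocked, so never-seen malware is refused by default. (MD5 is used here because the paper names it; for new designs a collision-resistant hash such as SHA-256 would normally be preferred.)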
Procedia PDF Downloads 428
28393 Derivation of Neutrino Mass Parameters from the Study of Neutrinoless Double Beta Decay
Authors: Sabin Stoica
Abstract:
In this paper, the theoretical challenges in the study of neutrinoless double beta decay are reviewed. Then, new upper limits on the neutrino mass parameters are derived for three isotopes, 48Ca, 76Ge, and 82Se, assuming two possible mechanisms of occurrence of this nuclear process, namely the exchange of i) light left-handed neutrinos and ii) heavy right-handed neutrinos between two nucleons inside the nucleus. The derivation is based on accurate calculations of the phase space factors and nuclear matrix elements performed with new high-performance computer codes, which are described in more detail in recent publications. These results are useful both for a better understanding of the scale of the neutrino absolute mass and for the planning of future double beta decay experiments.
Keywords: double beta decay, neutrino properties, nuclear matrix elements, phase space factors
Procedia PDF Downloads 601
28392 Derivation of Technology Element for Automation in Table Formwork in a Tall Building Construction
Authors: Junehyuck Lee, Dongmin Lee, Hunhee Cho, Kyung-In Kang
Abstract:
The table formwork method has recently been widely applied to reinforced concrete structures in tall building construction to improve safety and productivity. However, this method still depends mainly on manpower. Therefore, this study aimed at the derivation of technology elements for applying automation to table formwork in tall building construction. These results will contribute to improved productivity and labor savings in table formwork in tall building construction.
Keywords: table form, tall building, automation, productivity
Procedia PDF Downloads 401
28391 Message Authentication Scheme for Vehicular Ad-Hoc Networks under Sparse RSUs Environment
Authors: Wen Shyong Hsieh, Chih Hsueh Lin
Abstract:
In this paper, we combine the concepts of the chameleon hash function (CHF) and identity-based cryptography (IBC) to build a message authentication environment for VANETs under sparse RSUs. Based on the CHF, the TA keeps two common secrets that are embedded in all identities as evidence of mutual trust. The TA issues one original identity to every RSU and vehicle. An identity contains one public ID and one private key. The public ID includes three components, a pseudonym, a random key, and a public key; it represents one entity and can be verified as legal. The private key is used to claim ownership of the public ID. Based on the concept of IBC, without any negotiation process, a CHF pairing key, computed from one party's private key and the other's public key, is used for mutual trust and serves as the session key for secure communication between RSUs and vehicles. To help vehicles authenticate messages, the RSUs respond to vehicles' temporary identity requests using two short-time secrets that are broadcast by the TA. To lighten the load of request information, one day is divided into M time slots. At every time slot, the TA broadcasts two short-time secrets to all valid RSUs for that time slot. Any RSU can respond to a temporary identity request from a legal vehicle. With the collected announcements of public IDs from neighboring vehicles, a vehicle can set up its neighboring set, which includes information about each neighbor vehicle's temporary public ID and temporary CHF pairing key. The pairing key can be derived from the vehicle's own private key and the neighbor's public key and is used for message authentication or secure communication without the help of an RSU.
Keywords: Internet of Vehicles (IOV), Vehicular Ad-hoc Networks (VANETs), Chameleon Hash Function (CHF), message authentication
Procedia PDF Downloads 392
28390 The Comparison between bFGF and Small Molecules in Derivation of Chicken Primordial Germ Cells and Embryonic Germ Cells
Authors: Maryam Farzaneh, Seyyedeh Nafiseh Hassani, Hossein Baharvand
Abstract:
Objective: Chicken gonadal tissue contains two cell populations: primordial germ cells (PGCs) and stromal (somatic) cells. PGCs and embryonic germ cells (EGCs), a pluripotent type of PGC obtained in long-term culture, are suitable sources for the production of chicken pluripotent stem cell lines, transgenic birds, and vaccine and recombinant protein production. In general, the effects of growth factors such as bFGF and mouse LIF on the derivation of PGCs in vitro are important, and in this study we observed the unique effect of small molecules such as PD032 and SB43, as chemicals, in comparison to growth factors. Materials and Methods: Fertilized chicken eggs were incubated for up to 6 days, primary gonadal tissues were isolated, and mixed cells (PGCs and stromal cells) were cultured. PGCs proliferate in the presence of fetal calf serum (FCS) and small molecules, or in another group bFGF; these factors are important for PGC culture and derivation. Somatic cells produce a multilayer feeder under the PGCs in primary culture, and PGCs form small clusters on these cells. Results: In the presence of small molecules and a high concentration of FCS (15%), EGCs as pluripotent stem cells were clearly present after four weeks, with positive immunostaining and periodic acid-Schiff (PAS) staining. In the presence of growth factors like bFGF without any chemicals, PGCs were clearly present but disappeared after 7 to 10 days. Conclusion: To date, much research has addressed the derivation and maintenance of chicken PGCs, in the hope of understanding the mechanisms that occur during germline development and of producing therapeutic products with transgenic birds. There are still many unknowns in this area, and this project will try to establish efficient conditions for identifying a suitable culture medium for long-term culture of PGCs in vitro without serum and feeder cells.
Keywords: chicken gonadal primordial germ cells, pluripotent stem cells, growth factors, small molecules, transgenic birds
Procedia PDF Downloads 437
28389 Effects of Bacterial Inoculants and Enzyme Inoculation on the Fermentation and Aerobic Stability of Potato Hash Silage
Authors: B. D. Nkosi, T. F. Mutavhatsindi, J. J. Baloyi, R. Meeske, T. M. Langa, I. M. M. Malebana, M. D. Motiang
Abstract:
Potato hash (PH), a by-product of the food production industry, contains 188.4 g dry matter (DM)/kg and 3.4 g water soluble carbohydrate (WSC)/kg DM, and was mixed with wheat bran (70:30, as-is basis) to provide 352 g DM/kg and 315 g WSC/kg DM. The materials were ensiled with or without silage additives in 1.5 L anaerobic jars (3 jars/treatment) that were kept at 25-28 °C for 3 months. Four types of silage were produced: control (no additive, denoted T1), celluclast enzyme (denoted T2), emsilage bacterial inoculant (denoted T3), and silosolve bacterial inoculant (denoted T4). Three jars per treatment were opened after 3 months of ensiling for the determination of nutritive values, fermentation characteristics, and aerobic stability. Aerobic stability was determined by exposing silage samples to air for 5 days. The addition of the enzyme (T2) reduced (P<0.05) silage pH and fiber fractions (NDF and ADF) while increasing (P<0.05) residual WSC and lactic acid (LA) production, compared to other treatments. The silage produced had a pH of <4.0, an indication of well-preserved silage. Bacterial inoculation (T3 and T4) improved (P<0.05) the aerobic stability of the silage, as indicated by an increased number of hours and lower CO2 production, compared to other treatments. However, the aerobic stability of the silage worsened (P<0.05) with the addition of the enzyme (T2). Further work to elucidate these effects on nutrient digestion and growth performance in ruminants fed the silage is needed.
Keywords: by-products, digestibility, feeds, inoculation, ruminants, silage
Procedia PDF Downloads 440
28388 Derivation of Trigonometric Identities and Solutions through Baudhayan Numbers
Authors: Rakesh Bhatia
Abstract:
Students often face significant challenges in understanding and applying trigonometric identities, primarily due to the overwhelming need to memorize numerous formulas. This often leads to confusion, frustration, and difficulty in effectively using these formulas across diverse types of problems. Traditional methods of learning trigonometry demand considerable time and effort, which can further hinder comprehension and application. Vedic Mathematics offers an innovative and simplified approach to overcoming these challenges. This paper explores how Baudhayan Numbers can be used to derive trigonometric identities and simplify calculations related to height and distance. Unlike conventional approaches, this method minimizes the need for extensive paper-based calculations, promoting a conceptual understanding. Using Vedic Mathematics Sutras such as Anurupyena and Vilokanam, this approach enables the derivation of over 100 trigonometric identities through a single, unified framework. The simplicity and efficiency of this technique not only make learning trigonometry more accessible but also foster computational thinking. Beyond academics, the practical applications of this method extend to engineering fields such as bridge design and construction, where precise trigonometric calculations are critical. This exploration underscores the potential of Vedic Mathematics to revolutionize the learning and application of trigonometry by offering a streamlined, intuitive, and versatile framework.
Keywords: baudhayan numbers, anurupyena, vilokanam, sutras
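As a hedged illustration (assuming "Baudhayan numbers" refers to the Pythagorean triples described in the Baudhayana Sulba Sutra, which the abstract does not state explicitly), any triple (a, b, c) with a^2 + b^2 = c^2 yields exact ratio values for the trigonometric functions of the angle opposite side a, from which identities can be verified without memorized tables:

```python
import math

def trig_from_triple(a, b, c):
    """Given a Pythagorean (Baudhayan) triple a^2 + b^2 = c^2, return
    (sin, cos, tan) of the angle opposite side a, as exact ratios."""
    assert a * a + b * b == c * c, "not a Pythagorean triple"
    return a / c, b / c, a / b

# The classic Baudhayan triple (3, 4, 5):
s, co, t = trig_from_triple(3, 4, 5)

# The fundamental identity sin^2 + cos^2 = 1 holds exactly:
assert math.isclose(s * s + co * co, 1.0)

# tan = sin / cos is recovered with no table lookup:
assert math.isclose(t, s / co)
```

The same three-line check extends to any other triple (5, 12, 13), (8, 15, 17), and so on, which is the sense in which one construction covers many identities at once.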
Procedia PDF Downloads 13
28387 Derivation of a Risk-Based Level of Service Index for Surface Street Network Using Reliability Analysis
Authors: Chang-Jen Lan
Abstract:
The current Level of Service (LOS) index adopted in the Highway Capacity Manual (HCM) for signalized intersections on surface streets is based on the intersection average delay. The delay thresholds for defining LOS grades are subjective and unrelated to critical traffic conditions. For example, an intersection delay of 80 sec per vehicle for the failing LOS grade F does not necessarily correspond to the intersection capacity. Also, a specific measure of average delay may result from delay minimization, delay equality, or other meaningful optimization criteria. To that end, a reliability version of the intersection critical degree of saturation (v/c) is introduced as the LOS index. Traditionally, the degree of saturation at a signalized intersection is defined as the ratio of the critical volume sum (per lane) to the average saturation flow (per lane) during all available effective green time within a cycle. The critical sum is the sum of the maximal conflicting movement-pair volumes in the northbound/southbound and eastbound/westbound rights of way. In this study, both movement volume and saturation flow are assumed to follow log-normal distributions. Because the product of independent, positive random variables tends toward a log-normal distribution in the limit when the conditions of the central limit theorem hold, the critical degree of saturation is expected to be log-normally distributed as well. Derivation of the risk index predictive limits is complex due to the maximum and absolute value operators, as well as the ratio of random variables. A fairly accurate functional form for the predictive limit at a user-specified significance level is obtained. The predictive limit is then compared with the designated LOS thresholds for the intersection critical degree of saturation (denoted as X).
Keywords: reliability analysis, level of service, intersection critical degree of saturation, risk based index
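The distributional claim above can be checked numerically. The sketch below is a Monte Carlo illustration only, not the paper's closed-form derivation, and every parameter value (log-normal means, spreads, the 95% level) is hypothetical: it draws log-normal movement volumes, takes the maximum of each conflicting pair, forms the critical degree of saturation as the critical volume sum over a log-normal saturation flow, and reads off a predictive limit from a log-normal fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical conflicting movement-pair volumes (veh/h/lane), log-normal.
# The critical movement in each right of way is the pairwise maximum.
ns_pair = np.maximum(rng.lognormal(6.0, 0.15, n), rng.lognormal(5.9, 0.15, n))
ew_pair = np.maximum(rng.lognormal(5.8, 0.15, n), rng.lognormal(5.7, 0.15, n))
critical_sum = ns_pair + ew_pair          # critical volume sum per cycle

# Hypothetical average saturation flow (veh/h/lane), also log-normal.
sat_flow = rng.lognormal(7.6, 0.05, n)

x = critical_sum / sat_flow               # critical degree of saturation

# Fit a log-normal to the simulated index and take a 95% predictive limit.
log_x = np.log(x)
limit_95 = np.exp(log_x.mean() + 1.645 * log_x.std())
print(f"mean v/c = {x.mean():.3f}, 95% predictive limit = {limit_95:.3f}")
```

The maximum and ratio operators mean x is not exactly log-normal, which is precisely the complication the abstract notes; the simulation shows how closely the log-normal approximation tracks it for a given parameter set.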
Procedia PDF Downloads 131
28386 A Methodology for the Synthesis of Multi-Processors
Authors: Hamid Yasinian
Abstract:
Random epistemologies and hash tables have garnered minimal interest from both security experts and experts in the last several years. In fact, few information theorists would disagree with the evaluation of expert systems. In our research, we discover how flip-flop gates can be applied to the study of superpages. Though such a hypothesis at first glance seems perverse, it is derived from known results.
Keywords: synthesis, multi-processors, interactive model, Moore's law
Procedia PDF Downloads 437
28385 Gas Flow, Time, Distance Dynamic Modelling
Authors: A. Abdul-Ameer
Abstract:
The equations governing the distance, pressure and volume-flow relationships for the pipeline transportation of gaseous mixtures are considered. A derivation based on differential calculus, for an element of this system model, is addressed. Solutions yielding the input-output response following pressure changes are reviewed. The technical problems associated with these analytical results are identified. Procedures that resolve these difficulties, thereby providing an attractive and simple analysis route, are outlined. Computed responses, thereby validating the calculated predictions, are presented.
Keywords: pressure, distance, flow, dissipation, models
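As a rough sketch only, a single pipeline element can be idealized as a lumped-parameter store: linearized flow Q = Δp/R through a resistance R, and storage C·dp/dt balancing inflow against outflow. This is a common textbook idealization with hypothetical coefficient values, not the authors' distributed model; it shows the kind of input-output pressure response the abstract refers to:

```python
# Lumped isothermal pipeline element: linearized flow Q = (p_in - p) / R,
# storage C * dp/dt = Q_in - Q_out. R and C values are hypothetical.
R = 2.0e5      # flow resistance, Pa per (m^3/s)
C = 5.0e-6     # storage capacitance, m^3/Pa
p_out = 1.0e5  # fixed outlet pressure, Pa

p = 1.0e5      # element pressure, initially at outlet level, Pa
p_in = 1.5e5   # step change applied at the inlet, Pa

dt, t_end = 0.001, 5.0
t = 0.0
while t < t_end:
    q_in = (p_in - p) / R        # inflow driven by inlet pressure difference
    q_out = (p - p_out) / R      # outflow toward the fixed outlet
    p += dt * (q_in - q_out) / C # explicit Euler update of stored pressure
    t += dt

# With equal resistances, the element settles midway between the
# inlet and outlet pressures (here 1.25e5 Pa), with time constant RC/2.
print(f"steady-state pressure ~= {p:.0f} Pa")
```

The first-order lag seen here is the simplest member of the family of input-output responses the full distributed derivation refines.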
Procedia PDF Downloads 475
28384 Establishment of a Nomogram Prediction Model for Postpartum Hemorrhage during Vaginal Delivery
Authors: Yinglisong, Jingge Chen, Jingxuan Chen, Yan Wang, Hui Huang, Jing Zhang, Qianqian Zhang, Zhenzhen Zhang, Ji Zhang
Abstract:
Purpose: The study aims to establish a nomogram prediction model for postpartum hemorrhage (PPH) in vaginal delivery. Patients and Methods: Clinical data were retrospectively collected from vaginal delivery patients admitted to a hospital in Zhengzhou, China, between June 1, 2022 and October 31, 2022. Univariate and multivariate logistic regression were used to identify independent risk factors. A nomogram model for PPH in vaginal delivery was established based on the risk factor coefficients. Bootstrapping was used for internal validation. To assess discrimination and calibration, receiver operating characteristic (ROC) and calibration curves were generated for the derivation and validation groups. Results: A total of 1340 cases of vaginal delivery were enrolled, of which 81 (6.04%) had PPH. Logistic regression indicated that history of uterine surgery, induction of labor, duration of first labor, neonatal weight, WBC value (during the first stage of labor), and cervical lacerations were all independent risk factors for hemorrhage (P < 0.05). The areas under the ROC curves (AUC) of the derivation and validation groups were 0.817 and 0.821, respectively, indicating good discrimination. Two calibration curves showed that the nomogram predictions and observed results were highly consistent (P = 0.105, P = 0.113). Conclusion: The developed individualized risk prediction nomogram model can assist midwives in recognizing and diagnosing high-risk groups for PPH and initiating early warning to reduce PPH incidence.
Keywords: vaginal delivery, postpartum hemorrhage, risk factor, nomogram
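The modelling pipeline described above (logistic regression on risk factors, then ROC discrimination) can be sketched on synthetic data. This is an illustrative mock-up: the predictors, coefficients, and prevalence are fabricated and merely loosely mirror the study's setup (a rare binary outcome with a few continuous risk factors), not its clinical data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

# Fabricated standardized risk factors (think: neonatal weight,
# labor duration, WBC) and a rare outcome like PPH.
X = rng.normal(size=(n, 3))
true_beta = np.array([0.8, 0.5, 0.3])
logits = X @ true_beta - 2.5   # low intercept -> rare outcome
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit logistic regression by plain gradient descent on the log-loss.
Xb = np.hstack([np.ones((n, 1)), X])   # prepend intercept column
beta = np.zeros(4)
for _ in range(5000):
    p = 1 / (1 + np.exp(-(Xb @ beta)))
    beta -= 0.1 * Xb.T @ (p - y) / n

# AUC via its rank interpretation: the probability that a random
# positive case scores higher than a random negative case.
scores = Xb @ beta
pos, neg = scores[y == 1], scores[y == 0]
auc = (pos[:, None] > neg[None, :]).mean()
print(f"AUC = {auc:.3f}")
```

A nomogram is then just a graphical rendering of the fitted coefficients (each predictor's contribution drawn to a common points scale), which is why the abstract builds it directly from the regression coefficients.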
Procedia PDF Downloads 79
28383 Modification of Newton Method in Two Points Block Differentiation Formula
Authors: Khairil Iskandar Othman, Nadhirah Kamal, Zarina Bibi Ibrahim
Abstract:
Block methods for solving stiff systems of ordinary differential equations (ODEs) are based on backward differentiation formulas (BDF) with a PE(CE)^2 mode and the Newton method. In this paper, we introduce a modified Newton iteration as a new strategy to obtain more efficient results. The derivation of the BBDF using the modified block Newton method is presented. This new block method with a predictor-corrector scheme gives more accurate results when compared to the existing BBDF.
Keywords: modified Newton, stiff, BBDF, Jacobian matrix
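To ground the idea, here is a minimal sketch of the Newton iteration used to solve the implicit stage equation of a stiff ODE. It uses one-step backward Euler (BDF1) on a scalar problem rather than the paper's two-point block BDF, so the Jacobian is a single number; the test equation and step size are illustrative choices:

```python
import math

# Stiff test problem: y' = -1000 * (y - cos(t)).
def f(t, y):
    return -1000.0 * (y - math.cos(t))

def backward_euler_step(y_prev, t_next, h, tol=1e-12):
    """One backward Euler step: solve F(y) = y - y_prev - h*f(t,y) = 0
    by Newton's method with the analytic Jacobian df/dy = -1000."""
    y = y_prev                        # predictor: previous value
    for _ in range(20):               # corrector: Newton iterations
        F = y - y_prev - h * f(t_next, y)
        dF = 1.0 - h * (-1000.0)      # F'(y) = 1 - h * (df/dy)
        delta = F / dF
        y -= delta
        if abs(delta) < tol:
            break
    return y

h, y, t = 0.05, 1.0, 0.0   # h is far beyond any explicit stability limit
for _ in range(40):
    t += h
    y = backward_euler_step(y, t, h)

print(f"y({t:.1f}) = {y:.4f}, cos({t:.1f}) = {math.cos(t):.4f}")
```

Because the implicit formula is solved exactly at each step, the solution rides the slow manifold (y ≈ cos t) despite the stiffness; a block BDF applies the same Newton machinery to two points per step, with a matrix Jacobian in the system case.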
Procedia PDF Downloads 380
28382 Integral Image-Based Differential Filters
Authors: Kohei Inoue, Kenji Hara, Kiichi Urahama
Abstract:
We describe a relationship between integral images and differential images. First, we derive a simple difference filter from the conventional integral image. In the derivation, we show that an integral image and the corresponding differential image are related to each other by simultaneous linear equations, where the numbers of unknowns and equations are the same; therefore, we can execute the integration and differentiation by solving the simultaneous equations. We applied this relationship to an image fusion problem and experimentally verified the effectiveness of the proposed method.
Keywords: integral images, differential images, differential filters, image fusion
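The integration/differentiation pairing the authors describe can be illustrated directly with standard summed-area-table arithmetic (a minimal sketch, not the paper's full simultaneous-equation formulation): differencing the integral image exactly recovers the original pixels, which is the invertible linear relationship in question.

```python
import numpy as np

img = np.arange(12.0).reshape(3, 4)      # toy 3x4 image

# Integral image (summed-area table): I[y, x] = sum of img[:y+1, :x+1].
I = img.cumsum(axis=0).cumsum(axis=1)

# Padding with a zero row/column makes the difference formula uniform.
Ip = np.pad(I, ((1, 0), (1, 0)))

# Differentiation undoes integration: the 2x2 corner difference of the
# integral image recovers each original pixel, so the two representations
# are related by an invertible (simultaneous linear) system.
recovered = Ip[1:, 1:] - Ip[:-1, 1:] - Ip[1:, :-1] + Ip[:-1, :-1]
assert np.allclose(recovered, img)

# The same four-corner difference yields any box sum in O(1): here the
# sum over rows 1..2, cols 1..3 of the original image.
box = Ip[3, 4] - Ip[1, 4] - Ip[3, 1] + Ip[1, 1]
assert box == img[1:3, 1:4].sum()
```

The box-sum identity is what makes integral-image-based difference filters cheap: any rectangular filter response costs four lookups regardless of window size.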
Procedia PDF Downloads 508
28381 A Privacy Protection Scheme Supporting Fuzzy Search for NDN Routing Cache Data Name
Authors: Feng Tao, Ma Jing, Guo Xian, Wang Jing
Abstract:
Named Data Networking (NDN) replaces the IP addresses of traditional networks with data names and adopts a dynamic cache mechanism. In the existing mechanism, however, only one-to-one search can be achieved, because every data item has a unique name corresponding to it. There is a certain mapping relationship between data content and data name, so if the data name is intercepted by an adversary, the privacy of the data content and the user's interest can hardly be guaranteed. To solve this problem, this paper proposes a one-to-many fuzzy search scheme based on order-preserving encryption, which reduces the query overhead by optimizing the caching strategy. In this scheme, we use hash values to keep the user's query safe from each node in the search process, and likewise to protect the privacy of the requested data content.
Keywords: NDN, order-preserving encryption, fuzzy search, privacy
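The two ingredients of the scheme, order preservation for fuzzy (range) matching and hashing for name privacy, can each be shown with toy constructions. The sketch below is illustrative only and not cryptographically secure: a real order-preserving encryption scheme derives the monotone mapping from a secret key, whereas here it is just a random strictly increasing table over a small integer domain, and the cache layout is invented for the example:

```python
import hashlib
import random

random.seed(7)

# Toy order-preserving "encryption": map each plaintext in a small domain
# to a strictly increasing random ciphertext via cumulative random gaps.
domain = range(100)
table = {}
total = 0
for m in domain:
    total += random.randint(1, 1000)
    table[m] = total

def enc(m):
    return table[m]

# Order is preserved, so a cache node can answer a range (fuzzy) query
# over ciphertexts without ever seeing the plaintext values:
assert all(enc(a) < enc(b) for a, b in [(3, 4), (10, 50), (0, 99)])

# Hashing the data name hides it from intermediate nodes while still
# allowing exact-match lookup against the routing cache:
name_digest = hashlib.sha256(b"/video/movie/clip1").hexdigest()
cache = {name_digest: b"<content>"}
assert cache[hashlib.sha256(b"/video/movie/clip1").hexdigest()] == b"<content>"
```

The interesting part of the paper's design is combining the two: ciphertext order lets one query hit many cached names, while the hash keeps any individual name opaque to the nodes that serve it.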
Procedia PDF Downloads 487
28380 Derivation of Bathymetry Data Using Worldview-2 Multispectral Images in Shallow, Turbid and Saline Lake Acıgöl
Authors: Muhittin Karaman, Murat Budakoglu
Abstract:
In this study, the derivation of lake bathymetry was evaluated using high-resolution Worldview-2 multispectral images of the very shallow, hypersaline Lake Acıgöl, which does not have a stable water table due to wet-dry seasonal changes and industrial usage. Every year, a great part of the lake's water budget is consumed by industrial salt production in the evaporation ponds, which are generally located on the south and north shores of Lake Acıgöl. Therefore, determining water level changes through remote-sensing-based bathymetry studies is of great importance for the sustainable management of the lake. While the water table varies by around 1 meter between the dry and wet seasons, dissolved ion concentration, salinity, and turbidity also show clear differences between these two distinct seasonal periods. Simultaneously with the satellite data acquisition (June 9, 2013), a field study was conducted to collect salinity values, Secchi disk depths, and turbidity levels. The maximum depth, Secchi disk depth, and salinity were determined as 1.7 m, 0.9 m, and 43.11 ppt, respectively. The eight-band Worldview-2 image was corrected for atmospheric effects by the ATCOR technique. For each sampling point in the image, mean reflectance values in 1x1, 3x3, 5x5, 7x7, 9x9, 11x11, 13x13, 15x15, 17x17, 19x19, 21x21, and 51x51 pixel neighborhoods were calculated separately, and a distinct image was derived for each matrix resolution. The relation between spectral values and depth was then evaluated for these images. For the 1x1 matrix, correlation coefficients were 0.98, 0.96, 0.95, and 0.90 for the 724 nm, 831 nm, 908 nm, and 659 nm bands, respectively. The 15x15 matrix showed correlation values of 0.98, 0.97, and 0.97 for the 724 nm, 908 nm, and 831 nm bands, respectively, while the 51x51 matrix showed 0.98, 0.97, and 0.96 for the 724 nm, 831 nm, and 659 nm bands, respectively.
Comparison of all matrix resolutions indicates that the RedEdge band (724 nm) of the Worldview-2 satellite image has the best correlation with in-situ depths in the shallow, saline Lake Acıgöl.
Keywords: bathymetry, Worldview-2 satellite image, ATCOR technique, Lake Acıgöl, Denizli, Turkey
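The neighborhood-averaging step of the workflow can be sketched with NumPy. Everything here is synthetic, the reflectance image, sample depths, the planted depth signal, and the 5x5 window choice are all fabricated for illustration, not the study's Worldview-2 data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic single-band reflectance image and made-up in-situ depths.
depths = rng.uniform(0.1, 1.7, 30)          # sample depths (m)
rows = rng.integers(10, 90, 30)
cols = rng.integers(10, 90, 30)
band = rng.normal(0.2, 0.01, (100, 100))

# Plant a depth signal: reflectance decreases with depth at each sample.
for r, c, d in zip(rows, cols, depths):
    band[r - 2:r + 3, c - 2:c + 3] = 0.3 - 0.1 * d

def window_mean(img, r, c, k):
    """Mean reflectance in a k x k pixel neighborhood centred on (r, c)."""
    h = k // 2
    return img[r - h:r + h + 1, c - h:c + h + 1].mean()

# Mean reflectance in 5x5 neighborhoods, correlated against depth, as in
# the study's per-matrix-resolution correlation analysis:
means = np.array([window_mean(band, r, c, 5) for r, c in zip(rows, cols)])
corr = np.corrcoef(means, depths)[0, 1]
print(f"5x5 neighborhood correlation with depth: {corr:.2f}")
```

Repeating the loop for each window size (1x1, 3x3, ..., 51x51) and each band reproduces the correlation table structure the abstract reports, from which the best-performing band/window pair is selected.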
Procedia PDF Downloads 447