Tianxin Hua, Lingling Zhang.
Abstract
With the rapid development of computer network technology, the concept of "Internet +" has become increasingly popular in recent years. The combination of the Internet and finance has attracted particular attention, and the operating modes of many industries have changed as a result. Because Internet technology enables data sharing and information exchange, the "Internet + finance" model has broken down the information asymmetry that previously characterized the financial sector and has contributed substantially to improvements across China's economy. The financial market is vital to China's economic development. However, identity-function recognition in wireless sensor networks is susceptible to interference, which reduces recognition accuracy. We propose an adaptive identity-feature recognition algorithm based on an improved minimum gray tree. After the similarity is calculated, a nearest-neighbor matching algorithm is applied directly: the wireless sensor network registration with the minimum matching cost is taken as the recognized identity, thereby realizing adaptive recognition of the identity function. Simulation results show that the proposed algorithm achieves high recognition accuracy. Keeping pace with financial innovation, financial institutions have developed rapidly on the basis of Internet service platforms. At the same time, customer identification, which lies at the core of anti-money-laundering work, still poses many challenges for these institutions. This article analyzes the main content, implementation effects, and difficulties of customer identification in financial institutions and proposes relevant improvement plans.
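The matching step described in the abstract (compute similarities, then take the registered entry with the minimum matching cost as the recognized identity) can be sketched as a plain nearest-neighbor lookup. This is a minimal illustration, not the paper's implementation: the gallery structure, the feature vectors, and the use of Euclidean distance as the matching cost are all assumptions.

```python
import numpy as np

def nearest_neighbor_identify(query, gallery):
    """Nearest-neighbor identification: return the registered identity
    whose template has the minimum matching cost (Euclidean distance,
    assumed here) to the query feature vector."""
    costs = {identity: float(np.linalg.norm(query - template))
             for identity, template in gallery.items()}
    best = min(costs, key=costs.get)  # identity with minimum matching cost
    return best, costs[best]

# Hypothetical registered templates (identity -> feature vector)
gallery = {
    "user_a": np.array([3.9, 4.2, 10.3]),
    "user_b": np.array([3.5, 4.0, 9.8]),
}
identity, cost = nearest_neighbor_identify(np.array([3.8, 4.1, 10.2]), gallery)
```

The query lands closest to `user_a`'s template, so that registration is returned as the recognized identity.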
Year: 2022 PMID: 35265113 PMCID: PMC8901305 DOI: 10.1155/2022/6975347
Source DB: PubMed Journal: Comput Intell Neurosci
Figure 1. High-end biometric technology using a combination of voice and skull fingerprints.
Comparison of various biometric technology indicators.
| Biometric technology | Convenience | Accuracy | Security level | Stability | Identification equipment cost |
|---|---|---|---|---|---|
| Fingerprint recognition | Higher | High | Medium | Higher | Medium |
| Face recognition | Extremely high | High | High | Higher | Medium |
| Speech recognition | High | Medium | Higher | Medium | Lower |
| Behavior feature recognition | Higher | Higher | Higher | Higher | Medium |
Geometric features of standard human faces.
| Feature | Geometric characteristics | Measurements |
|---|---|---|
| F1 | Inside and outside eye point width/cm | 3.9 |
| F2 | Distance between inner eye points/cm | 4.2 |
| F3 | Outer eye point interval/cm | 10.3 |
| F4 | Height of nose/cm | 1.4 |
| F5 | Angle at the nose tip between the left and right alae | 116° |
| F6 | Angle between the eye points and the nose, left and right | 32° |
| F7 | Nose volume/cm3 | 7.9 |
Acquisition of geometric characteristics of the 3D face model.
| Feature | Smile | Frustrated | Pouting | Bulging |
|---|---|---|---|---|
| ΔF1/cm | 0.9 | 1.5 | 0.2 | 0.2 |
| ΔF2/cm | 0.5 | 1.8 | 0.3 | 0.3 |
| ΔF3/cm | 3.5 | 2.6 | 0.4 | 0.3 |
| ΔF4/cm | 0.6 | 0.3 | 0 | 0 |
| ΔF5 | 6° | 5° | 0 | 0 |
| ΔF6 | 4° | 10° | 2° | 4° |
| ΔF7 | 25° | 8° | 6° | 5° |
| ΔF8 | 45° | 26° | 25° | 25° |
| ΔF9/cm3 | 4 | 2.6 | 0 | 0 |
| Geometric similarity | 0.58 | 0.45 | 0.60 | 0.55 |
| Relevance similarity | 0.64 | 0.71 | 0.63 | 0.59 |
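The last two rows of the table report a geometric similarity and a relevance (correlation) similarity between the deformed face and the standard face. The source does not give the formulas, so the sketch below uses two common stand-ins as assumptions: an inverse-distance score for geometric similarity and the Pearson correlation coefficient for relevance similarity; the feature vectors are illustrative values from the standard-face table.

```python
import numpy as np

def geometric_similarity(ref, probe):
    # Assumed form: inverse of (1 + Euclidean distance), so identical
    # vectors score 1.0 and larger deformations score lower.
    diff = np.asarray(ref, dtype=float) - np.asarray(probe, dtype=float)
    return 1.0 / (1.0 + np.linalg.norm(diff))

def relevance_similarity(ref, probe):
    # Assumed form: Pearson correlation coefficient between the vectors.
    return float(np.corrcoef(ref, probe)[0, 1])

ref = [3.9, 4.2, 10.3, 1.4]                # F1-F4 of the standard face (cm)
smile = [3.9 + 0.9, 4.2 + 0.5, 10.3 + 3.5, 1.4 + 0.6]  # deformed by the smile ΔF1-ΔF4
g_sim = geometric_similarity(ref, smile)   # drops as deformation grows
r_sim = relevance_similarity(ref, smile)   # stays high: shape is preserved
```

Both scores lie in a bounded range, so deformed expressions can be compared against the standard face on a common scale, as the table does.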
Some features used in gait recognition.
| Feature | Description |
|---|---|
| Root mean square | The square root of the mean of the squared acceleration samples in each of the three axial directions |
| Energy | The signal energy (sum of squared samples) of the acceleration in each of the three axial directions |
| Percentile value | The percentage of the number of samples below this sample value in the dataset to the total number of samples |
| Mean absolute deviation | The average of the absolute value of the deviation of each value from its arithmetic mean |
| Cadence | The number of complete gait cycles contained in 1 minute |
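Assuming these gait features are computed over windows of tri-axial accelerometer samples (the table does not state this explicitly), the tabulated quantities can be sketched as follows. The window shape and the 75th-percentile level are assumptions; cadence is omitted because it requires step-cycle detection.

```python
import numpy as np

def gait_features(window):
    """Compute per-window gait features from tri-axial accelerometer
    samples (window shape: n_samples x 3 axes). Cadence is omitted:
    it needs gait-cycle segmentation, not a single-pass statistic."""
    window = np.asarray(window, dtype=float)
    mag = np.linalg.norm(window, axis=1)         # per-sample magnitude
    rms = np.sqrt(np.mean(window ** 2, axis=0))  # root mean square, per axis
    energy = np.sum(window ** 2, axis=0)         # signal energy, per axis
    p75 = np.percentile(mag, 75)                 # 75th percentile value (assumed level)
    mad = np.mean(np.abs(mag - mag.mean()))      # mean absolute deviation
    return {"rms": rms, "energy": energy, "p75": p75, "mad": mad}
```

Each feature reduces a whole window to a few numbers, which is what makes them usable as inputs to a gait-based identification model.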
Figure 2. Flow chart of the sensor data synchronization process.
Detailed comparison of the performance of the two algorithms.
| Variable type | Recognition accuracy of the proposed algorithm (%) | Recognition accuracy of the palmprint-and-facial-feature-based algorithm (%) |
|---|---|---|
| Happy | 100 | 84 |
| Sad | 97 | 76 |
| Surprised | 96 | 62 |
| Light source 1 | 98 | 76 |
| Light source 2 | 96 | 69 |
| Light source 3 | 90 | 63 |
Figure 3. Performance on the ShakeLogin dataset with different division ratios.
Figure 4. HHAR dataset (all behaviors).
User identification results of ShakeLogin and HHAR datasets.
| Dataset | Training data/test data | Identification accuracy (%) | | | | |
|---|---|---|---|---|---|---|
| ShakeLogin (17 users, shaking the phone arbitrarily) | Random 80% of the entire dataset/remaining | 85.46 | 87.25 | 92.07 | 81.98 | 75.33 |
| HHAR (9 users, 6 behaviors) | All behaviors/all behaviors | 92.56 | 97.12 | 92.56 | 97.23 | 94.88 |
| | All behaviors/walking | 90.08 | 96.87 | 97.75 | 96.85 | 100.0 |
| | All behaviors/standing | 90.37 | 97.78 | 95.84 | 98.05 | 72.42 |
| | All behaviors/sitting | 86.18 | 90.55 | 92.67 | 93.83 | 66.05 |
| | All behaviors/biking | 90.92 | 85.32 | 91.62 | 88.83 | 86.37 |
| | All behaviors/going upstairs | 92.75 | 96.93 | 91.35 | 95.16 | 97.59 |
| | All behaviors/going downstairs | 88.73 | 93.18 | 95.59 | 94.39 | 97.56 |
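The evaluation protocol in the table (train on one subset of the data, identify users on a held-out subset, e.g. a random 80%/20% split) can be sketched with a toy 1-nearest-neighbor identifier. The synthetic Gaussian clusters, the classifier, and the feature dimensionality below are all assumptions for illustration, not the paper's SSUI method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for per-user motion features: each "user" is a
# Gaussian cluster in feature space (assumption, illustration only).
n_users, per_user, dim = 5, 40, 8
X = np.concatenate([rng.normal(loc=u * 3.0, scale=0.5, size=(per_user, dim))
                    for u in range(n_users)])
y = np.repeat(np.arange(n_users), per_user)

# Random 80%/20% split, mirroring the "random 80% of the entire
# dataset / remaining" protocol in the table.
idx = rng.permutation(len(X))
cut = int(0.8 * len(X))
train, test = idx[:cut], idx[cut:]

# 1-nearest-neighbor identification: each test sample takes the
# identity label of its closest training sample.
d = np.linalg.norm(X[test][:, None, :] - X[train][None, :, :], axis=2)
pred = y[train][d.argmin(axis=1)]
accuracy = float((pred == y[test]).mean())
```

On well-separated clusters the identification accuracy is near 100%; the spread of values in the table reflects how much harder real motion data is when the test behavior differs from the training behaviors.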
Figure 5. SSUI's performance on the ShakeLogin dataset as the sample duration varies.